Open Research Newcastle
GDTW-P-SVMs: variable-length time series analysis using support vector machines

journal contribution
posted on 2025-05-09, 08:42, authored by Arash Jalalian, Stephan Chalup
We describe a new technique for sequential data analysis, called GDTW-P-SVMs. It is a maximum-margin method for constructing classifiers from variable-length input series. It combines potential support vector machines (P-SVMs) with Gaussian dynamic time warping (GDTW) to lift the fixed-length restriction on feature vectors in training and test data. As a result, GDTW-P-SVMs inherit the P-SVM method's properties, such as the ability to (i) handle data and kernel matrices that are neither positive definite nor square and (ii) minimise a scale-invariant capacity measure. The new technique extends the P-SVM kernel functions by using the well-known dynamic time warping algorithm as an elastic distance measure inside the kernel. Classification benchmarks are performed on several real-world data sets from the UCR time series classification/clustering page, the GeoLife trajectory data set, and the UCI Machine Learning Repository; these data sets include both variable-length and fixed-length input series. The results show that the new method performs significantly better than the benchmarked standard classification methods.
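
As a rough illustration of the kernel construction described in the abstract, the sketch below computes a Gaussian kernel that uses dynamic time warping as its elastic distance between variable-length sequences, yielding the kind of possibly non-square, non-positive-definite kernel matrix that a P-SVM-style solver can accept. This is a minimal sketch, not the authors' implementation; the function names (dtw_distance, gdtw_kernel_matrix), the bandwidth parameter sigma, and the toy data are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): a Gaussian DTW kernel between
# variable-length 1-D sequences.
import numpy as np

def dtw_distance(x, y):
    """Classic O(len(x)*len(y)) dynamic time warping distance between
    two 1-D sequences of possibly different lengths."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def gdtw_kernel_matrix(rows, cols, sigma=1.0):
    """Gaussian (RBF-style) kernel with DTW as the elastic distance.
    `rows` and `cols` may be different sets of sequences, so the resulting
    matrix need not be square or positive definite."""
    K = np.empty((len(rows), len(cols)))
    for i, x in enumerate(rows):
        for j, y in enumerate(cols):
            K[i, j] = np.exp(-dtw_distance(x, y) ** 2 / (2.0 * sigma ** 2))
    return K

# Toy usage with made-up variable-length series:
series = [np.array([0.0, 0.5, 1.0, 0.5]),
          np.array([1.0, 0.9, 0.1]),
          np.array([0.0, 0.1, 0.2, 0.3, 0.4])]
K = gdtw_kernel_matrix(series, series[:2], sigma=0.5)
print(K.shape)  # (3, 2): a rectangular kernel matrix
```

Such a matrix could then be passed to a P-SVM-style training routine; the exact optimisation used in GDTW-P-SVMs is described in the paper itself.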

Funding

ARC grant DP1092679

Journal title

Neurocomputing

Volume

99

Issue

1

Pagination

270-282

Publisher

Elsevier

Language

English

College/Research Centre

Faculty of Engineering and Built Environment

School

School of Electrical Engineering and Computer Science
