Generating feature spaces for linear algorithms with regularized sparse kernel slow feature analysis

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Generating feature spaces for linear algorithms with regularized sparse kernel slow feature analysis. / Böhmer, Wendelin; Grunewalder, Steffen; Nickisch, Hannes et al.
In: Machine Learning, Vol. 89, No. 1, 10.2012, p. 67-86.


Vancouver

Böhmer W, Grunewalder S, Nickisch H, Obermayer K. Generating feature spaces for linear algorithms with regularized sparse kernel slow feature analysis. Machine Learning. 2012 Oct;89(1):67-86. Epub 2012 Jun 13. doi: 10.1007/s10994-012-5300-0

Author

Böhmer, Wendelin; Grunewalder, Steffen; Nickisch, Hannes et al. / Generating feature spaces for linear algorithms with regularized sparse kernel slow feature analysis. In: Machine Learning. 2012; Vol. 89, No. 1. pp. 67-86.

Bibtex

@article{81086276c7bc4ec9b00752b80f4866d1,
title = "Generating feature spaces for linear algorithms with regularized sparse kernel slow feature analysis",
abstract = "Without non-linear basis functions many problems can not be solved by linear algorithms. This article proposes a method to automatically construct such basis functions with slow feature analysis (SFA). Non-linear optimization of this unsupervised learning method generates an orthogonal basis on the unknown latent space for a given time series. In contrast to methods like PCA, SFA is thus well suited for techniques that make direct use of the latent space. Real-world time series can be complex, and current SFA algorithms are either not powerful enough or tend to over-fit. We make use of the kernel trick in combination with sparsification to develop a kernelized SFA algorithm which provides a powerful function class for large data sets. Sparsity is achieved by a novel matching pursuit approach that can be applied to other tasks as well. For small data sets, however, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization to the SFA objective. We hypothesize that our algorithm generates a feature space that resembles a Fourier basis in the unknown space of latent variables underlying a given real-world time series. We evaluate this hypothesis at the example of a vowel classification task in comparison to sparse kernel PCA. Our results show excellent classification accuracy and demonstrate the superiority of kernel SFA over kernel PCA in encoding latent variables.",
keywords = "Time series, Latent variables, Unsupervised learning, Slow feature analysis, Sparse kernel methods, Linear classification",
author = "Wendelin B{\"o}hmer and Steffen Grunewalder and Hannes Nickisch and Klaus Obermayer",
year = "2012",
month = oct,
doi = "10.1007/s10994-012-5300-0",
language = "English",
volume = "89",
pages = "67--86",
journal = "Machine Learning",
issn = "1573-0565",
publisher = "Springer Netherlands",
number = "1",

}
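
As context for the abstract above: SFA searches for projections of a time series whose outputs change as slowly as possible over time while remaining decorrelated with unit variance. The following is a minimal sketch of plain linear SFA in NumPy, not the regularized sparse kernel algorithm the paper contributes: whiten the signal, then keep the directions in which the discrete time derivative has the least variance. All names (linear_sfa, n_features, eps) are illustrative.

import numpy as np

def linear_sfa(X, n_features, eps=1e-8):
    # X: (T, d) time series with rows ordered in time.
    # Returns W (d, n_features) mapping centered data to slow features.
    Xc = X - X.mean(axis=0)                      # center
    cov = Xc.T @ Xc / (len(Xc) - 1)              # input covariance
    vals, vecs = np.linalg.eigh(cov)             # ascending eigenvalues
    S = vecs / np.sqrt(np.maximum(vals, eps))    # whitening matrix
    Z = Xc @ S                                   # whitened signal
    dZ = np.diff(Z, axis=0)                      # discrete time derivative
    dcov = dZ.T @ dZ / (len(dZ) - 1)             # derivative covariance
    dvals, dvecs = np.linalg.eigh(dcov)          # smallest eigenvalue = slowest
    return S @ dvecs[:, :n_features]

# Toy usage: a slow latent variable hidden in a faster mixture.
t = np.linspace(0, 20 * np.pi, 4000)
X = np.column_stack([np.sin(0.05 * t) + 0.2 * np.sin(9.0 * t),
                     np.cos(0.05 * t) + 0.2 * np.cos(13.0 * t)])
W = linear_sfa(X, n_features=1)
slow = (X - X.mean(axis=0)) @ W    # recovers the slow component up to sign and scale
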

RIS

TY - JOUR

T1 - Generating feature spaces for linear algorithms with regularized sparse kernel slow feature analysis

AU - Böhmer, Wendelin

AU - Grunewalder, Steffen

AU - Nickisch, Hannes

AU - Obermayer, Klaus

PY - 2012/10

Y1 - 2012/10

N2 - Without non-linear basis functions, many problems cannot be solved by linear algorithms. This article proposes a method to automatically construct such basis functions with slow feature analysis (SFA). Non-linear optimization of this unsupervised learning method generates an orthogonal basis on the unknown latent space for a given time series. In contrast to methods like PCA, SFA is thus well suited for techniques that make direct use of the latent space. Real-world time series can be complex, and current SFA algorithms are either not powerful enough or tend to over-fit. We make use of the kernel trick in combination with sparsification to develop a kernelized SFA algorithm which provides a powerful function class for large data sets. Sparsity is achieved by a novel matching pursuit approach that can be applied to other tasks as well. For small data sets, however, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization to the SFA objective. We hypothesize that our algorithm generates a feature space that resembles a Fourier basis in the unknown space of latent variables underlying a given real-world time series. We evaluate this hypothesis on a vowel classification task in comparison to sparse kernel PCA. Our results show excellent classification accuracy and demonstrate the superiority of kernel SFA over kernel PCA in encoding latent variables.

AB - Without non-linear basis functions, many problems cannot be solved by linear algorithms. This article proposes a method to automatically construct such basis functions with slow feature analysis (SFA). Non-linear optimization of this unsupervised learning method generates an orthogonal basis on the unknown latent space for a given time series. In contrast to methods like PCA, SFA is thus well suited for techniques that make direct use of the latent space. Real-world time series can be complex, and current SFA algorithms are either not powerful enough or tend to over-fit. We make use of the kernel trick in combination with sparsification to develop a kernelized SFA algorithm which provides a powerful function class for large data sets. Sparsity is achieved by a novel matching pursuit approach that can be applied to other tasks as well. For small data sets, however, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization to the SFA objective. We hypothesize that our algorithm generates a feature space that resembles a Fourier basis in the unknown space of latent variables underlying a given real-world time series. We evaluate this hypothesis on a vowel classification task in comparison to sparse kernel PCA. Our results show excellent classification accuracy and demonstrate the superiority of kernel SFA over kernel PCA in encoding latent variables.

KW - Time series

KW - Latent variables

KW - Unsupervised learning

KW - Slow feature analysis

KW - Sparse kernel methods

KW - Linear classification

U2 - 10.1007/s10994-012-5300-0

DO - 10.1007/s10994-012-5300-0

M3 - Journal article

VL - 89

SP - 67

EP - 86

JO - Machine Learning

JF - Machine Learning

SN - 1573-0565

IS - 1

ER -
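
For readers reconstructing the method from the abstract: below is a hedged sketch of how a regularized sparse kernel SFA could look. The data are expanded over a small support set through a kernel (the paper selects this set with a novel matching pursuit, which is not shown here; the support set is simply passed in), a ridge term regularizes the covariance as the abstract describes, and the slow directions come out of a generalized eigenproblem. This is an illustration under those assumptions, not the authors' published algorithm; rbf_kernel, sparse_kernel_sfa, reg, and gamma are all illustrative names.

import numpy as np
from scipy.linalg import eigh

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def sparse_kernel_sfa(X, support, n_features, reg=1e-3, gamma=1.0):
    # X: (T, d) time series; support: (m, d) sparse expansion points,
    # assumed already chosen (the paper uses matching pursuit for this).
    K = rbf_kernel(X, support, gamma)        # (T, m) sparse feature expansion
    K = K - K.mean(axis=0)                   # center the expanded features
    C = K.T @ K / (len(K) - 1)               # feature covariance
    C += reg * np.eye(K.shape[1])            # ridge regularization for stability
    dK = np.diff(K, axis=0)                  # discrete time derivative
    D = dK.T @ dK / (len(dK) - 1)            # derivative covariance
    # Generalized eigenproblem: smallest eigenvalues give the slowest features.
    vals, vecs = eigh(D, C)
    return vecs[:, :n_features]              # expansion coefficients alpha

Slow features of a new point x are then rbf_kernel(x, support, gamma) @ alpha, after applying the same centering used during training.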