
Regularized sparse kernel slow feature analysis

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Regularized sparse kernel slow feature analysis. / Böhmer, W.; Grunewalder, S.; Nickisch, H. et al.
Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2011. ed. / D. Gunopulos; T. Hoffmann; D. Malerba; M. Vazirgiannis. Berlin: Springer, 2011. p. 235-248 (Lecture Notes in Computer Science; Vol. 6911).

Harvard

Böhmer, W, Grunewalder, S, Nickisch, H & Obermayer, K 2011, Regularized sparse kernel slow feature analysis. in D Gunopulos, T Hoffmann, D Malerba & M Vazirgiannis (eds), Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2011. Lecture Notes in Computer Science, vol. 6911, Springer, Berlin, pp. 235-248. https://doi.org/10.1007/978-3-642-23780-5_25

APA

Böhmer, W., Grunewalder, S., Nickisch, H., & Obermayer, K. (2011). Regularized sparse kernel slow feature analysis. In D. Gunopulos, T. Hoffmann, D. Malerba, & M. Vazirgiannis (Eds.), Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2011 (pp. 235-248). (Lecture Notes in Computer Science; Vol. 6911). Springer. https://doi.org/10.1007/978-3-642-23780-5_25

Vancouver

Böhmer W, Grunewalder S, Nickisch H, Obermayer K. Regularized sparse kernel slow feature analysis. In Gunopulos D, Hoffmann T, Malerba D, Vazirgiannis M, editors, Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2011. Berlin: Springer. 2011. p. 235-248. (Lecture Notes in Computer Science; Vol. 6911). doi: 10.1007/978-3-642-23780-5_25

Author

Böhmer, W.; Grunewalder, S.; Nickisch, H. et al. / Regularized sparse kernel slow feature analysis. Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2011. editor / D. Gunopulos; T. Hoffmann; D. Malerba; M. Vazirgiannis. Berlin: Springer, 2011. pp. 235-248 (Lecture Notes in Computer Science; Vol. 6911).

Bibtex

@inproceedings{e277019582314bddb6933316ceb2730d,
title = "Regularized sparse kernel slow feature analysis",
abstract = "This paper develops a kernelized slow feature analysis (SFA) algorithm. SFA is an unsupervised learning method to extract features which encode latent variables from time series. Generative relationships are usually complex, and current algorithms are either not powerful enough or tend to over-fit. We make use of the kernel trick in combination with sparsification to provide a powerful function class for large data sets. Sparsity is achieved by a novel matching pursuit approach that can be applied to other tasks as well. For small but complex data sets, however, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization to the SFA objective. Versatility and performance of our method are demonstrated on audio and video data sets.",
author = "W. B{\"o}hmer and S. Grunewalder and H. Nickisch and K. Obermayer",
year = "2011",
doi = "10.1007/978-3-642-23780-5_25",
language = "English",
isbn = "9783642237799",
series = "Lecture Notes in Computer Science",
publisher = "Springer",
pages = "235--248",
editor = "D. Gunopulos and Hoffmann, {T. } and D. Malerba and M. Vazirgiannis",
booktitle = "Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2011",

}

RIS

TY - GEN

T1 - Regularized sparse kernel slow feature analysis

AU - Böhmer, W.

AU - Grunewalder, S.

AU - Nickisch, H.

AU - Obermayer, K.

PY - 2011

Y1 - 2011

N2 - This paper develops a kernelized slow feature analysis (SFA) algorithm. SFA is an unsupervised learning method to extract features which encode latent variables from time series. Generative relationships are usually complex, and current algorithms are either not powerful enough or tend to over-fit. We make use of the kernel trick in combination with sparsification to provide a powerful function class for large data sets. Sparsity is achieved by a novel matching pursuit approach that can be applied to other tasks as well. For small but complex data sets, however, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization to the SFA objective. Versatility and performance of our method are demonstrated on audio and video data sets.

AB - This paper develops a kernelized slow feature analysis (SFA) algorithm. SFA is an unsupervised learning method to extract features which encode latent variables from time series. Generative relationships are usually complex, and current algorithms are either not powerful enough or tend to over-fit. We make use of the kernel trick in combination with sparsification to provide a powerful function class for large data sets. Sparsity is achieved by a novel matching pursuit approach that can be applied to other tasks as well. For small but complex data sets, however, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization to the SFA objective. Versatility and performance of our method are demonstrated on audio and video data sets.

U2 - 10.1007/978-3-642-23780-5_25

DO - 10.1007/978-3-642-23780-5_25

M3 - Conference contribution/Paper

SN - 9783642237799

T3 - Lecture Notes in Computer Science

SP - 235

EP - 248

BT - Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2011

A2 - Gunopulos, D.

A2 - Hoffmann, T.

A2 - Malerba, D.

A2 - Vazirgiannis, M.

PB - Springer

CY - Berlin

ER -
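
Illustrative sketch

The abstract above describes a regularized, sparse, kernelized variant of slow feature analysis (SFA): an unsupervised method that extracts output signals varying as slowly as possible over a time series. As a minimal illustration only, the hypothetical Python snippet below sketches plain linear SFA (whitening followed by an eigendecomposition of the temporal-difference covariance). It is not the paper's regularized sparse kernel algorithm; the function name, parameters, and toy data are assumptions for demonstration, not taken from the publication.

import numpy as np

def linear_sfa(X, n_features=2):
    """Minimal linear SFA sketch (illustrative only, not the paper's
    regularized sparse kernel SFA).

    X : ndarray of shape (T, d), a multivariate time series.
    Returns the n_features slowest output signals and the projection matrix.
    """
    # Center the time series.
    Xc = X - X.mean(axis=0)

    # Whiten: decorrelate the inputs and scale them to unit variance.
    cov = Xc.T @ Xc / len(Xc)
    eigval, eigvec = np.linalg.eigh(cov)
    keep = eigval > 1e-10                       # drop near-zero directions
    W_white = eigvec[:, keep] / np.sqrt(eigval[keep])
    Z = Xc @ W_white                            # whitened signals

    # Slowness objective: minimize the variance of the temporal derivative,
    # approximated here by first differences of the whitened signals.
    dZ = np.diff(Z, axis=0)
    dcov = dZ.T @ dZ / len(dZ)
    d_eigval, d_eigvec = np.linalg.eigh(dcov)   # ascending order: slowest first

    P = d_eigvec[:, :n_features]
    W = W_white @ P                             # projection from raw input space
    return Xc @ W, W

# Usage sketch: a slow 1-D signal mixed into a faster 3-D series.
if __name__ == "__main__":
    t = np.linspace(0, 10 * np.pi, 2000)
    slow = np.sin(0.1 * t)
    fast = np.sin(5.0 * t)
    X = np.column_stack([slow + 0.1 * fast,
                         fast + 0.05 * slow,
                         0.5 * slow - fast]) + 0.01 * np.random.randn(2000, 3)
    Y, W = linear_sfa(X, n_features=1)
    print(Y.shape, W.shape)                     # (2000, 1) (3, 1)

Per the abstract, the paper's contribution replaces this linear function class with a sparsified kernel expansion, selects the support samples with a matching pursuit approach, and adds a regularization term to the slowness objective so that the solution remains stable on small but complex data sets.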