
Feature selection for time series prediction - A combined filter and wrapper approach for neural networks

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Feature selection for time series prediction - A combined filter and wrapper approach for neural networks. / Crone, Sven F.; Kourentzes, Nikolaos.
In: Neurocomputing, Vol. 73, No. 10-12, 06.2010, p. 1923-1936.

Bibtex

@article{6c4b109e2a4346508621ee557f4b40a9,
title = "Feature selection for time series prediction - A combined filter and wrapper approach for neural networks",
abstract = "Modelling artificial neural networks for accurate time series prediction poses multiple challenges, in particular specifying the network architecture in accordance with the underlying structure of the time series. The data generating processes may exhibit a variety of stochastic or deterministic time series patterns of single or multiple seasonality, trends and cycles, overlaid with pulses, level shifts and structural breaks, all depending on the discrete time frequency in which it is observed. For heterogeneous datasets of time series, such as the 2008 ESTSP competition, a universal methodology is required for automatic network specification across varying data patterns and time frequencies. We propose a fully data driven forecasting methodology that combines filter and wrapper approaches for feature selection, including automatic feature evaluation, construction and transformation. The methodology identifies time series patterns, creates and transforms explanatory variables and specifies multilayer perceptrons for heterogeneous sets of time series without expert intervention. Examples of the valid and reliable performance in comparison to established benchmark methods are shown for a set of synthetic time series and for the ESTSP{\textquoteright}08 competition dataset, where the proposed methodology obtained second place.",
keywords = "Time series prediction, Forecasting, Artificial neural networks, Automatic model specification, Feature selection, Input variable selection",
author = "Crone, {Sven F.} and Nikolaos Kourentzes",
note = "The final, definitive version of this article has been published in the Journal, Neurocomputing 73 (10-12), 2010, {\textcopyright} ELSEVIER.",
year = "2010",
month = jun,
doi = "10.1016/j.neucom.2010.01.017",
language = "English",
volume = "73",
pages = "1923--1936",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier Science B.V.",
number = "10-12",
}

RIS

TY - JOUR

T1 - Feature selection for time series prediction - A combined filter and wrapper approach for neural networks

AU - Crone, Sven F.

AU - Kourentzes, Nikolaos

N1 - The final, definitive version of this article has been published in the Journal, Neurocomputing 73 (10-12), 2010, © ELSEVIER.

PY - 2010/6

Y1 - 2010/6

N2 - Modelling artificial neural networks for accurate time series prediction poses multiple challenges, in particular specifying the network architecture in accordance with the underlying structure of the time series. The data generating processes may exhibit a variety of stochastic or deterministic time series patterns of single or multiple seasonality, trends and cycles, overlaid with pulses, level shifts and structural breaks, all depending on the discrete time frequency in which it is observed. For heterogeneous datasets of time series, such as the 2008 ESTSP competition, a universal methodology is required for automatic network specification across varying data patterns and time frequencies. We propose a fully data driven forecasting methodology that combines filter and wrapper approaches for feature selection, including automatic feature evaluation, construction and transformation. The methodology identifies time series patterns, creates and transforms explanatory variables and specifies multilayer perceptrons for heterogeneous sets of time series without expert intervention. Examples of the valid and reliable performance in comparison to established benchmark methods are shown for a set of synthetic time series and for the ESTSP’08 competition dataset, where the proposed methodology obtained second place.

AB - Modelling artificial neural networks for accurate time series prediction poses multiple challenges, in particular specifying the network architecture in accordance with the underlying structure of the time series. The data generating processes may exhibit a variety of stochastic or deterministic time series patterns of single or multiple seasonality, trends and cycles, overlaid with pulses, level shifts and structural breaks, all depending on the discrete time frequency in which it is observed. For heterogeneous datasets of time series, such as the 2008 ESTSP competition, a universal methodology is required for automatic network specification across varying data patterns and time frequencies. We propose a fully data driven forecasting methodology that combines filter and wrapper approaches for feature selection, including automatic feature evaluation, construction and transformation. The methodology identifies time series patterns, creates and transforms explanatory variables and specifies multilayer perceptrons for heterogeneous sets of time series without expert intervention. Examples of the valid and reliable performance in comparison to established benchmark methods are shown for a set of synthetic time series and for the ESTSP’08 competition dataset, where the proposed methodology obtained second place.

KW - Time series prediction

KW - Forecasting

KW - Artificial neural networks

KW - Automatic model specification

KW - Feature selection

KW - Input variable selection

U2 - 10.1016/j.neucom.2010.01.017

DO - 10.1016/j.neucom.2010.01.017

M3 - Journal article

VL - 73

SP - 1923

EP - 1936

JO - Neurocomputing

JF - Neurocomputing

SN - 0925-2312

IS - 10-12

ER -
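
For readers who want a concrete picture of what a combined filter-and-wrapper input selection loop can look like, the following Python sketch illustrates the general idea described in the abstract: a cheap filter stage ranks candidate lags of the series by autocorrelation, and a wrapper stage greedily keeps the lags that improve the validation error of a small multilayer perceptron. This is only a minimal illustration of the generic technique, not the authors' methodology from the paper; the autocorrelation filter, the greedy forward search, the scikit-learn MLPRegressor and the toy seasonal series are all assumptions chosen for brevity.

# Minimal filter-plus-wrapper lag selection sketch for a time series MLP.
# Illustrative only; not the methodology of Crone & Kourentzes (2010).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def make_lag_matrix(y, lags):
    # Design matrix whose columns are the requested lags of y.
    max_lag = max(lags)
    X = np.column_stack([y[max_lag - l:len(y) - l] for l in lags])
    return X, y[max_lag:]

def filter_stage(y, candidate_lags, keep=8):
    # Filter: rank candidate lags by absolute autocorrelation, keep the strongest.
    scores = [(abs(np.corrcoef(y[:-l], y[l:])[0, 1]), l) for l in candidate_lags]
    return [l for _, l in sorted(scores, reverse=True)[:keep]]

def wrapper_stage(y, shortlist, val_frac=0.2):
    # Wrapper: greedy forward selection, scoring each candidate lag set with a
    # small MLP on a hold-out block at the end of the series.
    selected, best_err = [], np.inf
    improved = True
    while improved and len(selected) < len(shortlist):
        improved = False
        for lag in [l for l in shortlist if l not in selected]:
            X, t = make_lag_matrix(y, selected + [lag])
            split = int(len(t) * (1 - val_frac))
            mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
            mlp.fit(X[:split], t[:split])
            err = mean_squared_error(t[split:], mlp.predict(X[split:]))
            if err < best_err:
                best_err, best_lag, improved = err, lag, True
        if improved:
            selected.append(best_lag)
    return selected, best_err

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(400)
    y = 10 + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)  # toy seasonal series
    shortlist = filter_stage(y, candidate_lags=range(1, 25))
    lags, err = wrapper_stage(y, shortlist)
    print("selected lags:", sorted(lags), "validation MSE:", round(err, 3))

On the toy monthly-style series above, the greedy wrapper typically settles on a small set dominated by the seasonal lag, which is the kind of automatic input identification the abstract describes at a much more general level.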