
Electronic data

  • EOR16774

    Rights statement: This is the author’s version of a work that was accepted for publication in European Journal of Operational Research. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in European Journal of Operational Research, 290, 1, 2020 DOI: 10.1016/j.ejor.2020.09.028

    Accepted author manuscript, 1.3 MB, PDF document

    Embargo ends: 6/10/22

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License


Evaluating and selecting features via information theoretic lower bounds of feature inner correlations for high-dimensional data

Research output: Contribution to journal › Journal article › peer-review

E-pub ahead of print
Journal publication date: 6/10/2020
Journal: European Journal of Operational Research
Issue number: 1
Volume: 290
Number of pages: 13
Pages (from-to): 235-247
Publication status: E-pub ahead of print
Early online date: 6/10/2020
Original language: English

Abstract

Feature selection is an important preprocessing and interpretability technique in fields where big data plays an essential role. In this paper, we first reformulate and analyze some representative information theoretic feature selection methods from the perspective of approximations of feature inner correlations, and show that many of these methods cannot guarantee any theoretical bounds on feature inner correlations. We therefore introduce two lower bounds with very simple forms for feature redundancy and complementarity, and verify that they are closer to the optima than the existing lower bounds applied by some state-of-the-art information theoretic methods. We then develop a simple and effective feature selection method based on the proposed lower bounds and verify it empirically on a wide range of real-world datasets. The experimental results show that the proposed method achieves promising improvements in feature selection, indicating the effectiveness of a feature criterion built from the proposed lower bounds of redundancy and complementarity.
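To make the abstract's ingredients concrete, the sketch below shows a generic greedy information theoretic feature selection loop of the kind the paper analyzes (in the style of JMI/mRMR criteria), scoring each candidate feature by relevance I(f; y), penalizing redundancy I(f; s) with already-selected features s, and rewarding complementarity via the conditional mutual information I(f; s | y). This is an illustrative baseline under standard definitions, not the paper's proposed criterion or its lower bounds; all function names here are our own.

```python
import numpy as np

def mutual_info(x, y):
    """Mutual information I(X;Y) in nats for discrete 1-D arrays."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), 1)          # joint counts
    joint /= joint.sum()                          # joint probabilities
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0                                # avoid log(0)
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def cond_mutual_info(x, y, z):
    """Conditional mutual information I(X;Y|Z) = sum_z p(z) I(X;Y|Z=z)."""
    total = 0.0
    for zv in np.unique(z):
        m = z == zv
        total += m.mean() * mutual_info(x[m], y[m])
    return total

def greedy_select(X, y, k):
    """Greedy forward selection with a JMI-style score (illustrative)."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        if not selected:
            # First pick: maximum relevance I(f; y).
            scores = [mutual_info(X[:, j], y) for j in remaining]
        else:
            scores = [
                np.mean([
                    mutual_info(X[:, j], y)                    # relevance
                    - mutual_info(X[:, j], X[:, i])            # redundancy
                    + cond_mutual_info(X[:, j], X[:, i], y)    # complementarity
                    for i in selected
                ])
                for j in remaining
            ]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected
```

On an XOR-style dataset (y = x0 XOR x2, with x1 an exact copy of x0), the complementarity term lets the criterion prefer x2 over the redundant copy x1 after x0 is selected, which no purely relevance-based score can do.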
