
Electronic data

  • _Chen_Tsionas_Zelenyuk

    Rights statement: This is the author’s version of a work that was accepted for publication in Omega. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Omega, 102, 2021 DOI: 10.1016/j.omega.2021.102419

    Accepted author manuscript, 812 KB, PDF document

    Embargo ends: 21/07/22

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License


LASSO+DEA for small and big wide data

Research output: Contribution to journal › Journal article › peer-review

Published
Article number: 102419
Journal publication date: 31/07/2021
Journal: Omega
Volume: 102
Number of pages: 21
Publication status: Published
Early online date: 21/01/21
Original language: English

Abstract

In data envelopment analysis (DEA), the curse of dimensionality may jeopardize the accuracy, or even the relevance, of results when the dimension of inputs and outputs is relatively large, even for relatively large samples. Recently, a machine learning approach based on the least absolute shrinkage and selection operator (LASSO) for variable selection was combined with sign-constrained convex nonparametric least squares (SCNLS, a special case of DEA), and dubbed LASSO-SCNLS, as a way to circumvent the curse of dimensionality. In this paper, we revisit this interesting approach by considering various data generating processes. We also explore a more advanced version of LASSO, the so-called elastic net (EN) approach, adapt it to DEA and propose the EN-DEA. Our Monte Carlo simulations provide additional and, to some extent, new evidence and conclusions. In particular, we find that none of the considered approaches clearly dominates the others. To circumvent the curse of dimensionality of DEA in the context of big wide data, we also propose a simplified two-step approach which we call LASSO+DEA. We find that the proposed simplified approach could be more useful than the existing more sophisticated approaches for reducing very large dimensions into sparser, more parsimonious DEA models that attain greater discriminatory power and suffer less from the curse of dimensionality.
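The two-step idea described in the abstract can be sketched in code. The sketch below is an illustration under my own assumptions (synthetic Cobb-Douglas-style data, an arbitrary LASSO penalty, and a standard input-oriented CRS DEA model), not the authors' exact procedure: step 1 uses LASSO to shrink a wide set of candidate inputs down to a sparse subset, and step 2 runs conventional DEA linear programs on the selected inputs only.

```python
# Illustrative sketch of a two-step LASSO+DEA pipeline (assumptions, not
# the paper's exact algorithm): LASSO variable selection, then
# input-oriented CRS DEA on the surviving inputs.
import numpy as np
from sklearn.linear_model import Lasso
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p_relevant, p_noise = 50, 2, 8          # 50 DMUs, 2 true inputs, 8 irrelevant inputs
X_true = rng.uniform(1.0, 10.0, size=(n, p_relevant))
# Cobb-Douglas-style output with multiplicative inefficiency (an assumed DGP)
y = (X_true ** 0.4).prod(axis=1) * rng.uniform(0.7, 1.0, size=n)
X = np.hstack([X_true, rng.uniform(1.0, 10.0, size=(n, p_noise))])

# Step 1: LASSO selection, regressing log-output on all candidate log-inputs.
lasso = Lasso(alpha=0.05).fit(np.log(X), np.log(y))
selected = np.flatnonzero(lasso.coef_ != 0)    # indices of retained inputs
X_sel = X[:, selected]

# Step 2: input-oriented CRS DEA on the selected inputs, one LP per DMU o:
#   min theta  s.t.  sum_j lam_j x_ij <= theta * x_io  (each input i),
#                    sum_j lam_j y_j  >= y_o,          lam_j >= 0
def dea_efficiency(Xs, ys, o):
    nn, m = Xs.shape
    c = np.zeros(1 + nn)                       # decision vector z = [theta, lam]
    c[0] = 1.0
    A_in = np.hstack([-Xs[o].reshape(-1, 1), Xs.T])        # X lam - theta x_o <= 0
    A_out = np.hstack([np.zeros((1, 1)), -ys.reshape(1, -1)])  # -y lam <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), [-ys[o]]]),
                  bounds=[(0, None)] * (1 + nn))
    return res.x[0]

eff = np.array([dea_efficiency(X_sel, y, o) for o in range(n)])
print(f"selected inputs: {selected.tolist()}, mean efficiency: {eff.mean():.3f}")
```

With irrelevant inputs dropped in step 1, the DEA LPs in step 2 involve far fewer constraints per input dimension, which is the source of the improved discriminatory power the abstract refers to: efficiency scores are less inflated toward 1 than they would be if all candidate inputs were kept.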
