
Electronic data

  • paper

    Accepted author manuscript, 398 KB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI: 10.3150/20-BEJ1307


The Goldenshluger-Lepski Method for Constrained Least-Squares Estimators over RKHSs

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

The Goldenshluger-Lepski Method for Constrained Least-Squares Estimators over RKHSs. / Page, Stephen; Grunewalder, Steffen.
In: Bernoulli, Vol. 27, No. 4, 30.11.2021, p. 2241-2266.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Vancouver

Page S, Grunewalder S. The Goldenshluger-Lepski Method for Constrained Least-Squares Estimators over RKHSs. Bernoulli. 2021 Nov 30;27(4):2241-2266. Epub 2021 Aug 31. doi: 10.3150/20-BEJ1307


BibTeX

@article{fc7bf61467f1480fb3e79172fb9819b7,
title = "The Goldenshluger-Lepski Method for Constrained Least-Squares Estimators over RKHSs",
abstract = "We study an adaptive estimation procedure called the Goldenshluger–Lepski method in the context of reproducing kernel Hilbert space (RKHS) regression. Adaptive estimation provides a way of selecting tuning parameters for statistical estimators using only the available data. This allows us to perform estimation without making strong assumptions about the estimand. In contrast to procedures such as training and validation, the Goldenshluger–Lepski method uses all of the data to produce non-adaptive estimators for a range of values of the tuning parameters. An adaptive estimator is selected by performing pairwise comparisons between these non-adaptive estimators. Applying the Goldenshluger–Lepski method is non-trivial as it requires a simultaneous high-probability bound on all of the pairwise comparisons. In the RKHS regression context, we choose our non-adaptive estimators to be clipped least-squares estimators constrained to lie in a ball in an RKHS. Applying the Goldenshluger–Lepski method in this context is made more complicated by the fact that we cannot use the L2 norm for performing the pairwise comparisons as it is unknown. We use the method to address two regression problems. In the first problem the RKHS is fixed, while in the second problem we adapt over a collection of RKHSs.",
keywords = "adaptive estimation, Goldenshluger–Lepski method, RKHS regression",
author = "Stephen Page and Steffen Grunewalder",
year = "2021",
month = nov,
day = "30",
doi = "10.3150/20-BEJ1307",
language = "English",
volume = "27",
pages = "2241--2266",
journal = "Bernoulli",
issn = "1350-7265",
publisher = "International Statistical Institute",
number = "4",

}

RIS

TY - JOUR

T1 - The Goldenshluger-Lepski Method for Constrained Least-Squares Estimators over RKHSs

AU - Page, Stephen

AU - Grunewalder, Steffen

PY - 2021/11/30

Y1 - 2021/11/30

N2 - We study an adaptive estimation procedure called the Goldenshluger–Lepski method in the context of reproducing kernel Hilbert space (RKHS) regression. Adaptive estimation provides a way of selecting tuning parameters for statistical estimators using only the available data. This allows us to perform estimation without making strong assumptions about the estimand. In contrast to procedures such as training and validation, the Goldenshluger–Lepski method uses all of the data to produce non-adaptive estimators for a range of values of the tuning parameters. An adaptive estimator is selected by performing pairwise comparisons between these non-adaptive estimators. Applying the Goldenshluger–Lepski method is non-trivial as it requires a simultaneous high-probability bound on all of the pairwise comparisons. In the RKHS regression context, we choose our non-adaptive estimators to be clipped least-squares estimators constrained to lie in a ball in an RKHS. Applying the Goldenshluger–Lepski method in this context is made more complicated by the fact that we cannot use the L2 norm for performing the pairwise comparisons as it is unknown. We use the method to address two regression problems. In the first problem the RKHS is fixed, while in the second problem we adapt over a collection of RKHSs.

AB - We study an adaptive estimation procedure called the Goldenshluger–Lepski method in the context of reproducing kernel Hilbert space (RKHS) regression. Adaptive estimation provides a way of selecting tuning parameters for statistical estimators using only the available data. This allows us to perform estimation without making strong assumptions about the estimand. In contrast to procedures such as training and validation, the Goldenshluger–Lepski method uses all of the data to produce non-adaptive estimators for a range of values of the tuning parameters. An adaptive estimator is selected by performing pairwise comparisons between these non-adaptive estimators. Applying the Goldenshluger–Lepski method is non-trivial as it requires a simultaneous high-probability bound on all of the pairwise comparisons. In the RKHS regression context, we choose our non-adaptive estimators to be clipped least-squares estimators constrained to lie in a ball in an RKHS. Applying the Goldenshluger–Lepski method in this context is made more complicated by the fact that we cannot use the L2 norm for performing the pairwise comparisons as it is unknown. We use the method to address two regression problems. In the first problem the RKHS is fixed, while in the second problem we adapt over a collection of RKHSs.

KW - adaptive estimation

KW - Goldenshluger–Lepski method

KW - RKHS regression

U2 - 10.3150/20-BEJ1307

DO - 10.3150/20-BEJ1307

M3 - Journal article

VL - 27

SP - 2241

EP - 2266

JO - Bernoulli

JF - Bernoulli

SN - 1350-7265

IS - 4

ER -
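
Illustrative sketch

The abstract above outlines the Goldenshluger-Lepski selection rule: fit a family of non-adaptive estimators on the full sample, attach to each a known high-probability error bound (a majorant), and select the estimator minimising the largest penalised pairwise discrepancy plus its own majorant. The Python sketch below illustrates that rule under stated assumptions; it is not the paper's procedure. Kernel ridge regression with a Gaussian kernel stands in for the ball-constrained least-squares estimator, the majorant c_sigma / sqrt(n * lam) is a placeholder rather than the bound derived in the paper, and the empirical norm on the sample is used for the pairwise comparisons (the abstract notes the true L2 norm is unavailable).

import numpy as np

def gaussian_kernel(X, Y, scale=1.0):
    """Squared-exponential kernel matrix (an illustrative kernel choice)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * scale**2))

def gl_select(X, y, lams, clip=1.0, scale=1.0, c_sigma=1.0):
    """Goldenshluger-Lepski selection over a grid of ridge parameters.

    Kernel ridge regression stands in for the paper's clipped,
    ball-constrained least-squares estimator, and c_sigma / sqrt(n * lam)
    is a placeholder majorant, not the bound derived in the paper.
    """
    n = len(y)
    K = gaussian_kernel(X, X, scale)
    lams = np.sort(np.asarray(lams))  # the majorant decreases along this ordering

    # Non-adaptive estimators: fit on all of the data, then clip predictions.
    fits = []
    for lam in lams:
        alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
        fits.append(np.clip(K @ alpha, -clip, clip))
    fits = np.stack(fits)                 # shape (len(lams), n)

    sigma = c_sigma / np.sqrt(n * lams)   # placeholder high-probability bound

    # Pairwise comparisons in the empirical L2 norm (the true L2 norm is
    # unknown because the covariate distribution is unknown).
    best, best_crit = 0, np.inf
    for i in range(len(lams)):
        # Compare estimator i against every rougher estimator j <= i,
        # discounting each comparison by the rougher estimator's majorant.
        gaps = [
            max(np.sqrt(np.mean((fits[i] - fits[j]) ** 2)) - sigma[j], 0.0)
            for j in range(i + 1)
        ]
        crit = max(gaps) + sigma[i]
        if crit < best_crit:
            best, best_crit = i, crit
    return lams[best], fits[best]

# Usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * rng.standard_normal(200)
lam_hat, f_hat = gl_select(X, y, lams=np.logspace(-4, 0, 10))
print("selected lambda:", lam_hat)

Note that, as the abstract emphasises, every candidate estimator is fitted on the full sample; no data is held out for validation, and adaptivity comes entirely from the penalised pairwise comparisons.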