Electronic data

  • 2301.05636v2

    Submitted manuscript, 5.6 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License

Improving Power by Conditioning on Less in Post-selection Inference for Changepoints

Research output: Working paper › Preprint

Published

BibTeX

@techreport{047e5eff0895434890ae2e2cca770813,
title = "Improving Power by Conditioning on Less in Post-selection Inference for Changepoints",
abstract = "Post-selection inference has recently been proposed as a way of quantifying uncertainty about detected changepoints. The idea is to run a changepoint detection algorithm, and then re-use the same data to perform a test for a change near each of the detected changes. By defining the p-value for the test appropriately, so that it is conditional on the information used to choose the test, this approach will produce valid p-values. We show how to improve the power of these procedures by conditioning on less information. This gives rise to an ideal selective p-value that is intractable but can be approximated by Monte Carlo. We show that for any Monte Carlo sample size, this procedure produces valid p-values, and empirically that a noticeable increase in power is possible with only very modest Monte Carlo sample sizes. Our procedure is easy to implement given existing post-selection inference methods, as we just need to generate perturbations of the data set and re-apply the post-selection method to each of these. On genomic data consisting of human GC content, our procedure increases the number of significant changepoints detected (for example, from 17 to 27) compared with existing methods.",
keywords = "stat.ME",
author = "Rachel Carrington and Paul Fearnhead",
note = "33 pages, 13 figures",
year = "2023",
month = jan,
day = "13",
language = "English",
publisher = "arXiv",
type = "WorkingPaper",
institution = "arXiv",
}

RIS

TY - UNPB

T1 - Improving Power by Conditioning on Less in Post-selection Inference for Changepoints

AU - Carrington, Rachel

AU - Fearnhead, Paul

N1 - 33 pages, 13 figures

PY - 2023/1/13

Y1 - 2023/1/13

N2 - Post-selection inference has recently been proposed as a way of quantifying uncertainty about detected changepoints. The idea is to run a changepoint detection algorithm, and then re-use the same data to perform a test for a change near each of the detected changes. By defining the p-value for the test appropriately, so that it is conditional on the information used to choose the test, this approach will produce valid p-values. We show how to improve the power of these procedures by conditioning on less information. This gives rise to an ideal selective p-value that is intractable but can be approximated by Monte Carlo. We show that for any Monte Carlo sample size, this procedure produces valid p-values, and empirically that a noticeable increase in power is possible with only very modest Monte Carlo sample sizes. Our procedure is easy to implement given existing post-selection inference methods, as we just need to generate perturbations of the data set and re-apply the post-selection method to each of these. On genomic data consisting of human GC content, our procedure increases the number of significant changepoints detected (for example, from 17 to 27) compared with existing methods.

AB - Post-selection inference has recently been proposed as a way of quantifying uncertainty about detected changepoints. The idea is to run a changepoint detection algorithm, and then re-use the same data to perform a test for a change near each of the detected changes. By defining the p-value for the test appropriately, so that it is conditional on the information used to choose the test, this approach will produce valid p-values. We show how to improve the power of these procedures by conditioning on less information. This gives rise to an ideal selective p-value that is intractable but can be approximated by Monte Carlo. We show that for any Monte Carlo sample size, this procedure produces valid p-values, and empirically that a noticeable increase in power is possible with only very modest Monte Carlo sample sizes. Our procedure is easy to implement given existing post-selection inference methods, as we just need to generate perturbations of the data set and re-apply the post-selection method to each of these. On genomic data consisting of human GC content, our procedure increases the number of significant changepoints detected (for example, from 17 to 27) compared with existing methods.

KW - stat.ME

M3 - Preprint

BT - Improving Power by Conditioning on Less in Post-selection Inference for Changepoints

PB - arXiv

ER -
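The abstract describes approximating an ideal selective p-value by Monte Carlo: re-applying the detection method to many perturbed or simulated data sets and comparing. The paper's actual algorithm conditions on a carefully chosen subset of the selection information; the sketch below is only a generic illustration of the idea of Monte Carlo post-selection inference for a single change in mean. The detector, the selection-window tolerance, and all function names here are assumptions of this sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def cusum_stat(x):
    """Location and value of the maximal standardised CUSUM statistic
    for a single change in mean."""
    n = len(x)
    total = x.sum()
    best_tau, best_val, csum = 1, -np.inf, 0.0
    for tau in range(1, n):
        csum += x[tau - 1]
        # standardised difference of means before and after tau
        val = abs(csum / tau - (total - csum) / (n - tau)) * np.sqrt(tau * (n - tau) / n)
        if val > best_val:
            best_tau, best_val = tau, val
    return best_tau, best_val

# Toy data: unit-variance noise with a mean shift of 1.5 at t = 50.
n = 100
x = rng.normal(0.0, 1.0, n)
x[50:] += 1.5

tau_hat, stat_obs = cusum_stat(x)

# Illustrative Monte Carlo post-selection p-value (not the paper's
# algorithm): simulate null data sets, retain those on which the same
# detector selects (approximately) the same changepoint, and compare
# the observed statistic with the retained null statistics.
B, window = 2000, 5
null_stats = []
for _ in range(B):
    z = rng.normal(0.0, 1.0, n)
    tau_b, stat_b = cusum_stat(z)
    if abs(tau_b - tau_hat) <= window:  # crude selection-event matching
        null_stats.append(stat_b)

# The (1 + k) / (1 + m) form keeps the Monte Carlo p-value valid for
# any number m of retained replicates.
p_value = (1 + sum(s >= stat_obs for s in null_stats)) / (1 + len(null_stats))
print(f"detected changepoint: {tau_hat}, Monte Carlo p-value: {p_value:.4f}")
```

The `(1 + k) / (1 + m)` correction mirrors the paper's point that validity holds for any Monte Carlo sample size; the power of the test then grows with the number of replicates retained.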