
On optimal multiple changepoint algorithms for large data

Research output: Contribution to Journal/Magazine › Journal article › peer-review

E-pub ahead of print

Standard

On optimal multiple changepoint algorithms for large data. / Maidstone, Robert; Hocking, Toby; Rigaill, Guillem et al.
In: Statistics and Computing, 15.02.2016.



APA

Maidstone, R., Hocking, T., Rigaill, G., & Fearnhead, P. (2016). On optimal multiple changepoint algorithms for large data. Statistics and Computing. Advance online publication. https://doi.org/10.1007/s11222-016-9636-3

Vancouver

Maidstone R, Hocking T, Rigaill G, Fearnhead P. On optimal multiple changepoint algorithms for large data. Statistics and Computing. 2016 Feb 15. Epub 2016 Feb 15. doi: 10.1007/s11222-016-9636-3

Author

Maidstone, Robert; Hocking, Toby; Rigaill, Guillem et al. / On optimal multiple changepoint algorithms for large data. In: Statistics and Computing. 2016.

Bibtex

@article{4173288320224b3dbaa1d5ab2fa54917,
title = "On optimal multiple changepoint algorithms for large data",
abstract = "Many common approaches to detecting changepoints, for example based on statistical criteria such as penalised likelihood or minimum description length, can be formulated in terms of minimising a cost over segmentations. We focus on a class of dynamic programming algorithms that can solve the resulting minimisation problem exactly, and thus find the optimal segmentation under the given statistical criteria. The standard implementations of these dynamic programming methods have a computational cost that scales at least quadratically in the length of the time-series. Recently, pruning ideas have been suggested that can speed up the dynamic programming algorithms, whilst still being guaranteed to be optimal, in that they find the true minimum of the cost function. Here we extend these pruning methods, and introduce two new algorithms for segmenting data: FPOP and SNIP. Empirical results show that FPOP is substantially faster than existing dynamic programming methods, and unlike the existing methods its computational efficiency is robust to the number of changepoints in the data. We evaluate the method for detecting copy number variations and observe that FPOP has a computational cost that is competitive even with that of binary segmentation, but can give much more accurate segmentations.",
keywords = "Breakpoints, Dynamic Programming, FPOP, SNIP, Optimal Partitioning, pDPA, PELT, Segment Neighbourhood",
author = "Robert Maidstone and Toby Hocking and Guillem Rigaill and Paul Fearnhead",
year = "2016",
month = feb,
day = "15",
doi = "10.1007/s11222-016-9636-3",
language = "English",
journal = "Statistics and Computing",
issn = "0960-3174",
publisher = "Springer Netherlands",

}

RIS

TY - JOUR

T1 - On optimal multiple changepoint algorithms for large data

AU - Maidstone, Robert

AU - Hocking, Toby

AU - Rigaill, Guillem

AU - Fearnhead, Paul

PY - 2016/2/15

Y1 - 2016/2/15

N2 - Many common approaches to detecting changepoints, for example based on statistical criteria such as penalised likelihood or minimum description length, can be formulated in terms of minimising a cost over segmentations. We focus on a class of dynamic programming algorithms that can solve the resulting minimisation problem exactly, and thus find the optimal segmentation under the given statistical criteria. The standard implementations of these dynamic programming methods have a computational cost that scales at least quadratically in the length of the time-series. Recently, pruning ideas have been suggested that can speed up the dynamic programming algorithms, whilst still being guaranteed to be optimal, in that they find the true minimum of the cost function. Here we extend these pruning methods, and introduce two new algorithms for segmenting data: FPOP and SNIP. Empirical results show that FPOP is substantially faster than existing dynamic programming methods, and unlike the existing methods its computational efficiency is robust to the number of changepoints in the data. We evaluate the method for detecting copy number variations and observe that FPOP has a computational cost that is competitive even with that of binary segmentation, but can give much more accurate segmentations.

AB - Many common approaches to detecting changepoints, for example based on statistical criteria such as penalised likelihood or minimum description length, can be formulated in terms of minimising a cost over segmentations. We focus on a class of dynamic programming algorithms that can solve the resulting minimisation problem exactly, and thus find the optimal segmentation under the given statistical criteria. The standard implementations of these dynamic programming methods have a computational cost that scales at least quadratically in the length of the time-series. Recently, pruning ideas have been suggested that can speed up the dynamic programming algorithms, whilst still being guaranteed to be optimal, in that they find the true minimum of the cost function. Here we extend these pruning methods, and introduce two new algorithms for segmenting data: FPOP and SNIP. Empirical results show that FPOP is substantially faster than existing dynamic programming methods, and unlike the existing methods its computational efficiency is robust to the number of changepoints in the data. We evaluate the method for detecting copy number variations and observe that FPOP has a computational cost that is competitive even with that of binary segmentation, but can give much more accurate segmentations.

KW - Breakpoints

KW - Dynamic Programming

KW - FPOP

KW - SNIP

KW - Optimal Partitioning

KW - pDPA

KW - PELT

KW - Segment Neighbourhood

U2 - 10.1007/s11222-016-9636-3

DO - 10.1007/s11222-016-9636-3

M3 - Journal article

JO - Statistics and Computing

JF - Statistics and Computing

SN - 0960-3174

ER -
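
The pruned dynamic-programming approach described in the abstract can be sketched in a few lines. Below is a minimal illustration, in Python, of the optimal-partitioning recursion combined with PELT-style inequality pruning for a Gaussian change-in-mean cost. This is not the paper's FPOP algorithm (functional pruning maintains piecewise cost functions and is more involved); the function name `op_pelt` and the choice of cost are ours, for illustration only.

```python
def op_pelt(y, beta):
    """Optimal partitioning with PELT-style pruning.

    Minimises (sum of per-segment costs) + beta * (number of changepoints),
    where the segment cost is the sum of squared deviations from the
    segment mean (Gaussian change-in-mean model).
    """
    n = len(y)
    # Prefix sums allow O(1) evaluation of each segment cost.
    S, S2 = [0.0], [0.0]
    for x in y:
        S.append(S[-1] + x)
        S2.append(S2[-1] + x * x)

    def cost(s, t):
        # Cost of segment y[s:t] (half-open): sum of squared
        # deviations from the segment mean.
        m = t - s
        return S2[t] - S2[s] - (S[t] - S[s]) ** 2 / m

    F = [float("inf")] * (n + 1)
    F[0] = -beta          # so the first segment pays no penalty
    cp = [0] * (n + 1)    # optimal last changepoint before each t
    candidates = [0]      # unpruned candidate changepoint positions
    for t in range(1, n + 1):
        vals = [F[s] + cost(s, t) + beta for s in candidates]
        best = min(range(len(vals)), key=vals.__getitem__)
        F[t] = vals[best]
        cp[t] = candidates[best]
        # PELT pruning: a candidate s with F[s] + cost(s, t) > F[t]
        # can never be optimal for any later t, so drop it.
        candidates = [s for s, v in zip(candidates, vals) if v - beta <= F[t]]
        candidates.append(t)

    # Backtrack the optimal changepoint positions.
    cps, t = [], n
    while cp[t] > 0:
        cps.append(cp[t])
        t = cp[t]
    return sorted(cps)
```

For a signal with a single clean jump in mean, such as `[0.0] * 50 + [10.0] * 50` with penalty `beta = 5.0`, the recursion recovers the changepoint at position 50. The pruning step is what reduces the quadratic cost of the plain recursion in practice; as the abstract notes, its effectiveness degrades when changepoints are rare, which is the regime FPOP's functional pruning is designed to handle.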