A Good-Turing estimator for feature allocation models

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Journal publication date: 1/10/2019
Journal: Electronic Journal of Statistics
Issue number: 2
Volume: 13
Number of pages: 30
Pages (from-to): 3775-3804
Publication status: Published
Early online date: 13/09/2019
Original language: English

Abstract

Feature allocation models generalize classical species sampling models by allowing every observation to belong to more than one species, now called features. Under the popular Bernoulli product model for feature allocation, we assume n observable samples and we consider the problem of estimating the expected number M_n of hitherto unseen features that would be observed if one additional individual were sampled. The interest in estimating M_n is motivated by numerous applied problems where the sampling procedure is expensive, in terms of time and/or financial resources allocated, and further samples can only be motivated by the possibility of recording new unobserved features. We consider a nonparametric estimator M̂_n of M_n which has the same analytic form as the popular Good-Turing estimator of the missing mass in the context of species sampling models. We show that M̂_n admits a natural interpretation both as a jackknife estimator and as a nonparametric empirical Bayes estimator. Furthermore, we give provable guarantees for the performance of M̂_n in terms of minimax rate optimality, and we establish an interesting connection between M̂_n and the Good-Turing estimator for species sampling. Finally, we derive non-asymptotic confidence intervals for M̂_n, which are easily computable and do not rely on any asymptotic approximation. Our approach is illustrated with synthetic data and SNP data from the ENCODE sequencing genome project.
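
To give a concrete picture of the quantities in the abstract, the Python sketch below simulates a Bernoulli product model and compares a Good-Turing-style estimate with the conditional expected number of new features revealed by one additional sample. This is a minimal illustration, not the paper's code: the feature probabilities, the function names, and the specific form M̂_n = K_1/n (with K_1 the number of features observed in exactly one of the n samples) are assumptions suggested by the abstract's stated analogy with the classical Good-Turing estimator.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's code): simulate a Bernoulli
# product model and compare a Good-Turing-style estimate of M_n with the
# conditional expected number of new features in one extra sample.

rng = np.random.default_rng(0)

def simulate_bernoulli_product(n, probs, rng):
    """Draw n samples; sample i displays feature j with probability probs[j]."""
    return rng.random((n, len(probs))) < probs  # boolean n-by-J matrix

def good_turing_new_features(samples):
    """Assumed Good-Turing form: M_hat_n = K_1 / n, where K_1 is the number
    of features observed in exactly one of the n samples."""
    n = samples.shape[0]
    feature_counts = samples.sum(axis=0)
    k1 = int(np.sum(feature_counts == 1))
    return k1 / n

# Hypothetical feature probabilities: a few common features, many rare ones.
probs = np.concatenate([np.full(50, 0.2), np.full(5000, 0.002)])
X = simulate_bernoulli_product(n=100, probs=probs, rng=rng)

# Conditional expectation of the number of new features shown by sample n+1,
# given the observed samples: sum of probs over features never yet observed.
unseen = X.sum(axis=0) == 0
m_n_true = probs[unseen].sum()

print("Good-Turing-style estimate:", good_turing_new_features(X))
print("True conditional M_n:      ", m_n_true)
```

On simulated data of this kind one would expect the estimate to track the true conditional quantity reasonably well once n is moderately large, though the sketch makes no claim about the rates or confidence intervals established in the paper.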