
Shannon’s differential entropy asymptotic analysis in a Bayesian problem

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Shannon’s differential entropy asymptotic analysis in a Bayesian problem. / Kelbert, Mark; Mozgunov, Pavel.
In: Mathematical Communications, Vol. 20, No. 2, 12.2015, p. 219-228.
Vancouver

Kelbert M, Mozgunov P. Shannon’s differential entropy asymptotic analysis in a Bayesian problem. Mathematical Communications. 2015 Dec;20(2):219-228.

Author

Kelbert, Mark ; Mozgunov, Pavel. / Shannon’s differential entropy asymptotic analysis in a Bayesian problem. In: Mathematical Communications. 2015 ; Vol. 20, No. 2. pp. 219-228.

Bibtex

@article{1ab87d449d7d404787989d1f51ca5d0b,
title = "Shannon{\textquoteright}s differential entropy asymptotic analysis in a Bayesian problem",
abstract = "We consider a Bayesian problem of estimating the probability of success in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the differential entropy of the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. Three particular cases are studied: $x$ is a fixed proportion of $n$; $x \sim n^\beta$, where $0<\beta<1$; either $x$ or $n-x$ is a constant. It is shown that, after an appropriate normalization, the limiting distribution in the first and second cases is Gaussian and the differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. In the third case the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly.",
keywords = "differential entropy, Bayes' formula, Gaussian limit theorem",
author = "Mark Kelbert and Pavel Mozgunov",
year = "2015",
month = dec,
language = "English",
volume = "20",
pages = "219--228",
journal = "Mathematical Communications",
issn = "1331-0623",
publisher = "Udruga Matematicara Osijek",
number = "2",

}

RIS

TY - JOUR

T1 - Shannon’s differential entropy asymptotic analysis in a Bayesian problem

AU - Kelbert, Mark

AU - Mozgunov, Pavel

PY - 2015/12

Y1 - 2015/12

N2 - We consider a Bayesian problem of estimating the probability of success in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the differential entropy of the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. Three particular cases are studied: $x$ is a fixed proportion of $n$; $x \sim n^\beta$, where $0<\beta<1$; either $x$ or $n-x$ is a constant. It is shown that, after an appropriate normalization, the limiting distribution in the first and second cases is Gaussian and the differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. In the third case the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly.

AB - We consider a Bayesian problem of estimating the probability of success in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the differential entropy of the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. Three particular cases are studied: $x$ is a fixed proportion of $n$; $x \sim n^\beta$, where $0<\beta<1$; either $x$ or $n-x$ is a constant. It is shown that, after an appropriate normalization, the limiting distribution in the first and second cases is Gaussian and the differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. In the third case the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly.

KW - differential entropy

KW - Bayes' formula

KW - Gaussian limit theorem

M3 - Journal article

VL - 20

SP - 219

EP - 228

JO - Mathematical Communications

JF - Mathematical Communications

SN - 1331-0623

IS - 2

ER -
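
The abstract's first case ($x$ a fixed proportion of $n$) can be checked numerically. The sketch below assumes a uniform prior, so the posterior is Beta($x+1$, $n-x+1$); the Beta-entropy closed form and the asymptotic digamma series are standard textbook identities, not formulas taken from the paper, and the specific numbers $n$, $x$ are illustrative only:

```python
import math

def digamma(z: float) -> float:
    # Asymptotic series for the digamma function; accurate for large z
    # (all arguments below are >= 500)
    return math.log(z) - 1/(2*z) - 1/(12*z*z) + 1/(120*z**4)

def beta_entropy(a: float, b: float) -> float:
    # Differential entropy of Beta(a, b):
    # ln B(a,b) - (a-1)psi(a) - (b-1)psi(b) + (a+b-2)psi(a+b)
    lnB = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return lnB - (a - 1)*digamma(a) - (b - 1)*digamma(b) + (a + b - 2)*digamma(a + b)

n, x = 1000, 500             # case 1: x is a fixed proportion of n
a, b = x + 1, n - x + 1      # Beta posterior under a uniform prior (assumption)

# Standardizing a RV shifts its entropy by -ln(sigma)
sigma = math.sqrt(a*b / ((a + b)**2 * (a + b + 1)))
h_std = beta_entropy(a, b) - math.log(sigma)

# Differential entropy of a standard Gaussian: (1/2) ln(2*pi*e)
h_gauss = 0.5 * math.log(2 * math.pi * math.e)
print(h_std, h_gauss)
```

For $n = 1000$ the two values already agree to a few decimal places, consistent with the claimed convergence of the standardized posterior's entropy to the Gaussian value $\tfrac{1}{2}\ln(2\pi e) \approx 1.4189$.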