- http://www.mathos.unios.hr/mc/index.php/mc/article/view/1336 (final published version)

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published in: Mathematical Communications, Vol. 20, No. 2, 12.2015, pp. 219-228.


Kelbert, M & Mozgunov, P 2015, 'Shannon’s differential entropy asymptotic analysis in a Bayesian problem', *Mathematical Communications*, vol. 20, no. 2, pp. 219-228. <http://www.mathos.unios.hr/mc/index.php/mc/article/view/1336>

Kelbert, M., & Mozgunov, P. (2015). Shannon’s differential entropy asymptotic analysis in a Bayesian problem. *Mathematical Communications*, *20*(2), 219-228. http://www.mathos.unios.hr/mc/index.php/mc/article/view/1336

Kelbert M, Mozgunov P. Shannon’s differential entropy asymptotic analysis in a Bayesian problem. Mathematical Communications. 2015 Dec;20(2):219-228.

@article{1ab87d449d7d404787989d1f51ca5d0b,

title = "Shannon{\textquoteright}s differential entropy asymptotic analysis in a Bayesian problem",

abstract = "We consider a Bayesian problem of estimating the probability of success in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the differential entropy of the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. Three particular cases are studied: $x$ is a proportion of $n$; $x \sim n^\beta$, where $0<\beta<1$; either $x$ or $n-x$ is a constant. It is shown that, after an appropriate normalization, in the first and second cases the limiting distribution is Gaussian and the differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. In the third case the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly.",

keywords = "differential entropy, Bayes' formula, Gaussian limit theorem",

author = "Mark Kelbert and Pavel Mozgunov",

year = "2015",

month = dec,

language = "English",

volume = "20",

pages = "219--228",

journal = "Mathematical Communications",

issn = "1331-0623",

publisher = "Udruga Matematicara Osijek",

number = "2",

}

TY - JOUR

T1 - Shannon’s differential entropy asymptotic analysis in a Bayesian problem

AU - Kelbert, Mark

AU - Mozgunov, Pavel

PY - 2015/12

Y1 - 2015/12

N2 - We consider a Bayesian problem of estimating the probability of success in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the differential entropy of the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. Three particular cases are studied: $x$ is a proportion of $n$; $x \sim n^\beta$, where $0<\beta<1$; either $x$ or $n-x$ is a constant. It is shown that, after an appropriate normalization, in the first and second cases the limiting distribution is Gaussian and the differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. In the third case the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly.

AB - We consider a Bayesian problem of estimating the probability of success in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the differential entropy of the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. Three particular cases are studied: $x$ is a proportion of $n$; $x \sim n^\beta$, where $0<\beta<1$; either $x$ or $n-x$ is a constant. It is shown that, after an appropriate normalization, in the first and second cases the limiting distribution is Gaussian and the differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. In the third case the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly.

KW - differential entropy

KW - Bayes' formula

KW - Gaussian limit theorem

M3 - Journal article

VL - 20

SP - 219

EP - 228

JO - Mathematical Communications

JF - Mathematical Communications

SN - 1331-0623

IS - 2

ER -
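
The abstract's first case (the number of successes $x$ proportional to $n$) can be illustrated numerically. The sketch below is not taken from the paper: it assumes a uniform prior, under which the posterior after $x$ successes in $n$ trials is Beta($x+1$, $n-x+1$), and checks that the differential entropy of the standardized posterior approaches that of a standard Gaussian, $\tfrac{1}{2}\ln(2\pi e) \approx 1.4189$ nats.

```python
# Numerical sketch (illustrative, not from the paper): with a uniform prior,
# the posterior after x successes in n trials is Beta(x+1, n-x+1).
import math
from scipy.stats import beta

def standardized_posterior_entropy(n, x):
    """Differential entropy (nats) of the standardized Beta(x+1, n-x+1) posterior.

    Uses the scaling identity h((X - mu)/sigma) = h(X) - ln(sigma).
    """
    dist = beta(x + 1, n - x + 1)
    return dist.entropy() - math.log(dist.std())

# Differential entropy of a standard Gaussian: 0.5 * ln(2*pi*e) ~ 1.4189 nats.
gaussian_entropy = 0.5 * math.log(2 * math.pi * math.e)

for n in (100, 1000, 10000):
    h = standardized_posterior_entropy(n, n // 2)  # x is a fixed proportion of n
    print(f"n={n:6d}  h={h:.6f}  (Gaussian: {gaussian_entropy:.6f})")
```

As $n$ grows the gap shrinks, consistent with the Gaussian limit stated in the abstract; the constant-$x$ case would instead converge to a non-Gaussian (Gamma-type) limit.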