Journal articles

Escort Distributions Minimizing the Kullback-Leibler Divergence for a Large Deviations Principle and Tests of Entropy Level

Abstract: The Kullback–Leibler divergence is minimized over distributions on finite state spaces, subject to various Shannon entropy constraints. The minimizers are closely linked to escort distributions, whose main entropy-related properties are proven. This yields a large deviations principle for the sequence of plug-in empirical estimators of the Shannon entropy of any finite distribution. Since no closed-form expression of the rate function can be obtained, an explicit approximating function is constructed; this approximation is accurate enough to provide good results in all applications considered. Tests of entropy level, using both the large deviations principle and the minimization results, are constructed and shown to behave well in terms of errors.
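To make the central objects of the abstract concrete, here is a minimal sketch of a plug-in Shannon entropy estimator and an escort distribution for a finite state space. The function names and the order parameter `q` are illustrative conventions, not notation from the paper itself.

```python
import math
from collections import Counter

def shannon_entropy(p):
    """Shannon entropy (in nats) of a finite distribution given as probabilities."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def escort(p, q):
    """Escort distribution of order q: each p_i raised to the power q, renormalized."""
    powered = [pi ** q for pi in p]
    z = sum(powered)  # normalizing constant
    return [w / z for w in powered]

def plugin_entropy(sample, states):
    """Plug-in estimator: Shannon entropy of the empirical distribution of the sample."""
    counts = Counter(sample)
    n = len(sample)
    return shannon_entropy([counts[s] / n for s in states])
```

For instance, `plugin_entropy(["a", "a", "b", "b"], ["a", "b"])` recovers the entropy of the uniform two-point distribution, and `escort(p, 1.0)` returns `p` itself; varying `q` deforms the distribution toward (q < 1) or away from (q > 1) uniformity.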

https://hal-normandie-univ.archives-ouvertes.fr/hal-02299551
Contributor: Valerie Girardin
Submitted on : Friday, September 27, 2019 - 6:12:59 PM
Last modification on : Monday, April 27, 2020 - 4:14:03 PM

Citation

Valerie Girardin, Philippe Regnault. Escort Distributions Minimizing the Kullback-Leibler Divergence for a Large Deviations Principle and Tests of Entropy Level. Annals of the Institute of Statistical Mathematics, Springer Verlag, 2016, 68 (2), pp.439-468. ⟨10.1007/s10463-014-0501-x⟩. ⟨hal-02299551⟩
