Journal article

Escort Distributions Minimizing the Kullback-Leibler Divergence for a Large Deviations Principle and Tests of Entropy Level

Abstract: The Kullback–Leibler divergence is minimized among distributions with finite state spaces under various Shannon entropy constraints. The minimization is closely linked to escort distributions, whose main entropy-related properties are proven. This yields a large deviations principle for the sequence of plug-in empirical estimators of the Shannon entropy of any finite distribution. Since no closed-form expression of the rate function can be obtained, an explicit approximating function is constructed; this approximation is accurate enough to provide good results in all applications considered. Tests of entropy level, using both the large deviations principle and the minimization results, are constructed and shown to behave well in terms of errors.
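The two central objects of the abstract, escort distributions and the plug-in entropy estimator, can be sketched in a few lines of Python. This is only an illustrative sketch under standard definitions (escort of order β: p_i^β / Σ_j p_j^β; plug-in estimator: Shannon entropy of the empirical frequencies); the function names are ours, not from the paper.

```python
import math

def escort(p, beta):
    """Escort distribution of order beta: p_i**beta / sum_j p_j**beta."""
    powered = [pi ** beta for pi in p]
    total = sum(powered)
    return [w / total for w in powered]

def shannon_entropy(p):
    """Shannon entropy in nats, with the convention 0 * log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def plugin_entropy(sample, states):
    """Plug-in estimator: Shannon entropy of the empirical distribution."""
    n = len(sample)
    freqs = [sample.count(s) / n for s in states]
    return shannon_entropy(freqs)
```

For β = 1 the escort distribution is the original distribution itself; as β grows it concentrates mass on the most probable states, which is why entropy constraints and escort distributions interact so closely.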

https://hal-normandie-univ.archives-ouvertes.fr/hal-02299551
Contributor: Valerie Girardin
Submitted on: Friday, September 27, 2019 - 18:12:59
Last modified on: Monday, April 27, 2020 - 16:14:03

Citation

Valerie Girardin, Philippe Regnault. Escort Distributions Minimizing the Kullback-Leibler Divergence for a Large Deviations Principle and Tests of Entropy Level. Annals of the Institute of Statistical Mathematics, Springer Verlag, 2016, 68 (2), pp.439-468. ⟨10.1007/s10463-014-0501-x⟩. ⟨hal-02299551⟩
