Journal article in Annals of the Institute of Statistical Mathematics, 2016

Escort Distributions Minimizing the Kullback-Leibler Divergence for a Large Deviations Principle and Tests of Entropy Level

Abstract

The Kullback–Leibler divergence is minimized among distributions with finite state spaces under various constraints on Shannon entropy. The minimization is closely linked to escort distributions, whose main entropy-related properties are proven. This yields a large deviations principle for the sequence of plug-in empirical estimators of the Shannon entropy of any finite distribution. Since no closed-form expression of the rate function can be obtained, an explicit approximating function is constructed; this approximation is accurate enough to give good results in all the applications considered. Tests of entropy level, built on both the large deviations principle and the minimization results, are constructed and shown to perform well in terms of errors.
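For reference, here is a minimal Python sketch (not taken from the paper) of the two standard objects the abstract refers to: the plug-in empirical estimator of Shannon entropy and escort distributions in their usual power form p_i^β / Σ_j p_j^β. The function names and the example values are illustrative, and the paper's exact parametrization of escort distributions may differ.

```python
import numpy as np

def plugin_entropy(sample):
    """Plug-in estimator of Shannon entropy:
    H_hat = -sum_i (n_i/n) * log(n_i/n), with n_i the observed counts."""
    _, counts = np.unique(sample, return_counts=True)
    freqs = counts / counts.sum()
    return -np.sum(freqs * np.log(freqs))

def escort(p, beta):
    """Escort distribution of order beta in the standard sense:
    q_i = p_i**beta / sum_j p_j**beta."""
    w = np.asarray(p, dtype=float) ** beta
    return w / w.sum()

# Illustration: as beta grows, the escort distribution concentrates on the
# modes of p, so its Shannon entropy decreases (uniform at beta = 0).
p = np.array([0.5, 0.3, 0.2])
for beta in (0.0, 0.5, 1.0, 2.0):
    q = escort(p, beta)
    print(f"beta={beta}: q={np.round(q, 3)}, H(q)={-np.sum(q * np.log(q)):.4f}")
```

This monotonicity of the escort entropy in β is what makes escort families natural candidates for entropy-constrained minimization problems of the kind the abstract describes.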
No file deposited

Dates and versions

hal-02299551, version 1 (27-09-2019)

Cite

Valerie Girardin, Philippe Regnault. Escort Distributions Minimizing the Kullback-Leibler Divergence for a Large Deviations Principle and Tests of Entropy Level. Annals of the Institute of Statistical Mathematics, 2016, 68 (2), pp.439-468. ⟨10.1007/s10463-014-0501-x⟩. ⟨hal-02299551⟩