Speaker: Valerie Girardin
Abstract: Escort distributions, first introduced in multifractals and non-extensive statistical physics, are now involved in numerous fields, including coding theory and large deviation principles (LDP). They provide a powerful tool for probing and adapting the structure of the original probability distribution with which they are associated. In this talk, some properties of escort distributions related to entropy will be investigated. Their role in information geometry, the Riemannian geometry induced by the Kullback-Leibler divergence on the linear space of all distributions on a given finite set, is thus highlighted.
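As a minimal illustrative sketch (not from the talk itself), the escort of order q of a finite distribution p is commonly defined by normalizing the pointwise powers p_i^q; the function name `escort` and the choice of q are illustrative assumptions:

```python
import numpy as np

def escort(p, q):
    """Escort distribution of order q: proportional to p_i^q, renormalized."""
    pq = np.asarray(p, dtype=float) ** q
    return pq / pq.sum()

# q = 1 recovers p itself; q = 0 flattens p toward the uniform
# distribution on its support; large q concentrates mass on the modes.
p = np.array([0.5, 0.3, 0.2])
```

Varying q thus "scans" the distribution, interpolating between the uniform law and a point mass at the mode, which is one way escorts adapt the structure of p.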
The need to minimize the divergence under constraints arises in numerous applications, among them statistics. We are here interested in highly nonlinear entropic constraints that amount to determining first projections and finally the distance between entropic spheres in information geometry. This allows an LDP to be stated for the sequence of plug-in estimators of the Shannon entropy of any finite distribution. Depending on the number of modes, the associated good rate function is shown to be either the divergence with respect to the distribution of one of its escorts or a function of the weight of its modes. Finally, tests of entropy level using both the LDP and the distance between spheres are constructed and shown to behave well in terms of error probabilities.
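The basic objects above can be sketched numerically. The following is an illustrative implementation, not the speaker's code: the Shannon entropy, the Kullback-Leibler divergence, and the plug-in entropy estimator, i.e. the entropy of the empirical distribution of an i.i.d. sample (the function names are assumptions):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural logarithm)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 log 0 = 0
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), assuming supp(p) is in supp(q)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def plugin_entropy(sample, k):
    """Plug-in estimator: Shannon entropy of the empirical distribution
    of a sample taking values in {0, ..., k-1}."""
    counts = np.bincount(np.asarray(sample), minlength=k)
    return shannon_entropy(counts / counts.sum())
```

By the law of large numbers the plug-in estimator converges to the true entropy; the LDP discussed in the talk quantifies the exponential decay of the probability that it deviates from its target.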