Preprint Article Version 1 NOT YET PEER-REVIEWED

Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities

Frank Nielsen 1,2,* and Ke Sun 1
  1. École Polytechnique, Palaiseau 91128, France
  2. Sony Computer Science Laboratories Inc., Paris 75005, France
Version 1: Received: 20 October 2016 / Approved: 20 October 2016 / Online: 20 October 2016 (10:35:57 CEST)

How to cite: Nielsen, F.; Sun, K. Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities. Preprints 2016, 2016100086 (doi: 10.20944/preprints201610.0086.v1).

Abstract

Information-theoretic measures such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence of mixtures provably does not admit a closed-form formula, it is in practice either estimated using costly Monte Carlo stochastic integration, approximated, or bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy and the Kullback-Leibler divergence of mixtures. We illustrate the versatility of the method by reporting our experiments on approximating the Kullback-Leibler divergence between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures, and Gamma mixtures.
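To make the underlying idea concrete, here is a minimal sketch (not the authors' piecewise construction) of the plain log-sum-exp inequality, max_i y_i <= log sum_i exp(y_i) <= max_i y_i + log k, applied pointwise to the log-density of a k-component Gaussian mixture. The mixture weights, means and standard deviations below are illustrative assumptions chosen only for the demonstration.

```python
import numpy as np
from scipy.stats import norm

# Illustrative 2-component univariate Gaussian mixture (assumed parameters).
weights = np.array([0.3, 0.7])
means = np.array([-1.0, 2.0])
stds = np.array([0.5, 1.0])

def log_mixture_bounds(x):
    """Pointwise lower/upper bounds on log m(x) from the basic LSE inequality."""
    # y_i = log(w_i) + log p_i(x), so that log m(x) = logsumexp(y).
    y = np.log(weights) + norm.logpdf(x, loc=means, scale=stds)
    lower = np.max(y)                   # max_i y_i <= log m(x)
    upper = np.max(y) + np.log(len(y))  # log m(x) <= max_i y_i + log k
    exact = np.logaddexp.reduce(y)      # exact log m(x), for comparison
    return lower, exact, upper

for x in (-1.0, 0.5, 3.0):
    lo, ex, up = log_mixture_bounds(x)
    print(f"x={x:+.1f}  {lo:.4f} <= {ex:.4f} <= {up:.4f}")
```

The paper's contribution, as described in the abstract, is to refine this kind of bound piecewise over the support of the mixture so as to obtain guaranteed closed-form lower and upper bounds on the entropy, cross-entropy and Kullback-Leibler divergence; the sketch above only shows the elementary inequality such constructions build on.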

Subject Areas

information geometry; mixture models; log-sum-exp bounds
