On the entropy and information of Gaussian mixtures

08/30/2023
by Alexandros Eskenazis, et al.

We establish several convexity properties for the entropy and Fisher information of mixtures of centered Gaussian distributions. First, we prove that if X_1, X_2 are independent scalar Gaussian mixtures, then the entropy of √t X_1 + √(1-t) X_2 is concave in t ∈ [0,1], thus confirming a conjecture of Ball, Nayar and Tkocz (2016) for this class of random variables. In fact, we prove a generalisation of this assertion which also strengthens a result of Eskenazis, Nayar and Tkocz (2018). For the Fisher information, we extend a convexity result of Bobkov (2022) by showing that the Fisher information matrix is operator convex as a matrix-valued function acting on densities of mixtures in ℝ^d. As an application, we establish rates for the convergence of the Fisher information matrix of weighted sums of i.i.d. Gaussian mixtures in the operator norm along the central limit theorem under mild moment assumptions.
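
To fix notation, the following is a minimal LaTeX sketch of the first result. The symbol h for differential entropy and the characterisation of a centered Gaussian mixture as X = YZ (with Y ≥ 0 independent of a standard Gaussian Z) follow the standard conventions of the authors' earlier work and are assumptions of this sketch, not text quoted from the paper.

% Minimal sketch of the concavity statement, under assumed (standard) notation.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
A random variable $X$ is a (centered) Gaussian mixture if
$X \stackrel{d}{=} YZ$, where $Y \ge 0$ is independent of a standard
Gaussian $Z$, and
\[
  h(X) = -\int_{\mathbb{R}} f_X(x) \log f_X(x)\,\mathrm{d}x
\]
is its differential entropy. The first result asserts that for independent
scalar Gaussian mixtures $X_1, X_2$, the map
\[
  t \longmapsto h\!\left(\sqrt{t}\,X_1 + \sqrt{1-t}\,X_2\right)
\]
is concave on $[0,1]$, confirming the Ball--Nayar--Tkocz (2016) conjecture
for this class of random variables.
\end{document}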
