An entropy inequality for symmetric random variables
We establish a lower bound on the entropy of weighted sums of (possibly dependent) random variables (X_1, X_2, ..., X_n) possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of (X_1, X_2, ..., X_n). We show that for n ≥ 3, the lower bound is tight if and only if the X_i are i.i.d. Gaussian random variables. For n = 2, there are numerous other cases of equality besides i.i.d. Gaussians, which we completely characterize. Going beyond sums, we also present an inequality for certain linear transformations of (X_1, ..., X_n). Our primary technical contribution lies in the analysis of the equality cases, and our approach exploits the geometry and the symmetry of the problem.
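To make the statement concrete, here is a hedged sketch, in display LaTeX, of one natural shape such a bound can take. The 1/n factor, the unit-norm condition on the weights, and the use of differential entropy h are assumptions of this sketch rather than claims lifted from the abstract; the precise hypotheses are those of the paper.

```latex
% A hedged sketch of the shape the bound can take; the 1/n factor and
% the unit-norm condition on a = (a_1, ..., a_n) are assumptions of
% this sketch, not claims from the abstract.
% Assumed setup: h is differential entropy, (X_1, ..., X_n) has a
% symmetric joint distribution, and \sum_i a_i^2 = 1.
\[
  h\!\left(\sum_{i=1}^{n} a_i X_i\right)
  \;\ge\; \frac{1}{n}\, h(X_1, X_2, \dots, X_n).
\]
% Per the abstract: for n >= 3 equality holds if and only if the X_i
% are i.i.d. Gaussian; for n = 2 there are further equality cases.
```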