On Strong Data-Processing and Majorization Inequalities with Applications to Coding Problems

03/31/2021
by Igal Sason et al.

This work provides data-processing and majorization inequalities for f-divergences, and it considers some of their applications to coding problems. It also provides tight bounds on the Rényi entropy of a function of a discrete random variable that takes a finite number of values, where the function is not one-to-one; the derivation is based on majorization and the Schur-concavity of the Rényi entropy. One application of the f-divergence inequalities concerns the performance analysis of list decoding with either fixed or variable list sizes: some earlier bounds on the list decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application studies how well the probability mass function induced by the leaves of a Tunstall tree is approximated by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed to assess their closeness to the Shannon entropy of a stationary, memoryless discrete source. In view of the tight bounds on the Rényi entropy and the work by Campbell, non-asymptotic bounds are derived for lossless data compression of discrete memoryless sources.
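To make the majorization machinery concrete, here is a minimal Python sketch, not taken from the paper itself; the distributions `p` and `q` are hypothetical examples. It computes the Rényi entropy and numerically checks its Schur-concavity: if `p` majorizes `q`, then H_α(p) ≤ H_α(q) for every order α.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats); alpha = 1 gives Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))        # Shannon limit as alpha -> 1
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def majorizes(p, q, tol=1e-12):
    """True if p majorizes q: the partial sums of the descending-sorted
    vectors satisfy sum_{i<=k} p_[i] >= sum_{i<=k} q_[i] for all k."""
    p_sorted = np.sort(p)[::-1]
    q_sorted = np.sort(q)[::-1]
    return np.all(np.cumsum(p_sorted) >= np.cumsum(q_sorted) - tol)

# Hypothetical example: p majorizes q (q is closer to equiprobable),
# so Schur-concavity gives H_alpha(p) <= H_alpha(q) for every alpha.
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
assert majorizes(p, q)
for alpha in (0.5, 1.0, 2.0):
    hp, hq = renyi_entropy(p, alpha), renyi_entropy(q, alpha)
    assert hp <= hq
    print(f"alpha={alpha}: H(p)={hp:.4f} <= H(q)={hq:.4f}")
```

Similarly, the following sketch illustrates the Tunstall construction mentioned above; it is an illustrative assumption about the standard algorithm, not code from the paper. A Tunstall tree is grown by repeatedly splitting the most probable leaf into one child per source symbol; the resulting leaf probabilities form the PMF whose closeness to an equiprobable distribution is quantified in the work.

```python
import heapq

def tunstall_leaf_pmf(source_probs, num_leaves):
    """Grow a Tunstall tree by repeatedly splitting the most probable leaf
    into one child per source symbol; return the PMF induced by the leaves."""
    leaves = [-1.0]                          # root is the single initial leaf
    # Each split removes one leaf and adds |alphabet| new ones.
    while len(leaves) + len(source_probs) - 1 <= num_leaves:
        p = -heapq.heappop(leaves)           # most probable leaf (max-heap via negation)
        for s in source_probs:
            heapq.heappush(leaves, -p * s)
    return sorted(-x for x in leaves)

# Hypothetical example: binary source with P = (0.8, 0.2), at most 8 leaves.
pmf = tunstall_leaf_pmf([0.8, 0.2], 8)
print(pmf, sum(pmf))                         # leaf probabilities; they sum to 1
```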
