A Scale-invariant Generalization of Renyi Entropy and Related Optimizations under Tsallis' Nonextensive Framework
Entropy and cross-entropy are two fundamental concepts in information theory and statistical physics, and are widely used for statistical inference across disciplines. In this paper, we discuss a two-parameter generalization of the popular Renyi entropy and the associated optimization problems. We derive the desired entropic characteristics of the new generalized entropy measure, including its positivity, expandability, extensivity and generalized (sub-)additivity. More importantly, when considered over the class of sub-probabilities, our new family turns out to be scale invariant, a property that does not hold for most existing generalized entropy measures. We also propose the corresponding cross-entropy measures, a new two-parameter family that is scale invariant in its first argument (viewed as the variable). The maximization of the new entropy measure and the minimization of the corresponding cross-entropy measure are carried out explicitly under the non-extensive framework, and the corresponding properties are derived. In particular, we consider the constraints given by the Tsallis normalized q-expectations, which lead to the so-called 'third-choice' non-extensive thermodynamics. In this context, we identify, for the first time, a class of entropy measures -- a subfamily of our two-parameter generalization -- that leads to the classical (extensive) Maxwell-Boltzmann theory of exponential-type (Gaussian) MaxEnt distributions under the non-extensive constraints. Our new family thus provides a wide range of entropy and cross-entropy measures, combining the extensive and non-extensive MaxEnt theories under one umbrella.
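The paper's two-parameter family is not reproduced here, but the scale-invariance property claimed above can be illustrated on the classical Renyi entropy, which the new family generalizes: the following minimal NumPy sketch (an illustration, not the paper's construction) shows that the ordinary Renyi entropy of a sub-probability vector changes under rescaling, whereas a normalized variant, which applies the same formula after dividing by the total mass, is invariant by construction.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Classical Renyi entropy: H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def normalized_renyi_entropy(p, alpha):
    """Renyi entropy applied after normalization; invariant under p -> c * p."""
    p = np.asarray(p, dtype=float)
    return renyi_entropy(p / p.sum(), alpha)

p = np.array([0.2, 0.1, 0.1])  # a sub-probability vector (total mass 0.4 < 1)
c, alpha = 3.0, 2.0

# Plain Renyi entropy shifts by (alpha / (1 - alpha)) * log(c) under rescaling ...
print(renyi_entropy(p, alpha), renyi_entropy(c * p, alpha))
# ... while the normalized variant returns the same value for p and c * p.
print(normalized_renyi_entropy(p, alpha), normalized_renyi_entropy(c * p, alpha))
```

Since `sum((c * p_i) ** alpha) = c ** alpha * sum(p_i ** alpha)`, the plain entropy picks up the additive term `alpha * log(c) / (1 - alpha)`, which is exactly what normalization removes.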