Published online by Cambridge University Press: 17 February 2009
We apply superadditivity and monotonicity properties associated with the Jensen discrete inequality to derive relationships between the entropy function of a probability vector and that of an arbitrary renormalized sub-vector. The results are extended to cover other entropy measures, such as joint entropy, conditional entropy and mutual information.
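To fix notation, the following is a minimal sketch of the standard definitions involved; the symbols $p$, $A$, $P_A$ and $p_A$ are illustrative choices rather than the paper's own notation, and the paper's inequalities are not reproduced here.
\[
H(p) = -\sum_{i=1}^{n} p_i \log p_i, \qquad
P_A = \sum_{i \in A} p_i, \qquad
p_A = \Bigl( \tfrac{p_i}{P_A} \Bigr)_{i \in A},
\]
where $p = (p_1,\dots,p_n)$ is a probability vector and $A \subseteq \{1,\dots,n\}$ satisfies $P_A > 0$, so that $p_A$ is the renormalized sub-vector supported on $A$. As a standard illustration of how such quantities interact (not a result of the paper), the grouping property gives
\[
H(p) = H(P_A,\, 1 - P_A) + P_A\, H(p_A) + (1 - P_A)\, H(p_{A^c}),
\]
where $H(P_A, 1-P_A)$ is the binary entropy of the split between $A$ and its complement $A^c$.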