
BOUNDS ON EXTROPY WITH VARIATIONAL DISTANCE CONSTRAINT

Published online by Cambridge University Press:  06 April 2018

Jianping Yang
Affiliation:
Department of Mathematical Sciences, School of Science, Zhejiang Sci-Tech University, Hangzhou, Zhejiang 310018, People's Republic of China, E-mail: yangjp@zstu.edu.cn
Wanwan Xia
Affiliation:
Department of Statistics and Finance, School of Management, University of Science and Technology of China, Hefei, Anhui 230026, People's Republic of China, E-mails: xiaww@mail.ustc.edu.cn; thu@ustc.edu.cn
Taizhong Hu
Affiliation:
Department of Statistics and Finance, School of Management, University of Science and Technology of China, Hefei, Anhui 230026, People's Republic of China, E-mails: xiaww@mail.ustc.edu.cn; thu@ustc.edu.cn

Abstract

The relation between extropy and variational distance is studied in this paper. We determine the distributions that attain the minimum or maximum extropy among all distributions within a given variational distance from any given probability distribution, obtain the tightest upper bound on the difference between the extropies of any two probability distributions subject to a variational distance constraint, and establish an analytic formula for the confidence interval of an extropy. This study parallels that of Ho and Yeung [3] concerning entropy. However, the proofs of the main results in this paper differ from those in Ho and Yeung [3]; in fact, our arguments simplify several proofs in Ho and Yeung [3].
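For readers unfamiliar with the quantities involved, the following minimal sketch (not part of the paper) computes the extropy of a discrete distribution as defined by Lad, Sanfilippo, and Agrò [4] and the variational distance as used by Ho and Yeung [3]; the function names and the example distributions are illustrative assumptions only.

```python
# Sketch: discrete extropy and variational distance.
# Definitions assumed: J(P) = -sum_i (1 - p_i) * log(1 - p_i)  (Lad et al. [4])
#                      V(P, Q) = sum_x |P(x) - Q(x)|            (Ho & Yeung [3]; no factor 1/2)
import numpy as np

def extropy(p):
    """Extropy of a discrete distribution given as a probability vector p."""
    q = 1.0 - np.asarray(p, dtype=float)
    # Terms with p_i = 1 contribute 0, by the convention 0 * log 0 = 0.
    return -np.sum(np.where(q > 0, q * np.log(np.where(q > 0, q, 1.0)), 0.0))

def variational_distance(p, q):
    """Variational distance between two distributions on a common alphabet."""
    return np.sum(np.abs(np.asarray(p, dtype=float) - np.asarray(q, dtype=float)))

# Hypothetical example: two distributions on a three-letter alphabet.
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
print(extropy(P), extropy(Q), variational_distance(P, Q))
```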

Type
Research Article
Copyright
Copyright © Cambridge University Press 2018 

References

1. Barbour, A.D., Johnson, O., Kontoyiannis, I., & Madiman, M. (2010). Compound Poisson approximation via information functionals. Electronic Journal of Probability 15: paper no. 42, 1344–1368.
2. Cover, T.M. & Thomas, J.A. (2006). Elements of Information Theory, 2nd ed. New York: Wiley-Interscience.
3. Ho, S.-W. & Yeung, R.W. (2010). The interplay between entropy and variational distance. IEEE Transactions on Information Theory 56(12): 5906–5929.
4. Lad, F., Sanfilippo, G., & Agrò, G. (2015). Extropy: complementary dual of entropy. Statistical Science 30(1): 40–58.
5. Ley, C. & Swan, Y. (2013). Local Pinsker inequalities via Stein's discrete density approach. IEEE Transactions on Information Theory 59(9): 5584–5591.
6. Marshall, A.W., Olkin, I., & Arnold, B.C. (2011). Inequalities: Theory of Majorization and Its Applications, 2nd ed. New York: Springer.
7. Sason, I. & Verdú, S. (2015). Upper bounds on the relative entropy and Rényi divergence as a function of total variation distance for finite alphabets. Proceedings of the 2015 IEEE Information Theory Workshop, Jeju, Korea, October 2015, pp. 214–218.
8. Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal 27: 379–423 and 623–656.
9. Topsøe, F. (2001). Basic concepts, identities and inequalities—the toolkit of information theory. Entropy 3: 162–190.
10. Verdú, S. (2014). Total variation distance and the distribution of relative information. Information Theory and Applications Workshop (ITA), San Diego, CA, USA, February 2014, pp. 1–3.
11. Weissman, T., Ordentlich, E., Seroussi, G., Verdú, S., & Weinberger, M. (2005). Universal discrete denoising: known channel. IEEE Transactions on Information Theory 51(1): 5–28.