
14 - Distributed Statistical Inference with Compressed Data

Published online by Cambridge University Press: 22 March 2021

Edited by Miguel R. D. Rodrigues, University College London, and Yonina C. Eldar, Weizmann Institute of Science, Israel

Summary

This chapter introduces basic information-theoretic models for distributed statistical inference with compressed data, and discusses current and future research directions and challenges in applying these models to various statistical learning problems. In these applications, data are distributed across multiple terminals, which communicate with each other via limited-capacity channels. Instead of first recovering the data at a centralized location and then performing inference, this chapter describes schemes that perform statistical inference without recovering the underlying data. Information-theoretic tools are used to characterize the fundamental limits of classical statistical inference problems operating directly on compressed data. The chapter first introduces distributed statistical learning problems, then discusses models and results for distributed inference, and finally describes new directions that generalize and improve on the basic scenarios.
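The idea of inferring without recovering can be illustrated with a toy simulation. In the sketch below (a minimal, self-contained illustration, not a scheme from the chapter), one terminal observes X, compresses it to a single bit per sample over its rate-limited link, and a fusion center that observes Y in full tests whether X and Y are independent using only the compressed bits; the correlation model, quantizer, and threshold are all illustrative assumptions.

```python
import random

def quantize(x):
    """One-bit scalar quantizer at the remote terminal: only sign(x) is sent."""
    return 1.0 if x >= 0 else -1.0

def detect_dependence(n=20000, rho=0.8, threshold=0.1, seed=0):
    """Fusion center test of H1 (X, Y correlated) vs H0 (independent).

    The center never sees X itself, only its 1-bit compression; it
    accumulates a correlation-type statistic between the received bits
    and its own observations Y. Set rho=0 to simulate H0.
    """
    rng = random.Random(seed)
    stat = 0.0
    for _ in range(n):
        x = rng.gauss(0, 1)
        # Jointly Gaussian pair with correlation coefficient rho.
        y = rho * x + (1 - rho**2) ** 0.5 * rng.gauss(0, 1)
        stat += quantize(x) * y
    stat /= n
    return abs(stat) > threshold  # declare H1 when correlation is evident
```

Even this crude one-bit scheme separates the hypotheses reliably: under H1 the statistic concentrates near rho·E[|X|], well above the threshold, while under H0 it concentrates near zero. Characterizing the best achievable trade-off between communication rate and error probability is precisely the kind of question the information-theoretic models in this chapter address.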

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2021


