Book contents
- Frontmatter
- Contents
- Preface for the First Edition
- Preface for the Second Edition
- Acronyms and Abbreviations
- Notation
- 1 Introduction
- Part I Estimation Machinery
- 2 Primer on Probability Theory
- 3 Linear-Gaussian Estimation
- 4 Nonlinear Non-Gaussian Estimation
- 5 Handling Nonidealities in Estimation
- 6 Variational Inference
- Part II Three-Dimensional Machinery
- Part III Applications
- Part IV Appendices
- References
- Index
6 - Variational Inference
from Part I - Estimation Machinery
Published online by Cambridge University Press: 11 January 2024
Summary
This chapter takes a step back and revisits nonlinear estimation through the lens of variational inference, another concept common in the machine learning world. Estimation is posed as minimizing a data-likelihood objective, the Kullback-Leibler divergence between a Gaussian estimate and the true Bayesian posterior. We follow through the consequences of this starting point and show that we can arrive at many of the algorithms presented earlier through appropriate approximations, but can also open the door to new possibilities. For example, a derivative-free batch estimator that uses sigmapoints is discussed. Variational inference also provides a principled approach to learning parameters in our estimators from training data (i.e., parameters of our motion and observation models).
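As a brief sketch of the objective described above (using illustrative symbols rather than the book's exact notation), suppose the Gaussian estimate is q(x) = N(mu, Sigma) and the data are z. Minimizing the Kullback-Leibler divergence from q to the true posterior p(x | z) is then equivalent to minimizing an expected negative log joint likelihood plus a Gaussian entropy term:

```latex
% Hedged sketch; notation is illustrative, not necessarily the book's.
% q(x) = N(mu, Sigma) is the Gaussian estimate, p(x | z) the true Bayesian posterior.
\begin{align}
  \mathrm{KL}\bigl(q(\mathbf{x}) \,\|\, p(\mathbf{x} \mid \mathbf{z})\bigr)
    &= \mathbb{E}_{q}\bigl[\ln q(\mathbf{x})\bigr]
       - \mathbb{E}_{q}\bigl[\ln p(\mathbf{x}, \mathbf{z})\bigr]
       + \ln p(\mathbf{z}) \\
    &= \underbrace{\mathbb{E}_{q}\bigl[-\ln p(\mathbf{x}, \mathbf{z})\bigr]
       - \tfrac{1}{2}\ln\bigl|2\pi e\,\boldsymbol{\Sigma}\bigr|}_{\text{minimized over } (\boldsymbol{\mu},\,\boldsymbol{\Sigma})}
       \;+\; \underbrace{\ln p(\mathbf{z})}_{\text{constant in } q}
\end{align}
```

The expectation over q is the quantity that must be approximated in practice; approximating it with sigmapoints rather than derivatives is what leads to the derivative-free batch estimator mentioned in the summary.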
- Type: Chapter
- Information: State Estimation for Robotics, Second Edition, pp. 182-214
- Publisher: Cambridge University Press
- Print publication year: 2024