Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgements
- PART I GENESIS OF DATA ASSIMILATION
- PART II DATA ASSIMILATION: DETERMINISTIC/STATIC MODELS
- 5 Linear least squares estimation: method of normal equations
- 6 A geometric view: projection and invariance
- 7 Nonlinear least squares estimation
- 8 Recursive least squares estimation
- PART III COMPUTATIONAL TECHNIQUES
- PART IV STATISTICAL ESTIMATION
- PART V DATA ASSIMILATION: STOCHASTIC/STATIC MODELS
- PART VI DATA ASSIMILATION: DETERMINISTIC/DYNAMIC MODELS
- PART VII DATA ASSIMILATION: STOCHASTIC/DYNAMIC MODELS
- PART VIII PREDICTABILITY
- Epilogue
- References
- Index
6 - A geometric view: projection and invariance
from PART II - DATA ASSIMILATION: DETERMINISTIC/STATIC MODELS
Published online by Cambridge University Press: 18 December 2009
Summary
In this chapter we revisit the linear least squares estimation problem and solve it using the method of orthogonal projection. This geometric view is quite fundamental and has guided the development and extension of least squares solutions in several directions. In Section 6.1 we describe the basic principles of orthogonal projections, namely, projecting a vector z onto a single vector h. In Section 6.2 we discuss the extension of this idea to projecting a given vector z onto the subspace spanned by the columns of the measurement matrix H ∈ ℝm×n. An interesting outcome of this exercise is that the set of linear equations defining the optimal estimate by this geometric method is identical to that derived from the method of normal equations. This invariance of the least squares solution with respect to the method of derivation underscores the importance of this class of solutions. Section 6.3 develops the geometric equivalent of the weighted or generalized linear least squares problem. It is shown that the optimal solution is given by an oblique projection as opposed to an orthogonal projection. In Section 6.4 we derive conditions for the invariance of least squares solutions under linear transformations of both the model space ℝn and the observation space ℝm. It turns out that invariance is achievable within the framework of the generalized or weighted least squares formulation.
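The equivalence claimed above between the geometric method and the method of normal equations can be checked numerically: projecting z orthogonally onto the column space of H yields exactly H x̂, where x̂ solves the normal equations HᵀHx = Hᵀz. The following sketch uses illustrative values for H and z (they are not taken from the text) and assumes H has full column rank.

```python
# Sketch: orthogonal projection onto span(H) reproduces the
# normal-equations least squares estimate. H and z are illustrative.
import numpy as np

H = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])                  # measurement matrix, m x n, full column rank
z = np.array([0.9, 2.1, 2.9, 4.2, 4.8])     # observation vector in R^m

# Method of normal equations: solve (H^T H) x = H^T z.
x_normal = np.linalg.solve(H.T @ H, H.T @ z)

# Geometric method: orthogonal projector P = H (H^T H)^{-1} H^T,
# applied to z, gives the closest point to z in the column space of H.
P = H @ np.linalg.solve(H.T @ H, H.T)
z_hat = P @ z

# The projected vector equals H times the normal-equations estimate,
# and the residual z - z_hat is orthogonal to every column of H.
print(np.allclose(z_hat, H @ x_normal))     # True
print(np.allclose(H.T @ (z - z_hat), 0.0))  # True
```

Both checks succeed for any full-rank H, which is the invariance the chapter highlights: the two derivations define one and the same estimate.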
Orthogonal projection: basic idea
Let h = (h1, h2, …, hm)T ∈ ℝm be the given vector representing the measurement system, and let z = (z1, z2, …, zm)T ∈ ℝm be a set of m observations, where it is assumed that z is not a multiple of h. Refer to Figure 6.1.1.
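For the single-vector case just introduced, the orthogonal projection of z onto h is ẑ = ((hᵀz)/(hᵀh)) h, with the scalar hᵀz/hᵀh playing the role of the least squares estimate; the residual z − ẑ is orthogonal to h. A minimal sketch, with illustrative values of h and z (not taken from the text):

```python
# Sketch: projecting z onto a single vector h. Values are illustrative.
import numpy as np

h = np.array([1.0, 2.0, 2.0])   # measurement vector
z = np.array([3.0, 1.0, 2.0])   # observations; z is not a multiple of h

alpha = (h @ z) / (h @ h)       # scalar least squares estimate, h^T z / h^T h
z_hat = alpha * h               # orthogonal projection of z onto h

print(alpha)                    # -> 1.0 for these values
print(np.isclose(h @ (z - z_hat), 0.0))   # residual orthogonal to h: True
```

Geometrically, ẑ is the foot of the perpendicular dropped from z onto the line spanned by h, which is the configuration depicted in Figure 6.1.1.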
Dynamic Data Assimilation: A Least Squares Approach, pp. 121–132. Publisher: Cambridge University Press. Print publication year: 2006.