Book contents
- Frontmatter
- PART I Regression smoothing
- PART II The kernel method
- 4 How close is the smooth to the true curve?
- 5 Choosing the smoothing parameter
- 6 Data sets with outliers
- 7 Nonparametric regression techniques for correlated data
- 8 Looking for special features and qualitative smoothing
- 9 Incorporating parametric components
- PART III Smoothing in high dimensions
- Appendix 1
- Appendix 2
- References
- Name Index
- Subject Index
4 - How close is the smooth to the true curve?
from PART II - The kernel method
Published online by Cambridge University Press: 05 January 2013
Summary
It was, of course, fully recognized that the estimate might differ from the parameter in any particular case, and hence that there was a margin of uncertainty. The extent of this uncertainty was expressed in terms of the sampling variance of the estimator.
Sir M. Kendall and A. Stuart (1979, p. 109)

If the smoothing parameter is chosen as a suitable function of the sample size n, all of the above smoothers converge to the true curve as the number of observations increases. Of course, convergence of an estimator alone is not enough, as Kendall and Stuart note in the citation above. One is always interested in the extent of the uncertainty, that is, in how fast the convergence actually happens. Kendall and Stuart (1979) aptly describe how accuracy is assessed in classical parametric statistics: the extent of the uncertainty is expressed in terms of the sampling variance of the estimator, which usually tends to zero at the rate 1/n, so that the estimator itself converges at the rate of the square root of the sample size n.
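As a supplementary illustration of this classical parametric situation (not part of the original text), consider the sample mean of n i.i.d. observations with mean \(\mu\) and variance \(\sigma^2\):
\[
\operatorname{Var}(\bar X_n) = \frac{\sigma^2}{n} \;\longrightarrow\; 0,
\qquad
\sqrt{n}\,(\bar X_n - \mu) \;\xrightarrow{\;d\;}\; N(0,\sigma^2),
\]
so the sampling variance shrinks like \(1/n\) and the estimator converges at the parametric rate \(n^{-1/2}\); the variance alone fully describes the uncertainty because the estimator is unbiased.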
The nonparametric smoothing situation contrasts with this: the variance alone does not fully quantify the convergence of curve estimators. There is also a bias, which is typical of smoothing techniques. This is the deeper reason why, up to this chapter, precision has been measured in terms of the pointwise mean squared error (MSE), the sum of the variance and the squared bias. The variance alone does not tell the whole story when the estimator is biased.
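To make this decomposition explicit (a standard identity stated here as a supplement, where \(\hat m_h\) denotes a kernel smoother with bandwidth h and m the true regression curve):
\[
\mathrm{MSE}\{\hat m_h(x)\}
= E\big[\{\hat m_h(x) - m(x)\}^2\big]
= \operatorname{Var}\{\hat m_h(x)\}
+ \big[E\,\hat m_h(x) - m(x)\big]^2 .
\]
For a second-order kernel and a sufficiently smooth m, the variance is typically of order \((nh)^{-1}\) while the bias is of order \(h^2\), so the two terms pull the bandwidth h in opposite directions; balancing them is the central theme of this chapter.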
- Type: Chapter
- Information: Applied Nonparametric Regression, pp. 89-146
- Publisher: Cambridge University Press
- Print publication year: 1990