Book contents
- Frontmatter
- Contents
- Preface
- 1 An Introduction to Computer-intensive Methods
- 2 Maximum Likelihood
- 3 The Jackknife
- 4 The Bootstrap
- 5 Randomization and Monte Carlo Methods
- 6 Regression Methods
- 7 Bayesian Methods
- References
- Appendix A An Overview of S-PLUS Methods Used in this Book
- Appendix B Brief Description of S-PLUS Subroutines Used in this Book
- Appendix C S-PLUS Codes Cited in Text
- Appendix D Solutions to Exercises
- Index
1 - An Introduction to Computer-intensive Methods
Published online by Cambridge University Press: 09 December 2009
Summary
What are computer-intensive data methods?
For the purposes of this book, I define computer-intensive methods as those that involve an iterative process and hence cannot readily be done except on a computer. The first case I examine is maximum likelihood estimation, which forms the basis of most of the parametric statistics taught in elementary statistics courses, though the derivation of these methods via maximum likelihood is probably not often given. Least squares estimation, for example, can be justified by the principle of maximum likelihood. In simple cases, such as estimation of the mean, the variance, and linear regression, analytical solutions can be obtained; in more complex cases, such as parameter estimation in nonlinear regression, maximum likelihood still defines the appropriate parameters, but the solution can be obtained only by numerical methods. Most computer statistical packages now offer model fitting by maximum likelihood, but they typically require the user to supply the model (logistic regression is a notable exception).
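The point above — that maximum likelihood often has no closed-form solution and must be maximized numerically, yet reproduces the familiar least-squares answers in simple cases — can be illustrated with a short sketch. The book's examples use S-PLUS; the following is a hypothetical Python equivalent (the data, starting values, and use of `scipy.optimize.minimize` are illustrative assumptions, not the book's code). It fits a normal model by numerically minimizing the negative log-likelihood and recovers the sample mean, the same estimate least squares would give.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: 200 draws from a normal distribution (assumed example).
rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=200)

def neg_log_lik(params, data):
    """Negative log-likelihood of a normal model, up to an additive constant."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # optimize on the log scale to keep sigma > 0
    n = data.size
    return n * np.log(sigma) + np.sum((data - mu) ** 2) / (2.0 * sigma**2)

# Numerical maximization of the likelihood (via minimizing its negative).
res = minimize(neg_log_lik, x0=[0.0, 0.0], args=(x,))
mu_hat = res.x[0]
sigma_hat = np.exp(res.x[1])

# For the normal model the ML estimate of mu coincides with the sample mean,
# which is also the least-squares estimate: the two principles agree here.
```

For nonlinear regression models, the same pattern applies: one writes down the likelihood and hands it to a numerical optimizer, since no analytical solution exists.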
The other methods discussed in this book may have histories as long as that of maximum likelihood, but none has been as widely applied, largely because, without the aid of computers, the methods are too time-consuming. Even on a fast computer, a computer-intensive method can consume hours, or even days, of computing time. It is therefore imperative that the appropriate technique be selected.
- Publisher: Cambridge University Press. Print publication year: 2006.