A derivation of the information criteria for selecting autoregressive models
Published online by Cambridge University Press: 01 July 2016
Abstract
The Akaike information criterion, AIC, for autoregressive model selection is derived by adopting −2T times the expected predictive density of a future observation of an independent process as a loss function, where T is the length of the observed time series. The conditions under which AIC provides an asymptotically unbiased estimator of the corresponding risk function are derived. When the unbiasedness property fails, the use of AIC is justified heuristically; however, a method for estimating the risk function, applicable for all fitted orders, is given. A derivation of the generalized information criterion, AICα, is also given; the loss function used is obtained by modifying the Kullback–Leibler information measure. Results paralleling those for AIC are also obtained for the AICα criterion.
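The criteria discussed in the abstract can be illustrated with a simple numerical sketch. The snippet below is not the paper's derivation: it uses the familiar working form of the criterion for a fitted AR(p) model, T·log(σ̂²) + αp, where σ̂² is the residual variance from a conditional least-squares fit and α = 2 recovers the usual AIC; the function names and the simulated AR(2) example are illustrative assumptions.

```python
import numpy as np

def fit_ar_ols(x, p):
    """Fit an AR(p) model by conditional least squares; return residual variance."""
    T = len(x)
    if p == 0:
        return np.mean((x - np.mean(x)) ** 2)
    # Design matrix of lagged values: row for time t holds x[t-1], ..., x[t-p]
    X = np.column_stack([x[p - k - 1 : T - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return np.mean(resid ** 2)

def aic_alpha(x, p, alpha=2.0):
    """Generalized criterion T * log(sigma_hat^2) + alpha * p; alpha=2 gives AIC."""
    T = len(x)
    return T * np.log(fit_ar_ols(x, p)) + alpha * p

rng = np.random.default_rng(0)
# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
T = 500
x = np.zeros(T)
for t in range(2, T):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

# Score candidate orders and select the minimizer
scores = {p: aic_alpha(x, p) for p in range(6)}
best = min(scores, key=scores.get)
print("selected order:", best)
```

Raising α above 2 penalizes each additional parameter more heavily, which is the sense in which AICα generalizes AIC toward more parsimonious order selection.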
- Type: Research Article
- Copyright © Applied Probability Trust 1986