
Linear Programming Estimators and Bootstrapping for Heavy Tailed Phenomena

Published online by Cambridge University Press: 01 July 2016

Paul D. Feigin*
Affiliation: Technion—Israel Institute of Technology

Sidney I. Resnick**
Affiliation: Cornell University

* Postal address: Faculty of Industrial Engineering and Management, Technion—Israel Institute of Technology, Haifa 32000, Israel. Email address: paulf@ie.technion.ac.il
** Postal address: School of Operations Research and Industrial Engineering, Cornell University, ETC Building, Ithaca, NY 14853, USA. Email address: sid@orie.cornell.edu

Abstract

For autoregressive time series with positive innovations which have either heavy right or heavy left tails, linear programming parameter estimates of the autoregressive coefficients have good rates of convergence. However, the asymptotic distribution of the estimators depends heavily on the distribution of the process and thus cannot be used for inference. A bootstrap procedure circumvents this difficulty. We verify the validity of the bootstrap and also give some general comments on the bootstrapping of heavy tailed phenomena.
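
To make the setting concrete, the Python sketch below illustrates the first-order case, in which the linear programming estimator of the autoregressive coefficient reduces to the minimum of the consecutive ratios X_t / X_{t-1}, together with a generic m-out-of-n resampling of that minimum. This is a minimal illustration, not the paper's exact procedure: the Pareto innovations, the resample size m, and the function names (simulate_ar1, lp_estimate, bootstrap_ci) are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(n, phi, alpha=1.5):
    """Positive AR(1) series X_t = phi*X_{t-1} + Z_t with Pareto(alpha) innovations."""
    z = rng.pareto(alpha, size=n) + 1.0        # heavy right tail, Z_t >= 1
    x = np.empty(n)
    x[0] = z[0] / (1.0 - phi)                  # rough stationary starting value
    for t in range(1, n):
        x[t] = phi * x[t - 1] + z[t]
    return x

def lp_estimate(x):
    """AR(1) case of the LP estimator: the smallest consecutive ratio X_t / X_{t-1}."""
    return np.min(x[1:] / x[:-1])

def bootstrap_ci(x, m, B=999, level=0.95):
    """m-out-of-n bootstrap of the minimum ratio (m << n);
    returns a basic bootstrap interval for phi."""
    phi_hat = lp_estimate(x)
    reps = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, len(x) - 1, size=m)   # m random starting indices
        reps[b] = np.min(x[idx + 1] / x[idx])       # minimum of m resampled ratios
    lo, hi = np.quantile(reps - phi_hat, [(1 - level) / 2, (1 + level) / 2])
    return phi_hat - hi, phi_hat - lo

x = simulate_ar1(n=2000, phi=0.6)
print("LP estimate of phi:", lp_estimate(x))
print("bootstrap 95% interval:", bootstrap_ci(x, m=200))

Since each ratio X_t / X_{t-1} = phi + Z_t / X_{t-1} exceeds phi, the estimator approaches phi from above; keeping the resample size m much smaller than n is the usual precaution when bootstrapping such extreme-value-type statistics, where a naive full-size bootstrap can fail.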

Type
General Applied Probability
Copyright
Copyright © Applied Probability Trust 1997 

Footnotes

Research supported by US–Israel Binational Science Foundation (BSF) Grant No. 92-00227/2 and NSF Grant DMS-9400535.
