PARAMETERS ON THE BOUNDARY IN PREDICTIVE REGRESSION
Published online by Cambridge University Press: 21 February 2025
Abstract
We consider bootstrap inference in predictive (or Granger-causality) regressions when the parameter of interest may lie on the boundary of the parameter space, here defined by means of a smooth inequality constraint. This situation occurs, for instance, when the parameter space is defined so as to allow for either no predictability or sign-restricted predictability. We show that, in this context, constrained estimation gives rise to bootstrap statistics whose limit distribution is, in general, random and thus distinct from the limit null distribution of the original statistics of interest. This is due to both (i) the possible location of the true parameter vector on the boundary of the parameter space and (ii) the possible non-stationarity of the posited predicting (resp. Granger-causing) variable. We discuss a modification of the standard fixed-regressor wild bootstrap scheme in which the bootstrap parameter space is shifted by a data-dependent function so as to eliminate the portion of limiting bootstrap randomness attributable to the boundary. We then prove validity of the associated bootstrap inference, with non-stationarity of the predicting variable remaining as the only source of limiting bootstrap randomness. Our approach, which is initially presented in a simple location model, has bearing on inference in parameter-on-the-boundary situations beyond the predictive regression problem.
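The general idea can be illustrated, in very rough terms, in the simple location model the abstract mentions. The sketch below is a minimal, assumption-laden illustration in Python: it assumes a location model y_t = θ + ε_t with the sign restriction θ ≥ 0, Gaussian wild-bootstrap weights, and a simple threshold rule (the function `wild_bootstrap_pvalue` and the parameter `shrink_rate` are hypothetical names introduced here) for deciding whether the boundary should remain active in the bootstrap world. It is not the paper's exact data-dependent shift, statistic, or asymptotic framework, only a schematic of the kind of boundary-aware bootstrap the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)


def constrained_estimate(y, lower=0.0):
    """Constrained estimator in the location model y_t = theta + eps_t, theta >= lower."""
    return max(lower, y.mean())


def wild_bootstrap_pvalue(y, theta0=0.0, B=999, shrink_rate=0.5):
    """Hedged sketch of a wild bootstrap with a data-dependent treatment of the
    boundary (illustrative only; the threshold rule below is NOT the paper's
    exact shift of the bootstrap parameter space)."""
    n = len(y)
    theta_hat = constrained_estimate(y, lower=theta0)
    resid = y - theta_hat                      # residuals under the constrained fit
    stat = np.sqrt(n) * (theta_hat - theta0)   # statistic for H0: theta = theta0

    # Data-dependent rule: keep the boundary active in the bootstrap world only
    # when the estimate is 'close' to it (within n^{-shrink_rate}); otherwise
    # move the boundary away so it cannot distort the bootstrap distribution.
    close_to_boundary = (theta_hat - theta0) <= n ** (-shrink_rate)
    boot_lower = theta_hat if close_to_boundary else -np.inf

    boot_stats = np.empty(B)
    for b in range(B):
        w = rng.standard_normal(n)             # Gaussian wild-bootstrap weights
        y_star = theta_hat + w * resid         # bootstrap sample around the constrained fit
        theta_star = max(boot_lower, y_star.mean())
        boot_stats[b] = np.sqrt(n) * (theta_star - theta_hat)

    return np.mean(boot_stats >= stat)         # one-sided bootstrap p-value
```

As a usage example, `wild_bootstrap_pvalue(rng.standard_normal(200) + 0.1)` returns a one-sided bootstrap p-value for H0: θ = 0 against θ > 0 in this toy setting; the point of the threshold device is that, when θ is well inside the parameter space, the boundary does not contaminate the bootstrap limit distribution.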
- Type: MISCELLANEA
- Copyright: © The Author(s), 2025. Published by Cambridge University Press
Footnotes
We thank Yixiao Sun (co-editor), two anonymous referees, Rasmus Søndergaard Pedersen and Mervyn Silvapulle for comments. Financial support from the Italian Ministry of University and Research (PRIN 2020 Grant 2020B2AKFW) is gratefully acknowledged.