UNIFORM-IN-SUBMODEL BOUNDS FOR LINEAR REGRESSION IN A MODEL-FREE FRAMEWORK
Published online by Cambridge University Press: 04 June 2021
Abstract
For the last two decades, high-dimensional data and methods have proliferated throughout the literature. Yet, the classical technique of linear regression has not lost its usefulness in applications. In fact, many high-dimensional estimation techniques can be seen as variable selection that leads to a smaller set of variables (a “submodel”) where classical linear regression applies. We analyze linear regression estimators resulting from model selection by proving estimation error and linear representation bounds uniformly over sets of submodels. Based on deterministic inequalities, our results provide “good” rates when applied to both independent and dependent data. These results are useful in meaningfully interpreting the linear regression estimator obtained after exploring and reducing the variables and also in justifying post-model-selection inference. All results are derived under no model assumptions and are nonasymptotic in nature.
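The abstract's two-stage view (data-driven variable selection producing a submodel, then classical OLS on the selected variables) can be sketched as follows. This is a minimal illustration, not the paper's procedure: the selection rule (marginal correlation screening with a hypothetical threshold `tau`) and all simulated quantities are assumptions for demonstration only.

```python
import numpy as np

# Simulated data: only the first 3 of p = 10 variables enter the true model.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)

# Step 1: data-driven variable selection yielding a submodel.
# Illustrative rule: keep variables whose marginal correlation with y
# exceeds a threshold tau (chosen arbitrarily here).
tau = 0.3
corr = np.abs(X.T @ y) / n
selected = np.flatnonzero(corr > tau)

# Step 2: classical OLS restricted to the selected submodel.
X_sub = X[:, selected]
beta_hat, *_ = np.linalg.lstsq(X_sub, y, rcond=None)

print("selected submodel:", selected)
print("OLS on submodel:", np.round(beta_hat, 2))
```

Because the submodel is chosen from the same data, the OLS estimator above is a post-selection estimator; the paper's uniform-in-submodel bounds are what justify interpreting it and conducting inference after such selection.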
- Type
- ARTICLES
- Information
- Econometric Theory, Volume 39, Issue 6: SPECIAL ISSUE IN HONOR OF BENEDIKT M. PÖTSCHER, December 2023, pp. 1202–1248
- Creative Commons
- This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
- Copyright
- © The Author(s), 2021. Published by Cambridge University Press
Footnotes
We would like to thank Abhishek Chakrabortty for discussions that led to Remark 4.5. We would also like to thank the reviewers and the Editor for their constructive comments which have led to a better presentation.