IS COMPLETENESS NECESSARY? ESTIMATION IN NONIDENTIFIED LINEAR MODELS
Published online by Cambridge University Press: 27 February 2025
Abstract
Modern data analysis depends increasingly on estimating models via flexible high-dimensional or nonparametric machine learning methods, where the identification of structural parameters is often challenging and untestable. In linear settings, this identification hinges on the completeness condition, which requires the nonsingularity of a high-dimensional matrix or operator and may fail in finite samples or even at the population level. Regularized estimators provide a solution by enabling consistent estimation of structural or average structural functions, sometimes even under identification failure. We show that the asymptotic distribution in these cases can be nonstandard. We develop a comprehensive theory of regularized estimators, which includes methods such as high-dimensional ridge regularization, gradient descent, and principal component analysis (PCA). The results are illustrated for high-dimensional and nonparametric instrumental variable regressions and are supported by simulation experiments.
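The abstract's central point — that a regularized estimator can remain consistent for the structural function even when the completeness (nonsingularity) condition fails — can be illustrated with a minimal simulation. The sketch below is not the authors' implementation; it uses a simple ridge-regularized two-stage least squares estimator on a deliberately rank-deficient design, where unregularized 2SLS is undefined, and all variable names and the choice of penalty `alpha` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two instruments; two regressors, but the second is an exact multiple of
# the first, so the completeness matrix X' P_Z X is singular by construction.
z = rng.normal(size=(n, 2))
u = rng.normal(size=n)                         # endogenous error
x1 = z[:, 0] + 0.5 * u + rng.normal(size=n)
X = np.column_stack([x1, 2.0 * x1])            # rank-deficient design
y = X @ np.array([1.0, 0.5]) + u               # structural function: 2 * x1

Pz = z @ np.linalg.pinv(z.T @ z) @ z.T         # projection onto instruments
A = X.T @ Pz @ X                               # singular: plain 2SLS fails

alpha = 0.1                                    # illustrative ridge penalty
beta_ridge = np.linalg.solve(A + alpha * np.eye(2), X.T @ Pz @ y)

# The coefficient vector is not identified, but the ridge estimator picks the
# minimum-norm representative, and the fitted structural function X @ beta
# still recovers the identified object 2 * x1.
print(np.linalg.matrix_rank(A), beta_ridge)
```

In this toy example any beta with beta[0] + 2 * beta[1] = 2 is observationally equivalent; ridge regularization selects the one lying in the range of A, so the estimate concentrates near (0.4, 0.8) while the average structural function remains consistently estimated — the phenomenon the paper studies in far greater generality.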
- Type: ARTICLES
- Copyright: © The Author(s), 2025. Published by Cambridge University Press
Footnotes
We are grateful to the Editor, Co-Editor, and three anonymous referees for helpful comments. We are also grateful to the participants of the Duke workshop, TSE Econometrics seminar, Triangle Econometrics Conference, 4th ISNPS Conference, 2018 NASMES Conference, and Bristol Econometric Study Group. Jean-Pierre Florens acknowledges funding from the French National Research Agency (ANR) under the Investments for the Future program (Investissement d’Avenir, grant ANR-17-EURE-0010) and grant ANR-24-CE26-3681-01. All remaining errors are ours.