Book contents
- Frontmatter
- Contents
- Contributors
- Preface
- 1 The Modern Mathematics of Deep Learning
- 2 Generalization in Deep Learning
- 3 Expressivity of Deep Neural Networks
- 4 Optimization Landscape of Neural Networks
- 5 Explaining the Decisions of Convolutional and Recurrent Neural Networks
- 6 Stochastic Feedforward Neural Networks: Universal Approximation
- 7 Deep Learning as Sparsity-Enforcing Algorithms
- 8 The Scattering Transform
- 9 Deep Generative Models and Inverse Problems
- 10 Dynamical Systems and Optimal Control Approach to Deep Learning
- 11 Bridging Many-Body Quantum Physics and Deep Learning via Tensor Networks
2 - Generalization in Deep Learning
Published online by Cambridge University Press: 29 November 2022
Summary
This chapter provides theoretical insights into why and how deep learning can generalize well, despite its large capacity, complexity, possible algorithmic instability, non-robustness, and sharp minima, responding to an open question in the literature. We also discuss approaches to provide non-vacuous generalization guarantees for deep learning. On the basis of the theoretical observations, we propose new open problems.
- Type: Chapter
- Information: Mathematical Aspects of Deep Learning, pp. 112–148. Publisher: Cambridge University Press. Print publication year: 2022.