Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Foundations of Smooth Optimization
- 3 Descent Methods
- 4 Gradient Methods Using Momentum
- 5 Stochastic Gradient
- 6 Coordinate Descent
- 7 First-Order Methods for Constrained Optimization
- 8 Nonsmooth Functions and Subgradients
- 9 Nonsmooth Optimization Methods
- 10 Duality and Algorithms
- 11 Differentiation and Adjoints
- Appendix
- Bibliography
- Index
2 - Foundations of Smooth Optimization
Published online by Cambridge University Press: 31 March 2022
Summary
We outline theoretical foundations for smooth optimization problems. First, we define the different types of minimizers (solutions) of unconstrained optimization problems. Next, we state Taylor’s theorem, the fundamental theorem of smooth optimization, which allows us to approximate general smooth functions by simpler (linear or quadratic) functions based on information at the current point. We show how minima can be characterized by optimality conditions involving the gradient or Hessian, which can be checked in practice. Finally, we define the convexity of sets and functions, an important property that arises often in practice and that can be exploited by the algorithms described in the remainder of the book.
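As a brief illustration of the objects mentioned in this summary, the sketch below records the second-order form of Taylor's theorem, the basic optimality conditions, and the definition of a convex function, in standard notation that may differ slightly from the chapter's own.

```latex
% A minimal sketch in standard notation (assumed, not necessarily the chapter's).
% Taylor's theorem for twice continuously differentiable f, with some gamma in (0,1):
\[
  f(x + p) = f(x) + \nabla f(x)^{\top} p
           + \tfrac{1}{2}\, p^{\top} \nabla^2 f(x + \gamma p)\, p .
\]
% First-order necessary condition at a local minimizer x^*:
\[
  \nabla f(x^*) = 0 ,
\]
% with second-order sufficient conditions requiring, in addition, a positive
% definite Hessian: \nabla^2 f(x^*) \succ 0.
% Convexity of f: for all x, y and all \alpha \in [0,1],
\[
  f(\alpha x + (1 - \alpha) y) \le \alpha f(x) + (1 - \alpha) f(y) .
\]
```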
- Optimization for Data Analysis, pp. 15–25. Publisher: Cambridge University Press. Print publication year: 2022.