Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Foundations of Smooth Optimization
- 3 Descent Methods
- 4 Gradient Methods Using Momentum
- 5 Stochastic Gradient
- 6 Coordinate Descent
- 7 First-Order Methods for Constrained Optimization
- 8 Nonsmooth Functions and Subgradients
- 9 Nonsmooth Optimization Methods
- 10 Duality and Algorithms
- 11 Differentiation and Adjoints
- Appendix
- Bibliography
- Index
10 - Duality and Algorithms
Published online by Cambridge University Press: 31 March 2022
Summary
Here, we discuss concepts of duality for convex optimization problems and algorithms that make use of these concepts. We define the Lagrangian function and its augmented Lagrangian counterpart. We use the Lagrangian to derive optimality conditions for constrained optimization problems in which the constraints are expressed as linear algebraic conditions. We introduce the dual problem, discuss the concepts of weak and strong duality, and show that positive duality gaps can exist in certain settings. Finally, we discuss the dual subgradient method, the augmented Lagrangian method, and the alternating direction method of multipliers (ADMM), which are useful for several types of data science problems.
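For reference, the objects named above take the following standard form for a linearly constrained problem, written here in a common convention; the book's own sign and parameter choices may differ slightly.

```latex
% Standard definitions for the linearly constrained problem
%   \min_x f(x)  subject to  Ax = b.
\[
  \mathcal{L}(x,\lambda) = f(x) + \lambda^\top (Ax - b)
  \qquad \text{(Lagrangian)}
\]
\[
  \mathcal{L}_\rho(x,\lambda) = f(x) + \lambda^\top (Ax - b)
    + \tfrac{\rho}{2}\,\|Ax - b\|_2^2
  \qquad \text{(augmented Lagrangian, } \rho > 0\text{)}
\]
\[
  q(\lambda) = \inf_x \mathcal{L}(x,\lambda), \qquad
  \max_\lambda \; q(\lambda) \quad \text{(dual problem)}
\]
% Weak duality: q(\lambda) \le f(x) for every feasible x and every \lambda;
% strong duality means the optimal values of the two problems coincide.
```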
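As an illustration of the last of these methods, the following is a minimal sketch of scaled-form ADMM applied to the lasso problem, a typical data-science instance. The splitting, the penalty parameter rho, the fixed iteration count, and the helper names (soft_threshold, admm_lasso) are illustrative choices, not taken from the chapter.

```python
# Minimal sketch: scaled-form ADMM for the lasso
#   min_x (1/2)||A x - b||^2 + lam * ||x||_1,
# rewritten with a splitting variable z as
#   min_{x,z} (1/2)||A x - b||^2 + lam * ||z||_1   subject to   x - z = 0.
import numpy as np

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, num_iters=200):
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable (Lagrange multiplier estimate / rho)
    # Factor (A^T A + rho I) once; it is reused in every x-update.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(num_iters):
        # x-update: minimize (1/2)||Ax - b||^2 + (rho/2)||x - z + u||^2
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: prox of (lam/rho) * ||.||_1 evaluated at x + u
        z = soft_threshold(x + u, lam / rho)
        # dual update for the constraint x - z = 0
        u = u + x - z
    return z

# Usage: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[:5] = [1.0, -2.0, 3.0, -1.5, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = admm_lasso(A, b, lam=0.1)
```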
- Optimization for Data Analysis, pp. 170–187. Publisher: Cambridge University Press. Print publication year: 2022.