Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- List of symbols
- 1 Introduction and message of the book
- PART I POSITIVE POLYNOMIALS AND MOMENT PROBLEMS
- PART II POLYNOMIAL AND SEMI-ALGEBRAIC OPTIMIZATION
- PART III Specializations and extensions
- 13 Convexity in polynomial optimization
- 14 Parametric optimization
- 15 Convex underestimators of polynomials
- 16 Inverse polynomial optimization
- 17 Approximation of sets defined with quantifiers
- 18 Level sets and a generalization of the Löwner–John problem
- Appendix A Semidefinite programming
- Appendix B The GloptiPoly software
- References
- Index
16 - Inverse polynomial optimization
Published online by Cambridge University Press: 05 February 2015
Introduction
Again let P be the polynomial optimization problem f* = inf{f(x) : x ∈ K}, whose feasible set is the basic semi-algebraic set:
K ≔ {x ∈ ℝⁿ : gⱼ(x) ≥ 0, j = 1, …, m},
for some polynomials f, (gⱼ) ⊂ ℝ[x]. As already mentioned, P is in general NP-hard and one goal of this book is precisely to describe methods to obtain (or at least approximate) f* and, whenever possible, a global minimizer x* ∈ K.
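As a concrete illustration (not taken from the book), the following Python sketch sets up a hypothetical toy instance of P with objective f(x) = x₁⁴ + x₂⁴ − x₁x₂ and a single constraint g₁(x) = 1 − x₁² − x₂², so that K is the closed unit disc, and crudely approximates f* by evaluating f on a grid of feasible points. The instance and the names f, g1, in_K are illustrative assumptions, not notation or data from the text.

```python
import numpy as np

def f(x):
    # objective polynomial f(x1, x2) for this toy instance
    return x[0]**4 + x[1]**4 - x[0] * x[1]

def g1(x):
    # single constraint polynomial; K = {x : g1(x) >= 0} is the unit disc
    return 1.0 - x[0]**2 - x[1]**2

def in_K(x):
    # membership test for the basic semi-algebraic set K
    return g1(x) >= 0.0

# Crude grid evaluation, only to make f* = inf{f(x) : x in K} tangible;
# the moment-SOS machinery of Chapter 6 is the principled way to compute f*.
grid = np.linspace(-1.0, 1.0, 201)
feasible = [(a, b) for a in grid for b in grid if in_K((a, b))]
f_star_approx, x_star_approx = min((f(p), p) for p in feasible)
print("approximate f* =", round(f_star_approx, 6), "at x =", x_star_approx)
```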
However, in many cases one is satisfied with a local minimum only (for instance because the methods described in Chapter 6 are computationally too expensive and cannot be implemented for the problem at hand). On the other hand, a local minimum can be obtained by running some local minimization algorithm chosen among those available in the literature. Typically, in such algorithms, at a current iterate (i.e., some feasible solution y ∈ K) one checks whether some optimality conditions (e.g. the Karush–Kuhn–Tucker (KKT) optimality conditions of Chapter 7) are satisfied within some ε-tolerance. However, those conditions are valid for any local minimum and, in fact, even for any stationary point of the Lagrangian.

Moreover, in some practical situations the criterion f to minimize is subject to modeling errors or is questionable. In such a situation the practical meaning of a local (or global) minimum f* (and of a local (or global) minimizer) also becomes questionable. It could well be that the current solution y is in fact a global minimizer of an optimization problem P′ with the same feasible set as P but with a different criterion f̃. Therefore, if f̃ is "close enough" to f, one might not be willing to spend an enormous amount of computing time and effort to find the global (or even local) minimum f*, because one might already be satisfied with the current iterate y as a global minimizer of P′.
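To make the ε-tolerance check mentioned above concrete, here is a minimal sketch, again for the hypothetical toy instance and with hard-coded gradients, of testing whether a current iterate y approximately satisfies the KKT conditions ∇f(y) = Σⱼ λⱼ ∇gⱼ(y), λⱼ ≥ 0, λⱼ gⱼ(y) = 0: multipliers for the (nearly) active constraints are fitted by nonnegative least squares and the stationarity residual is reported. The function kkt_residual and the threshold eps_active are illustrative choices, not part of the book's development.

```python
import numpy as np
from scipy.optimize import nnls

def grad_f(x):
    # gradient of the toy objective x1^4 + x2^4 - x1*x2
    return np.array([4*x[0]**3 - x[1], 4*x[1]**3 - x[0]])

def g1(x):
    # toy constraint: K = {x : g1(x) >= 0} is the unit disc
    return 1.0 - x[0]**2 - x[1]**2

def grad_g1(x):
    return np.array([-2*x[0], -2*x[1]])

def kkt_residual(y, eps_active=1e-6):
    # Stationarity residual ||grad f(y) - sum_j lam_j grad g_j(y)|| over lam >= 0,
    # where only constraints (nearly) active at y may carry a multiplier.
    active_grads = [grad_g1(y)] if abs(g1(y)) <= eps_active else []
    if not active_grads:
        # no active constraint: the KKT conditions reduce to grad f(y) = 0
        return float(np.linalg.norm(grad_f(y)))
    A = np.column_stack(active_grads)   # columns are gradients of active g_j
    lam, resid = nnls(A, grad_f(y))     # best lam >= 0 with A @ lam ~ grad f(y)
    return float(resid)

y = np.array([0.5, 0.5])  # a current iterate, feasible since g1(y) = 0.5 > 0
print("KKT residual at y:", kkt_residual(y))  # ~0: y is (nearly) a KKT point
```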
An Introduction to Polynomial and Semi-Algebraic Optimization, pp. 257–271. Cambridge University Press, 2015.