Published online by Cambridge University Press: 23 March 2023
Many machine learning methods require non-linear optimization, performed by the backward propagation of model errors, with the process complicated by the presence of multiple minima and saddle points. Numerous gradient descent algorithms are available for optimization, including stochastic gradient descent, conjugate gradient, quasi-Newton and non-linear least squares methods such as Levenberg-Marquardt. In contrast to deterministic optimization, stochastic optimization methods repeatedly introduce randomness during the search process to avoid becoming trapped in a local minimum. Evolutionary algorithms, which borrow concepts from biological evolution to solve optimization problems, include the genetic algorithm and differential evolution.
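To illustrate one of the evolutionary algorithms mentioned above, the following is a minimal sketch (not taken from the book) of differential evolution in the common DE/rand/1/bin form, applied to the Rastrigin function, a standard test problem with many local minima. The population size, mutation factor F, crossover rate CR and generation count are illustrative assumptions, not recommended settings.

```python
import numpy as np

def rastrigin(x):
    """Rastrigin function: many local minima, global minimum 0 at x = 0."""
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9,
                           generations=200, seed=None):
    # Parameter values here are assumptions for illustration only.
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    # Initialize the population uniformly within the search bounds.
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fitness = np.array([f(ind) for ind in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct members other than i.
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover: mix mutant and target components.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True  # ensure at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the trial only if it is no worse.
            f_trial = f(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = np.argmin(fitness)
    return pop[best], fitness[best]

if __name__ == "__main__":
    x_best, f_best = differential_evolution(rastrigin, bounds=[(-5.12, 5.12)] * 5,
                                            seed=0)
    print(f"best point: {x_best}, best value: {f_best:.4f}")
```

Because each candidate is perturbed with randomly chosen population members and accepted only if it improves, the search injects randomness at every generation, which is what helps population-based stochastic methods escape the local minima that trap purely deterministic gradient descent.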