This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms. It begins with the fundamental theory of black-box optimization and proceeds to guide the reader through recent advances in structural optimization and stochastic optimization.
The presentation of black-box optimization, strongly influenced by Nesterov's seminal book, includes the analysis of cutting plane methods, as well as (accelerated) gradient descent schemes. Special attention is also given to non-Euclidean settings (relevant algorithms include Frank-Wolfe, mirror descent, and dual averaging), and their relevance in machine learning is discussed.
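To give a flavor of the black-box gradient schemes mentioned above, here is a minimal Python sketch of projected gradient descent on the Euclidean ball; the objective, step size, radius, and iteration count are illustrative assumptions, not taken from the monograph.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def projected_gradient_descent(grad, x0, step=0.1, n_iters=100, radius=1.0):
    """Basic projected gradient descent: gradient step followed by projection."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = project_ball(x - step * grad(x), radius)
    return x

# Toy example (assumed for illustration): minimize ||x - c||^2 over the unit ball,
# with c lying outside the ball, so the minimizer sits on the boundary.
c = np.array([2.0, 0.0])
grad = lambda x: 2.0 * (x - c)
print(projected_gradient_descent(grad, x0=np.zeros(2)))  # approaches [1, 0]
```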
The text provides a gentle introduction to structural optimization with FISTA (to optimize a sum of a smooth and a simple non-smooth term), saddle-point mirror prox (Nemirovski's alternative to Nesterov's smoothing), and a concise description of interior point methods. In stochastic optimization, it discusses stochastic gradient descent, mini-batches, random coordinate descent, and sublinear algorithms. It also briefly touches upon convex relaxation of combinatorial problems and the use of randomness to round solutions, as well as random-walk-based methods. A sketch of the FISTA-style setting follows below.
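To make the "smooth plus simple non-smooth" structure concrete, the sketch below applies a FISTA-style accelerated proximal gradient step to the lasso objective 0.5*||Ax - b||^2 + lam*||x||_1; the problem data, regularization weight, and iteration count are placeholders chosen for illustration, not the monograph's.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def fista_lasso(A, b, lam, n_iters=200):
    """FISTA-style accelerated proximal gradient for 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iters):
        grad = A.T @ (A @ y - b)           # gradient of the smooth term at y
        x_next = soft_threshold(y - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Toy usage with random data (assumed for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
print(fista_lasso(A, b, lam=0.1)[:5])
```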