Book, English, 560 pages, format (W × H): 160 mm × 241 mm, weight: 1019 g
Series: Texts in Applied Mathematics
ISBN: 978-3-031-91416-4
Publisher: Springer Nature Switzerland
This book provides an in-depth exploration of nonsmooth optimization, covering foundational algorithms, theoretical insights, and a wide range of applications. Nonsmooth optimization, characterized by nondifferentiable objective functions or constraints, plays a crucial role across various fields, including machine learning, imaging, inverse problems, statistics, optimal control, and engineering. Its scope and relevance continue to expand, as many real-world problems are inherently nonsmooth or benefit significantly from nonsmooth regularization techniques. The book covers a variety of algorithms, both foundational and recent, for solving nonsmooth optimization problems. It first introduces basic facts of convex analysis and subdifferential calculus; various algorithms are then discussed, including subgradient methods, mirror descent methods, proximal algorithms, the alternating direction method of multipliers, primal-dual splitting methods, and semismooth Newton methods. Moreover, error bound conditions are discussed and the derivation of linear convergence is illustrated. A dedicated chapter covers first-order methods for nonconvex optimization problems satisfying the Kurdyka–Łojasiewicz condition. The book also addresses the rapid evolution of stochastic algorithms for large-scale optimization. It is written for a wide-ranging audience, including senior undergraduates, graduate students, researchers, and practitioners interested in gaining a comprehensive understanding of nonsmooth optimization.
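To give a flavor of the book's subject matter, the following is a minimal sketch of the classical subgradient method mentioned above, applied to the nondifferentiable function f(x) = |x − 3|. This is an illustrative example, not taken from the book; the step-size rule and the test function are assumptions chosen for simplicity.

```python
import math

def subgradient_method(f, subgrad, x0, steps=2000):
    """Minimize a convex nonsmooth function f via the subgradient method.

    Iterates x_{k+1} = x_k - t_k * g_k with a subgradient g_k of f at x_k
    and diminishing step sizes t_k = 1/sqrt(k+1). Since function values
    need not decrease monotonically, the best iterate seen so far is tracked.
    """
    x, best = x0, x0
    for k in range(steps):
        x = x - subgrad(x) / math.sqrt(k + 1)  # diminishing step size
        if f(x) < f(best):
            best = x
    return best

# f(x) = |x - 3| is convex but nondifferentiable at its minimizer x* = 3;
# sign(x - 3) is a valid subgradient everywhere (0 works at the kink).
f = lambda x: abs(x - 3.0)
g = lambda x: 1.0 if x > 3.0 else (-1.0 if x < 3.0 else 0.0)
x_best = subgradient_method(f, g, x0=0.0)
```

The iterates oscillate around the kink at x = 3, but the diminishing steps force the best iterate arbitrarily close to the minimizer; this is the basic convergence phenomenon that the book's analysis makes precise.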
Target audience
Graduate
Authors/Editors
Further information & material
Preface.- Introduction.- Convex sets and convex functions.- Subgradient and mirror descent methods.- Proximal algorithms.- Karush-Kuhn-Tucker theory and Lagrangian duality.- ADMM: alternating direction method of multipliers.- Primal-dual splitting algorithms.- Error bound conditions and linear convergence.- Optimization with Kurdyka–Łojasiewicz property.- Semismooth Newton methods.- Stochastic algorithms.- References.- Index.