Computing Reviews
First-order methods in optimization
Beck A., SIAM-Society for Industrial and Applied Mathematics, Philadelphia, PA, 2017. 494 pp. Type: Book (978-1-611974-98-0)
Date Reviewed: Oct 31 2018

The author of First-order methods in optimization defines the book’s subject matter as optimization methods that rely on function “values and gradients/subgradients (but not Hessians).” The most familiar example of a first-order method is simple gradient descent, which finds function extrema by iteratively moving in a direction proportional to the local gradient. Although more complex methods exist that rely on higher-order derivatives, the need for efficient computation argues for first-order methods whenever their convergence and other performance characteristics are acceptable.
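
To make the idea concrete for readers who have not used these methods, here is a minimal sketch of plain gradient descent on a simple quadratic; the objective, step size, and iteration count are my own illustrative choices and are not taken from the book:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    # Minimize a differentiable function by repeatedly stepping against its gradient.
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Example: minimize f(x) = ||x - c||^2, whose gradient is 2*(x - c).
c = np.array([3.0, -1.0])
x_min = gradient_descent(lambda x: 2 * (x - c), x0=np.zeros(2))
print(x_min)  # converges toward c = [3, -1]
```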

Chapters 1 through 7 comprise a detailed review and discussion of the theoretical foundations used later in the book. This complete and well-written section forms nearly half the book. Beck presents the mathematics formally, with good notation choices and at an appropriate pace. The subject matter itself makes these chapters somewhat demanding to read, but interested readers will benefit greatly. I found this first section invaluable; the review was helpful, and I encountered a number of topics with which I was not familiar.

The remainder of the book presents a variety of first-order methods, organized by their underlying concepts. Chapter 8 states the basic gradient descent method and quickly generalizes it to subgradient-based methods, which are useful when the gradient of the function to be optimized is not well behaved. For each method, the underlying model is stated, the optimization method is presented and discussed, and the rate of convergence is derived. This attention to the convergence rate not only informs the use of each method, but also helps readers better understand its characteristics.
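
For readers unfamiliar with subgradient methods, the following is a minimal sketch of the idea; the objective f(x) = |x - 2|, the diminishing 1/sqrt(k) step size, and the best-iterate bookkeeping are my own illustrative choices rather than material from the book:

```python
import numpy as np

def subgradient_method(f, subgrad, x0, iters=500):
    # Minimize a convex, possibly nondifferentiable f by stepping against a
    # subgradient with a diminishing step size; keep the best iterate seen,
    # since individual subgradient steps need not decrease f.
    x, best = x0, x0
    for k in range(1, iters + 1):
        x = x - subgrad(x) / np.sqrt(k)
        if f(x) < f(best):
            best = x
    return best

# Example: minimize f(x) = |x - 2|; sign(x - 2) is a valid subgradient.
f = lambda x: abs(x - 2.0)
print(subgradient_method(f, lambda x: np.sign(x - 2.0), x0=10.0))
```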

Chapter 9 introduces the mirror descent method, which is an appropriate extension of subgradient methods to a non-Euclidean function space. Chapter 10 discusses proximal gradient methods, using gradient steps followed by some form of proximal mapping. Chapters 11 and 12 relate to other proximal methods: block proximal gradient and dual proximal gradient. Later chapters cover the generalized conditional gradient method and alternating minimization.
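
As an illustration of the gradient-step-followed-by-proximal-mapping pattern, here is a small sketch of a proximal gradient (ISTA-style) iteration for an l1-regularized least-squares problem; the problem data, regularization weight, and step-size choice are my own assumptions and not drawn from the book:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a gradient step
    # on the smooth term with the prox of the nonsmooth term.
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Small illustrative problem (values chosen arbitrarily).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.01 * rng.standard_normal(20)
print(proximal_gradient(A, b, lam=0.1))
```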

I am very impressed with this book. First-order optimization methods are useful and practical in so many situations, but there is a need for a more powerful set of methods than most casual researchers have at their command (myself included). This book is a guide to useful optimization methods in many situations, presented with suitable rigor, and I plan to spend more time with it.

Reviewer: Creed Jones | Review #: CR146300 (1902-0013)

Constrained Optimization (G.1.6 ...)
Complexity Measures And Classes (F.1.3)

Other reviews under "Constrained Optimization":

Procedures for optimization problems with a mixture of bounds and general linear constraints
Gill P., Murray W., Saunders M., Wright M. (ed), ACM Transactions on Mathematical Software 10(3): 282-298, 1984. Type: Article. Date: May 1 1985

Symmetric minimum-norm updates for use in Gibbs free energy calculations
Salane D., Mathematical Programming: Series A 36(2): 145-156, 1986. Type: Article. Date: Jun 1 1988

Examination timetabling by computer
Laporte G., Desroches S., Computers and Operations Research 11(4): 351-360, 1984. Type: Article. Date: Aug 1 1985
