Potential-function proofs for gradient methods
This note presents convergence proofs for gradient methods (also called "first-order methods") based on simple potential-function arguments. We cover gradient descent (in both the smooth and non-smooth settings), mirror descent, and some accelerated variants. We hope the structure and presentation of these amortized-analysis proofs will serve as a guiding principle for learning and using such proofs.
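As a minimal illustration of the potential-function idea (a sketch, not code from the paper): for gradient descent on an $L$-smooth convex function with step size $1/L$, the standard potential $\Phi_t = t\,(f(x_t) - f^*) + \tfrac{L}{2}\|x_t - x^*\|^2$ is non-increasing, which immediately gives the $f(x_T) - f^* \le L\|x_0 - x^*\|^2/(2T)$ rate. The quadratic objective and starting point below are arbitrary choices for the numerical check.

```python
# Sketch: gradient descent on a smooth convex quadratic, with a
# numerical check that the potential
#     Phi_t = t * (f(x_t) - f*) + (L/2) * ||x_t - x*||^2
# never increases along the iterates.

def f(x):
    # f(x) = 0.5 * (4*x0^2 + x1^2); L-smooth with L = 4, minimizer x* = 0
    return 0.5 * (4.0 * x[0] ** 2 + 1.0 * x[1] ** 2)

def grad(x):
    return [4.0 * x[0], 1.0 * x[1]]

L = 4.0            # smoothness constant (largest curvature direction)
x = [1.0, -2.0]    # starting point x_0 (arbitrary)
x_star = [0.0, 0.0]
f_star = 0.0

def potential(t, x):
    dist2 = sum((xi - si) ** 2 for xi, si in zip(x, x_star))
    return t * (f(x) - f_star) + (L / 2.0) * dist2

phis = []
for t in range(50):
    phis.append(potential(t, x))
    g = grad(x)
    x = [xi - gi / L for xi, gi in zip(x, g)]  # step size 1/L

# Phi is non-increasing, so f(x_T) - f* <= Phi_0 / T = O(1/T).
assert all(phis[i + 1] <= phis[i] + 1e-12 for i in range(len(phis) - 1))
```

Since $\Phi_0 = \tfrac{L}{2}\|x_0 - x^*\|^2$ and $\Phi_T \ge T\,(f(x_T) - f^*)$, the monotonicity checked above is the entire convergence proof.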
Keywords: Convex optimization, Potential function, Amortized analysis

Journal: Theory of Computing
Bansal, N., & Gupta, A. (2019). Potential-function proofs for gradient methods. Theory of Computing, 15. doi:10.4086/toc.2019.v015a004