Silver Stepsize

In this note, I plan to explore recent advances in optimization theory. Choosing small, decreasing stepsizes to guarantee a monotonically decreasing function value may not be the best strategy: occasional long, nonmonotone steps can provably accelerate gradient descent (a small sketch of the silver stepsize schedule follows the reading list below).

Reading list

  • [Math. Program. 2024] Acceleration by Stepsize Hedging II: Silver Stepsize Schedule for Smooth Convex Optimization
    Jason M. Altschuler, Pablo A. Parrilo

  • [SIAM J. Optim. 2024] Provably faster gradient descent via long steps
    Benjamin Grimmer
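
To make the idea concrete, here is a minimal Python sketch of the silver stepsize schedule from the Altschuler and Parrilo paper. The schedule of length 2^k − 1 is built around the silver ratio ρ = 1 + √2 and admits the closed form h_t = 1 + ρ^{ν(t)−1}, where ν(t) is the 2-adic valuation of t. The quadratic test function and the way I estimate the smoothness constant L are my own illustrative choices, not from the papers.

```
import numpy as np

# Silver ratio rho = 1 + sqrt(2); the silver schedule achieves an
# O(1 / n^{log2(rho)}) ~ O(1 / n^{1.2716}) rate on smooth convex problems.
RHO = 1.0 + np.sqrt(2.0)

def silver_stepsizes(k):
    """Silver stepsize schedule of length 2^k - 1, in units of 1/L.

    Closed form: h_t = 1 + rho^{nu(t) - 1}, where nu(t) is the 2-adic
    valuation of t (the number of trailing zeros in t's binary expansion).
    """
    def nu(t):  # 2-adic valuation of t
        v = 0
        while t % 2 == 0:
            t //= 2
            v += 1
        return v
    return [1.0 + RHO ** (nu(t) - 1) for t in range(1, 2 ** k)]

# Illustrative example (my own choice): gradient descent on an
# L-smooth convex quadratic f(x) = 0.5 * x^T A x.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M.T @ M                        # positive semidefinite Hessian
    L = np.linalg.eigvalsh(A)[-1]      # smoothness constant = largest eigenvalue
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x

    x = rng.standard_normal(50)
    for h in silver_stepsizes(k=6):    # 63 nonmonotone steps
        x = x - (h / L) * grad(x)      # some stepsizes h/L far exceed 2/L
    print(f"f(x_n) = {f(x):.3e}")
```

The point of the sketch: intermediate stepsizes like 1 + ρ^{k−1} blow past the classical 2/L stability threshold, so the function value can temporarily increase along the way, yet the final iterate improves the classical O(1/n) rate to O(1/n^{1.2716}).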

Push me :)

I must confess that I'm a bit lazy at the moment. If you're really interested in any of these topics, feel free to give me a nudge (or a push!) via email and I'll expand on them further (you can even ask for a Chinese version).