Seminar
| Parent Program: | -- |
|---|---|
| Location: | SLMath: Eisenbud Auditorium |
One of the great discoveries of the past half-century of work on gradient-based optimization is the notion of "oracle complexity," which makes it possible to establish optimal rates of convergence for certain classes of optimization problems. Seminal work by Nesterov yielded "accelerated" algorithms that achieve these optimal rates, but stopped short of providing a generative framework or an explanation for the rate-optimality of these algorithms. What the oracle model suggests is that there is a well-defined notion of an "optimal way to optimize," but a formalism capturing this notion has not yet emerged. I will argue that this gap stems from a singular focus on discrete-time dynamical systems in optimization theory, and that the phenomenon of acceleration is best approached via continuous-time dynamical systems. In particular, I will present a variational, Hamiltonian, and symplectic framework that generates oracle-optimal optimization algorithms in some generality.
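As a minimal illustration of the acceleration phenomenon the abstract refers to (this sketch is not from the talk, and all names and parameters are illustrative), the following compares plain gradient descent against Nesterov's accelerated method on an ill-conditioned convex quadratic, where acceleration's optimal O(1/k^2) rate shows up as a much smaller objective value after the same number of gradient evaluations:

```python
import numpy as np

# Illustrative example: minimize f(x) = 0.5 * x^T A x for an
# ill-conditioned diagonal A, comparing plain gradient descent with
# Nesterov's accelerated gradient method at the same step size.
d = 50
eigs = np.logspace(-3, 0, d)   # eigenvalues spanning 1e-3 .. 1
A = np.diag(eigs)
x0 = np.ones(d)

L = eigs.max()                 # Lipschitz constant of the gradient
step = 1.0 / L
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

def gradient_descent(x, iters):
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def nesterov(x, iters):
    # Standard momentum sequence t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2,
    # with the gradient step taken at the extrapolated point y.
    y, t = x.copy(), 1.0
    for _ in range(iters):
        x_next = y - step * grad(y)
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

x_gd = gradient_descent(x0, 500)
x_acc = nesterov(x0, 500)
print(f"gradient descent: f = {f(x_gd):.3e}")
print(f"accelerated:      f = {f(x_acc):.3e}")
```

Both methods use only first-order oracle calls, so the gap between the two objective values is exactly the kind of behavior that oracle complexity bounds show cannot be improved upon, and that the continuous-time framework in the talk is meant to explain.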