Event Description
I will be holding an informal two-week short course on optimization, aiming
to cover a few important proofs in the field. The goal is depth over
breadth, with a focus on:
- convergence proofs for gradient descent and stochastic gradient descent
- energy functions and continuous time optimization
- estimate sequences and Nesterov acceleration
and, time permitting, additional topics like variance reduction,
quasi-Newton methods, and Frank-Wolfe methods. If we go super fast, we
can spend a few days at the end brainstorming interesting research
project ideas.
Details: NCS 220, 6:15pm-7:45pm, Monday-Friday, Feb 7-Feb 18.
In person only, since I plan to use the whiteboard (but it may be recorded).
More details (notes, specific schedule) will be uploaded here:
https://sites.google.com/view/