HDI Lab Seminar: Sholom Schechtman (Ecole Polytechnique), Stochastic subgradient descent (SGD) avoids nonsmooth saddle points
On January 18, 2022, at 18:00, Sholom Schechtman (Ecole Polytechnique) will give a talk titled "Stochastic subgradient descent (SGD) avoids nonsmooth saddle points".
Abstract: In the first part of the talk we will discuss the notion of saddle points of nonsmooth functions. We will show that every critical point of a generic semialgebraic (or, more generally, definable), not necessarily smooth, function is either a local minimum, an active strict saddle, or a sharply repulsive critical point. In the second part of the talk we will exhibit conditions on the perturbation sequence of SGD that ensure avoidance of active strict saddles and sharply repulsive critical points. As a consequence, we will show that SGD on a generic semialgebraic (or definable) function converges to a (local) minimizer.
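To make the setting concrete, here is a minimal sketch of stochastic subgradient descent on a toy nonsmooth function with a saddle at the origin. The function f(x, y) = |x| - |y| + y²/2, the step-size schedule, and the Gaussian perturbation are all illustrative choices of ours, not taken from the talk; the talk's results concern precise conditions on such perturbations under which the iterates avoid nonsmooth saddles.

```python
import math
import random

def f(x, y):
    # Toy nonsmooth semialgebraic function (our example, not from the talk):
    # (0, 0) is a nonsmooth saddle; the local minimizers are (0, 1) and (0, -1).
    return abs(x) - abs(y) + 0.5 * y**2

def subgrad(x, y):
    # One element of the Clarke subdifferential of f at (x, y).
    gx = math.copysign(1.0, x) if x != 0 else 0.0
    gy = (-math.copysign(1.0, y) if y != 0 else 0.0) + y
    return gx, gy

def sgd(x, y, steps=20000, seed=0):
    rng = random.Random(seed)
    for n in range(1, steps + 1):
        gamma = n ** -0.75  # diminishing steps: sum gamma = inf, sum gamma^2 < inf
        gx, gy = subgrad(x, y)
        # Additive Gaussian perturbation of the subgradient; isotropic noise
        # like this is what pushes the iterates off the saddle at the origin.
        x -= gamma * (gx + rng.gauss(0.0, 1.0))
        y -= gamma * (gy + rng.gauss(0.0, 1.0))
    return x, y

x, y = sgd(0.0, 0.0)  # start exactly at the saddle; noise drives escape
```

Started at the saddle (0, 0), where the chosen subgradient vanishes, the perturbed iterates drift away and settle near one of the two minimizers (0, ±1) rather than remaining stuck, illustrating the avoidance phenomenon the talk analyzes.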