HDI Lab seminar: Sholom Schechtman (Ecole Polytechnique), "Stochastic subgradient descent (SGD) avoids nonsmooth saddle points"

The event has ended.

On January 18 at 18:00, the International Laboratory of Stochastic Algorithms and High-Dimensional Inference will hold a seminar. Sholom Schechtman (Ecole Polytechnique) will give the talk "Stochastic subgradient descent (SGD) avoids nonsmooth saddle points".

Abstract: In the first part of the talk we will discuss the notion of saddle points of nonsmooth functions. We will show that every critical point of a generic semialgebraic (or, more generally, definable), not necessarily smooth, function is either a local minimum, an active strict saddle, or a sharply repulsive critical point. In the second part of the talk we will exhibit conditions on the perturbation sequence of the SGD that ensure avoidance of active strict saddles and sharply repulsive critical points. As a consequence, we will show that SGD on a generic semialgebraic (or definable) function converges to a (local) minimizer.
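As a hedged illustration of the phenomenon in the abstract (not taken from the talk itself), the sketch below runs stochastic subgradient descent on the nonsmooth function f(x, y) = |x| − |y|. Its only critical point, the origin, behaves like a strict saddle: f is minimized along x and maximized along y, yet plain subgradient descent started exactly at (0, 0) can stay there forever. Adding a small perturbation to each step lets the iterates escape. The function choice, step size, and noise level are all assumptions made for the example.

```python
import random

def f(x, y):
    # Nonsmooth test function with a saddle-like critical point at (0, 0).
    return abs(x) - abs(y)

def subgrad(x, y):
    # One measurable selection from the Clarke subdifferential of f;
    # at x == 0 or y == 0 we pick the subgradient component 0.
    gx = 0.0 if x == 0 else (1.0 if x > 0 else -1.0)
    gy = 0.0 if y == 0 else (-1.0 if y > 0 else 1.0)
    return gx, gy

def sgd(x, y, step=0.01, iters=5000, noise=0.01, seed=0):
    # Perturbed subgradient descent: without the Gaussian noise,
    # the iterate started at (0, 0) would never move.
    rng = random.Random(seed)
    for _ in range(iters):
        gx, gy = subgrad(x, y)
        x -= step * (gx + rng.gauss(0.0, noise))
        y -= step * (gy + rng.gauss(0.0, noise))
    return x, y

x, y = sgd(0.0, 0.0)  # start exactly at the saddle
print(f(x, y))        # strictly negative: the iterate escaped along y
```

The perturbations push y off zero; once y is nonzero, the subgradient step drives |y| to grow while x is pulled back toward 0, so f decreases, matching the claim that SGD avoids such saddles.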

Zoom: https://us02web.zoom.us/j/9252879297 

Follow seminar updates on Telegram!