
HDI Lab seminar: Egor Klochkov (University of Cambridge), Fast rates for strongly convex optimization via stability

This event has ended

On May 18, 2021, at 18:00, the International Laboratory of Stochastic Algorithms and High-Dimensional Inference will hold its next seminar. Egor Klochkov (University of Cambridge) will present the talk "Fast rates for strongly convex optimization via stability".

Abstract: The sharpest known high-probability generalization bounds for uniformly stable algorithms (Feldman, Vondrák, 2018, 2019; Bousquet, Klochkov, Zhivotovskiy, 2020) contain a generally inevitable sampling error term of order 1/sqrt(n). When applied to excess risk bounds, this leads to suboptimal results in several standard stochastic convex optimization problems. We show that if the so-called Bernstein condition is satisfied, the term O(1/sqrt(n)) can be avoided, and high-probability excess risk bounds of order up to O(1/n) are possible via uniform stability. Using this result, we show a high-probability excess risk bound with the rate O(log n / n) for strongly convex and Lipschitz losses, valid for any empirical risk minimization method. This resolves a question of Shalev-Shwartz, Shamir, Srebro, and Sridharan (2009). We also discuss how O(log n / n) high-probability excess risk bounds are possible for projected gradient descent in the case of strongly convex and Lipschitz losses without the usual smoothness assumption. This is joint work with Nikita Zhivotovskiy and Olivier Bousquet.
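For readers unfamiliar with the terminology, the two key objects in the abstract can be sketched in standard notation (the notation below is illustrative and not taken verbatim from the talk): ℓ_f denotes the loss of a predictor f, Z a sample point, f* the risk minimizer over the class F, and \hat{f} the empirical risk minimizer.

% A minimal LaTeX sketch. The Bernstein condition in its common form
% (the constant B > 0 and this exact formulation are an assumption here):
% the second moment of the excess loss is dominated by its mean.
\[
  \mathbb{E}\bigl(\ell_f(Z) - \ell_{f^*}(Z)\bigr)^2
  \;\le\; B \,\mathbb{E}\bigl[\ell_f(Z) - \ell_{f^*}(Z)\bigr]
  \qquad \text{for all } f \in \mathcal{F}.
\]
% The shape of the resulting bound for ERM with strongly convex and
% Lipschitz losses, as stated in the abstract (constants and the exact
% dependence on the confidence level are omitted):
\[
  R(\hat{f}\,) - R(f^*) \;=\; O\!\left(\frac{\log n}{n}\right)
  \quad \text{with high probability.}
\]

Intuitively, the Bernstein condition lets the variance of the excess loss shrink together with its mean, which is what removes the 1/sqrt(n) sampling error term from the stability-based bounds.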


Seminar link: https://us02web.zoom.us/j/9252879297

Follow the seminar's updates on Telegram!