
125319, Moscow,
3 Kochnovsky Proezd (near metro station 'Aeroport'). 

Phone: +7 (495) 772-95-90, ext. 12332

Email: computerscience@hse.ru



Dean Ivan Arzhantsev

First Deputy Dean Tamara Voznesenskaya

Deputy Dean for Research and International Relations Sergei Obiedkov

Deputy Dean for Methodical and Educational Work Ilya Samonenko

Deputy Dean for Development, Finance and Administration Irina Plisetskaya

Branching rules related to spherical actions on flag varieties

Avdeev R., Petukhov A.

Algebras and Representation Theory. 2019. In press.

Minimax theorems for American options without time-consistency

Belomestny D., Kraetschmer V., Hübner T. et al.

Finance and Stochastics. 2019. Vol. 23. P. 209-238.

Cherenkov detectors fast simulation using neural networks

Kazeev N., Derkach D., Ratnikov F. et al.

Nuclear Instruments and Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. 2019.

Book chapter
Averaging Weights Leads to Wider Optima and Better Generalization

Izmailov P., Garipov T., Podoprikhin D. A. et al.

In bk.: Proceedings of the international conference on Uncertainty in Artificial Intelligence (UAI 2018). 2018. P. 876-885.

Colloquium: Perturbed Proximal Gradient Algorithms. Speaker: Eric Moulines (École Polytechnique)

Event ended

February 22, 18:10 – 19:30, room 317

Eric Moulines (École Polytechnique, France) 

Perturbed Proximal Gradient Algorithms

We study a version of the proximal gradient algorithm for which the gradient is intractable and is approximated by Monte Carlo methods (and in particular Markov Chain Monte Carlo). We derive conditions on the step size and the Monte Carlo batch size under which convergence is guaranteed: both increasing batch size and constant batch size are considered. We also derive non-asymptotic bounds for an averaged version. Our results cover both the cases of biased and unbiased Monte Carlo approximation. To support our findings, we discuss the inference of a sparse generalized linear model with random effect and the problem of learning the edge structure and parameters of sparse undirected graphical models.
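The scheme described above can be sketched in code. The following is an illustrative example only, not the speaker's exact algorithm: a proximal gradient method for an L1-regularized least-squares problem in which the gradient is replaced by a Monte Carlo (mini-batch) estimate, with an optional increasing batch size and an averaged iterate, as in the talk's setting. All function and parameter names here are made up for the sketch.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def perturbed_proximal_gradient(A, b, lam, n_iters=300, batch_size=32,
                                step=0.05, increasing_batch=True, seed=0):
    """Minimise (1/2n)||Ax - b||^2 + lam * ||x||_1 when the gradient is
    only available through a Monte Carlo (mini-batch) approximation.
    Returns the last iterate and a running (Polyak) average."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    x_avg = np.zeros(d)
    for k in range(1, n_iters + 1):
        # Increasing batch size restores convergence with a constant step;
        # a constant batch size would require decreasing steps instead.
        m = min(batch_size * k, n) if increasing_batch else batch_size
        idx = rng.integers(0, n, size=m)
        # Noisy (Monte Carlo) estimate of the smooth part's gradient.
        grad = A[idx].T @ (A[idx] @ x - b[idx]) / m
        # Gradient step followed by the proximal (soft-threshold) step.
        x = soft_threshold(x - step * grad, step * lam)
        # Running average of the iterates.
        x_avg += (x - x_avg) / k
    return x, x_avg
```

With an increasing batch size the Monte Carlo noise shrinks along the run, so the iterates behave like the exact proximal gradient method; with a constant batch size only the averaged iterate enjoys non-asymptotic guarantees, which is why both outputs are returned.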

Venue: Moscow, 3 Kochnovsky Proezd, room 317, 18:10

Registration is open.