Two papers have been accepted to ICML 2016

Two faculty members have had their papers accepted to the International Conference on Machine Learning (ICML 2016), which will be held in New York, USA.

The first paper, "A Superlinearly-Convergent Proximal Newton-type Method for the Optimization of Finite Sums" by Anton Rodomanov and Dmitry Kropotov, proposes a new stochastic optimization method with fast convergence properties that is especially useful for machine learning problems, where the objective is typically a finite sum of per-example losses; a sketch of this setting follows below.
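To make the finite-sum setting concrete, here is a minimal NumPy sketch: a toy l2-regularized logistic regression objective f(x) = (1/n) * sum_i f_i(x), optimized with a simple incremental Newton-type loop that samples one component per step. This illustrates the problem class only, not the authors' method; the random data, the regularization strength lam, and the damping factor 0.5 are all assumptions made for the example.

```python
import numpy as np

# Toy finite-sum objective: l2-regularized logistic regression,
# f(x) = (1/n) * sum_i f_i(x). Illustrative sketch only, not the
# method proposed in the paper.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))          # feature rows a_i
b = rng.choice([-1.0, 1.0], size=n)  # labels in {-1, +1}
lam = 0.1                            # regularization strength (assumed)

def grad_i(x, i):
    # Gradient of the i-th component f_i plus regularization.
    s = 1.0 / (1.0 + np.exp(b[i] * (A[i] @ x)))
    return -b[i] * s * A[i] + lam * x

def hess_i(x, i):
    # Hessian of the i-th component f_i plus regularization.
    s = 1.0 / (1.0 + np.exp(b[i] * (A[i] @ x)))
    return s * (1.0 - s) * np.outer(A[i], A[i]) + lam * np.eye(d)

# Incremental Newton-type loop: sample one component per step and take a
# damped Newton step against a running average of sampled Hessians.
x = np.zeros(d)
H = np.eye(d)
for t in range(1, 2001):
    i = rng.integers(n)
    H += (hess_i(x, i) - H) / t                  # curvature estimate
    x -= 0.5 * np.linalg.solve(H, grad_i(x, i))  # damped Newton-type step
```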

The second paper, "Meta-Learning with Memory-Augmented Neural Networks", is co-authored by senior teaching staff member Sergey Bartunov and is the result of his collaboration with Google DeepMind. It develops a new neural network architecture that couples a controller network with an external memory, enabling the model to quickly learn new concepts from just a few training examples; a sketch of the content-based memory read appears below.
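Architectures of this kind read from memory by content-based addressing: a key vector is compared against every memory row by cosine similarity, a softmax turns the similarities into read weights, and the read vector is the weighted sum of the rows. The NumPy sketch below shows only this read step under assumed sizes (128 slots of width 40); in the actual model the key is produced by a trained controller, and a separate write rule updates the memory.

```python
import numpy as np

def read_memory(key, memory):
    # Content-based addressing: cosine similarity between the key and each
    # memory row, softmax to get read weights, weighted sum to get the read.
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    w = np.exp(sims - sims.max())  # numerically stable softmax
    w /= w.sum()
    return w @ memory, w

# Toy usage; the memory contents and key are random stand-ins for values
# that a trained controller network would produce.
rng = np.random.default_rng(0)
memory = rng.normal(size=(128, 40))  # 128 slots, width 40 (assumed sizes)
key = rng.normal(size=40)
read_vector, read_weights = read_memory(key, memory)
```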