
Centre of Deep Learning and Bayesian Methods


January 06

One paper will be presented at AISTATS (Japan, April 2019) and three papers will be presented at ICLR (USA, May 2019).

December 25, 2018

Researchers from the Faculty of Computer Science presented their papers at the annual Conference on Neural Information Processing Systems (NeurIPS), held from 2 to 8 December 2018 in Montreal, Canada.
Publications
Article
Randomized Block Cubic Newton Method
In press

Doikov N., Richtárik P.

Proceedings of Machine Learning Research, 2018, Vol. 80, pp. 1290-1298.

Book chapter
Bayesian Sparsification of Gated Recurrent Neural Networks

Lobacheva E., Chirkova N., Vetrov D.

In: Workshop on Compact Deep Neural Network Representation with Industrial Applications, Thirty-Second Conference on Neural Information Processing Systems (NeurIPS). Montréal, 2018, pp. 1-6.

Working paper
Variational Dropout via Empirical Bayes

Kharitonov V., Molchanov D., Vetrov D.

arXiv preprint (stat.ML), 2018.

About the Center

The International Laboratory of Deep Learning and Bayesian Methods was established on the basis of the Bayesian Methods Research Group, one of the strongest research groups in Russia in machine learning and probabilistic modeling. The laboratory studies neuro-Bayesian models, which combine the advantages of the two most successful approaches in machine learning: neural networks and Bayesian methods.