
Centre of Deep Learning and Bayesian Methods

October 20, 2021

The 35th Conference on Neural Information Processing Systems (NeurIPS 2021) is one of the world's largest conferences on machine learning and neural networks. It takes place on December 6-14, 2021.

March 11, 2021

Two papers were accepted to the 2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2021):

"On the Embeddings of Variables in Recurrent Neural Networks for Source Code" by Nadezhda Chirkova;

"A Simple Approach for Handling Out-of-Vocabulary Identifiers in Deep Learning for Source Code" by Nadezhda Chirkova and Sergey Troshin.

The final versions of the papers and the source code will be released soon. The research was conducted using the computational resources of the HSE Supercomputer Modeling Unit.

Both papers address improving the quality of deep learning models for source code by exploiting the specifics of variables and identifiers. The first paper proposes a recurrent architecture that explicitly models the semantic meaning of each variable in the program. The second paper proposes a simple method for preprocessing rarely used identifiers so that a neural network (in particular, the Transformer architecture) better recognizes patterns in the program. The proposed methods were shown to significantly improve the quality of code completion and variable misuse detection.
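The paper's exact preprocessing pipeline is not reproduced on this page, but the general idea of handling out-of-vocabulary identifiers can be sketched as follows: identifiers that do not appear in the model's vocabulary are replaced with consistent placeholder tokens, so the network sees stable, repeatable symbols instead of rare names. This is a minimal illustrative sketch, not the authors' implementation; the function name and placeholder scheme are assumptions.

```python
import keyword

def anonymize_identifiers(tokens, vocab):
    """Replace identifiers absent from `vocab` with consistent
    placeholders (var1, var2, ...), reusing the same placeholder
    for repeated occurrences of the same identifier.

    Illustrative sketch only; not the preprocessing from the paper.
    """
    mapping = {}  # original identifier -> placeholder
    result = []
    for tok in tokens:
        # Keep known tokens, punctuation, and language keywords as-is.
        if tok in vocab or not tok.isidentifier() or keyword.iskeyword(tok):
            result.append(tok)
        else:
            if tok not in mapping:
                mapping[tok] = f"var{len(mapping) + 1}"
            result.append(mapping[tok])
    return result

# Example: rare names "frobnicate" and "xs" become var1 and var2.
tokens = ["def", "frobnicate", "(", "xs", ")", ":"]
vocab = {"def", "(", ")", ":"}
print(anonymize_identifiers(tokens, vocab))
# → ['def', 'var1', '(', 'var2', ')', ':']
```

Because the mapping is consistent within a program, repeated uses of the same rare identifier map to the same placeholder, preserving the usage patterns a Transformer can learn from.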
Publications
Article
A randomized coordinate descent method with volume sampling

Rodomanov A., Kropotov D.

SIAM Journal on Optimization. 2020. Vol. 30. No. 3. P. 1878-1904.

Book chapter
On the Periodic Behavior of Neural Network Training with Batch Normalization and Weight Decay

Lobacheva E., Kodryan M., Chirkova N. et al.

In bk.: Advances in Neural Information Processing Systems 34 (NeurIPS 2021). Curran Associates, Inc., 2021. P. 21545-21556.

Working paper
MARS: Masked Automatic Ranks Selection in Tensor Decompositions

Kodryan M., Kropotov D., Vetrov D.

First Workshop on Quantum Tensor Networks in Machine Learning (QTNML 2020), 34th Conference on Neural Information Processing Systems (NeurIPS 2020), 2020.

About the Centre

The centre conducts research at the intersection of two actively developing areas of data analysis: deep learning and Bayesian methods of machine learning. Deep learning is a branch of machine learning that builds highly complex models (neural networks) to solve problems such as classifying images or music, transferring an artistic style from a painting to a photograph, or predicting the next words in a text. Within the Bayesian approach, such problems are addressed with probabilistic models grounded in probability theory and mathematical statistics.
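As a minimal illustration of the Bayesian approach described above (an example added here for clarity, not drawn from the centre's publications): a probabilistic model places a prior distribution over an unknown quantity and updates it into a posterior as data arrives. The classic conjugate case is inferring a coin's bias under a Beta prior from Bernoulli observations.

```python
def beta_bernoulli_posterior(heads, tails, a=1.0, b=1.0):
    """Posterior over a coin's bias theta under a Beta(a, b) prior
    after observing `heads` successes and `tails` failures.

    By conjugacy, the posterior is Beta(a + heads, b + tails);
    its mean is (a + heads) / (a + b + heads + tails).
    """
    a_post = a + heads
    b_post = b + tails
    posterior_mean = a_post / (a_post + b_post)
    return a_post, b_post, posterior_mean

# With a uniform Beta(1, 1) prior and 7 heads out of 10 flips:
print(beta_bernoulli_posterior(7, 3))
# → (8.0, 4.0, 0.6666666666666666)
```

The posterior mean (2/3) sits between the prior mean (1/2) and the observed frequency (0.7), showing how Bayesian inference blends prior belief with data.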

The centre was founded on the basis of the Bayesian Methods Research Group.