Conference
Key topics of the conference:
- General Machine Learning
- Deep Learning (architectures, generative models, deep reinforcement learning, etc.)
- Reinforcement learning
- NLP
- Learning Theory (bandits, game theory, statistical learning theory, etc.)
- Optimization (convex and non-convex optimization, matrix/tensor methods, etc.)
- Probabilistic Inference (Bayesian methods, graphical models, Monte Carlo methods, etc.)
- Trustworthy Machine Learning (accountability, causality, fairness, privacy, robustness, etc.)
- Applications (computational biology, crowdsourcing, healthcare, neuroscience, social good, climate science, etc.)
Talks
We invite authors of A* and Q1 papers (2021-2022) to present their work. Each talk will be 15 minutes long.
Programme 3 November
10:00 - 11:15 | Sessions 1 & 2
Session 1: Applied ML 1 (Big Hall)
Speakers:
- Anton Chernyavskiy & Dmitry Ilvovsky | Batch-Softmax Contrastive Loss for Pairwise Sentence Scoring Tasks
- Alexander Panchenko | ParaDetox: Detoxification with Parallel Data
- Alexander Chernyavskiy | Improving Text Generation via Neural Discourse Planning
- Sergey Kuznetsov | Δ-Closure Structure for Studying Data Distribution
- Nikita Pospelov | Ownership concentration and wealth inequality in Russia
Session 2: Optimization (Small Hall)
Speakers:
- Anton Novitskii | The power of first-order smooth optimization for black-box non-smooth problems
- Darina Dvinskikh | Improved complexity bounds in Wasserstein barycenter problem
- Aleksandr Beznosikov | Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees
- Ekaterina Borodich | Optimal Gradient Sliding and its Application to Optimal Distributed Optimization Under Similarity
- Dmitry Yarotsky | A new analytic approach to SGD with momentum and its applications: phase transitions and benefit from negative momenta
11:15 - 11:45 | Coffee break
11:45 - 13:00 | Sessions 3 & 4
Session 3: Applied ML 2 (Small Hall)
Speakers:
- Ruslan Rakhimov | NPBG++: Accelerating Neural Point-Based Graphics
- Denis Kuznedelev | oViT: A Sparsification Framework for Vision Transformers
- Andrey Savchenko & Liudmila Savchenko | Video-based facial expression recognition and engagement prediction for mobile devices
- Alexander Gushchin & Maksim Smirnov & Sergey Lavrushkin | Video compression dataset and benchmark of learning-based video-quality metrics
- Ivan Rubachev | On Embeddings for Numerical Features in Tabular Deep Learning
Session 4: Computational ML (Big Hall)
Speakers:
- Andrei Chertkov & Konstantin Sozykin | TTOpt: A Maximum Volume Quantized Tensor Train-based Optimization and its Application to Reinforcement Learning
- Daniil Tiapkin | Optimistic Posterior Sampling for Reinforcement Learning with Few Samples and Tight Guarantees
- Sergey Samsonov | Local-Global MCMC kernels: the best of both worlds
- Maxim Rakhuba & Alexandra Senderovich | Towards Practical Computation of Singular Values of Convolutional Layers
13:00 - 14:00 | Free time
14:00 - 15:15 | Sessions 5 & 6
Session 5: Theoretical ML (Big Hall)
Speakers:
- Nazar Buzun | Strong Gaussian Approximation for the Sum of Random Vectors
- Nikita Puchkin | Exponential savings in agnostic active learning through abstention
- Maxim Kodryan | Training Scale-Invariant Neural Networks on the Sphere Can Happen in Three Regimes
- Alexey Kornaev | Physics-based loss and machine learning approach in application to non-Newtonian fluids flow modeling
- Alexey Naumov | Finite-time High-probability Bounds for Polyak-Ruppert Averaged Iterates of Linear Stochastic Approximation
Session 6: Generative modeling and representation learning (Small Hall)
Speakers:
- Fedor Noskov | Nonparametric Uncertainty Quantification for Single Deterministic Neural Network
- Aybek Alanov | HyperDomainNet: Universal Domain Adaptation for Generative Adversarial Networks
- Alexander Korotin | Kantorovich Strikes Back! Wasserstein GANs are not Optimal Transport?
- Mikhail Pautov | Smoothed Embeddings for Certified Few-Shot Learning
15:15 - 15:45 | Coffee break
15:45 - 16:00 | Conference Photo
16:00 - 16:05 | MML Olympiad award ceremony
16:05 - 17:30 | Poster session
17:30 - 19:30 | Conference Dinner
Poster Session
We invite authors of A* papers (2021-2022) to present their work.
Poster Submission Guidelines: Only posters of submitted and accepted abstracts will be given presentation space. Please note the following information when preparing your poster: bring your printed poster with you to the School.