Joint HDI Lab & TFAIM Lab seminar «Diffusion models are now at the core of many generative systems»
On April 10 at 14:40, talks will be given by Nikita Gushchin (Skoltech, AIRI), Nikita Kornilov (Skoltech, BRAIn Lab MIPT), and David Li (MBZUAI). The seminar will take place in room R409.
Diffusion models are now at the core of many generative systems, enabling a wide range of practical applications, including text-to-image, text-to-video, image-to-image translation, and many other tasks in continuous domains. At the same time, discrete diffusion models, including diffusion language models, are emerging as a highly promising research direction for language and other discrete domains. Wherever diffusion models are applied, however, the next question is always the same: how can we make them fast enough for practical use? This question is often addressed by so-called diffusion distillation methods: post-training approaches that greatly reduce the number of forward passes a diffusion model needs to complete generation. In this talk, we will discuss inverse distillation, a general framework for distilling essentially any kind of generative model, including matching-based models such as diffusion, flow, and bridge models in both continuous and discrete domains. We will outline the main idea and discuss three core papers that introduced and advanced this approach. The first, Inverse Bridge Matching Distillation, originally proposed inverse distillation for bridge matching models and demonstrated superior results over prior methods for image-to-image diffusion distillation. We will then discuss Universal Inverse Distillation for Matching Models with Real-Data Supervision, which extended the approach to general matching models and showed that the framework naturally incorporates real-data supervision without relying on a standard empirical GAN loss. Finally, we will present the latest work in this direction, Inverse-distilled Diffusion Language Models, which extends the method to arbitrary types of discrete diffusion models and outperforms prior discrete distillation methods.
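The speed-up that distillation targets can be illustrated with a toy sketch. This is not the inverse distillation method from any of the papers above; it is a deliberately simplified stand-in in which the "teacher" generates by integrating a linear ODE over many Euler steps, and a one-step "student" map is fitted to reproduce the teacher's outputs. The ODE, constants, and function names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "teacher": a deterministic ODE dx/dt = MU - x, integrated with many
# Euler steps. This stands in for a multi-step diffusion/flow sampler that
# needs STEPS forward passes per sample.
MU, STEPS, DT = 3.0, 100, 0.05

def teacher_sample(x0):
    x = np.asarray(x0, dtype=float)
    for _ in range(STEPS):
        x = x + DT * (MU - x)  # one Euler step of the ODE
    return x

# "Distillation" in the loosest sense: fit a one-step linear student
# x -> a*x + b to (noise, teacher output) pairs by least squares.
noise = rng.standard_normal(10_000)
targets = teacher_sample(noise)
a, b = np.polyfit(noise, targets, 1)

def student_sample(x0):
    return a * np.asarray(x0, dtype=float) + b  # a single forward pass

# The student replaces 100 teacher steps with one evaluation.
test = rng.standard_normal(1_000)
err = np.max(np.abs(teacher_sample(test) - student_sample(test)))
print(f"max |teacher - student| = {err:.2e}")
```

Because the toy teacher is linear in its input, a linear student matches it almost exactly; the real difficulty, which the distillation papers address, is doing this for highly nonlinear neural samplers without access to such a closed form.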
Part I. Nikita Gushchin, “Inverse Bridge Matching Distillation” (https://arxiv.org/abs/2502.01362, ICML 2025)
Part II. Nikita Kornilov, “Universal Inverse Distillation for Matching Models with Real-Data Supervision (No GANs)” (https://arxiv.org/abs/2509.22459, ICLR 2026, oral)
Part III. David Li, “IDLM: Inverse-distilled Diffusion Language Models” (https://arxiv.org/abs/2602.19066)
The seminar will be held in English.
For all questions, please contact Karina Zelenova (kzelenova@hse.ru) or Ekaterina Gornostaeva (egornostaeva@hse.ru).
