Bayesian Methods Group Seminar: Variance Networks. Speaker: Dmitry Molchanov, Faculty of Computer Science, HSE University

The event has ended.

On May 11 at 18:45, the School of Data Analysis will host the next seminar of the Bayesian methods group.

Speaker: Dmitry Molchanov, Faculty of Computer Science, HSE University

Variance Networks

During this talk, I will introduce variance networks, a model that stores the learned information in the variances of the network weights.

Surprisingly, no information is stored in the expectations of the weights: if we replace the weights with their expectations, the network's predictions become no better than random guessing.
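As a rough illustration (my own sketch, not the speaker's code), such a layer can be written in PyTorch as a linear layer whose weights have a fixed zero mean and a learned variance; setting `use_mean=True` replaces the sampled weights with their expectation, which zeroes the layer and reproduces the "random guess" regime described above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VarianceLinear(nn.Module):
    """Minimal sketch of a variance layer: all learned information
    lives in `log_sigma`; the weight expectation is identically zero."""

    def __init__(self, in_features, out_features):
        super().__init__()
        # Learn log standard deviations instead of weight values.
        self.log_sigma = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x, use_mean=False):
        if use_mean:
            # E[w] = 0, so the output collapses to the bias term:
            # the network's learned information is discarded.
            weight = torch.zeros_like(self.log_sigma)
        else:
            # Sample w ~ N(0, sigma^2) via the reparameterization trick.
            weight = torch.exp(self.log_sigma) * torch.randn_like(self.log_sigma)
        return F.linear(x, weight, self.bias)
```

At test time, predictions would come from averaging several stochastic forward passes (an ensemble over weight samples), whereas the deterministic `use_mean=True` pass gives near-chance accuracy.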

We will discuss how and why this model works, and will see how it naturally arises in several types of Bayesian Neural Networks.

Then we will discuss a heuristic that uses the loss curvature to determine which random variables can be replaced with their expected values, and see that only a small fraction of weights is needed for ensembling.
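One simple way to realize such a heuristic (an assumption on my part, not necessarily the exact criterion from the talk) is to approximate per-weight loss curvature with the diagonal Fisher information, i.e. averaged squared gradients, and keep only the highest-curvature weights stochastic while replacing the rest with their expected values.

```python
import torch

def curvature_scores(model, loss_fn, data_loader):
    """Approximate per-parameter loss curvature with the diagonal Fisher
    (mean squared gradient over a few batches). Hypothetical helper."""
    scores = {name: torch.zeros_like(p) for name, p in model.named_parameters()}
    n_batches = 0
    for x, y in data_loader:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for name, p in model.named_parameters():
            if p.grad is not None:
                scores[name] += p.grad.detach() ** 2
        n_batches += 1
    return {name: s / max(n_batches, 1) for name, s in scores.items()}

def stochastic_mask(scores, keep_fraction=0.05):
    """Boolean mask per parameter: True = keep the weight stochastic,
    False = replace this random variable with its expected value."""
    all_scores = torch.cat([s.flatten() for s in scores.values()])
    threshold = torch.quantile(all_scores, 1.0 - keep_fraction)
    return {name: s >= threshold for name, s in scores.items()}
```

With a small `keep_fraction`, only a few percent of the weights remain random, matching the claim that a small fraction of weights suffices for ensembling.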

The success of this model has several counter-intuitive implications for how Deep Learning models are trained and applied.

Venue: School of Data Analysis, 11 Timura Frunze St., building 2, Oxford auditorium.