Faculty of Computer Science Mathematical Seminar "Algorithmic probability and the information distance"

Bruno Frederik L. Bauwens
Department of Big Data and Information Retrieval: Associate Professor

October 3, 2025, at 18:10

Speaker: Bruno Bauwens, Faculty of Computer Science, HSE University

Topic: Algorithmic probability and the information distance

Abstract:

The conditional Kolmogorov complexity of a string x given a string y is the minimal length of a program that, on input y, prints x and halts. Andrei Kolmogorov promoted the study of this notion in 1964, seeking a theory that could somehow justify his famous axioms of probability. But to connect it to probability, one should use a variant of complexity based on self-delimiting programs. This notion can be defined in four different ways, one of which is via the logarithm of algorithmic probability (in discrete form). This probability was introduced by Solomonoff in 1960 to describe learning in a very general way.
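
In standard notation (the symbols C, K, \mathbf{m}, U and V below do not appear in the announcement itself), with U a universal plain machine and V a universal prefix-free (self-delimiting) machine, the notions above read

  C(x \mid y) = \min \{ |p| : U(p, y) = x \}   (plain conditional complexity),
  K(x \mid y) = \min \{ |p| : V(p, y) = x \}   (prefix conditional complexity),
  \mathbf{m}(x) = \sum_{p : V(p) = x} 2^{-|p|}   (discrete algorithmic probability).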

In various applications, there is a need for a symmetric notion of conditional complexity. The first proposal, from 1998, is to consider the minimal length of a program that prints x on input y and also prints y on input x. The authors also proved that the other symmetrized definitions of conditional complexity are close to each other, but conjectured that they do not coincide. Recently, I proved this conjecture and also showed that the four definitions differ only in strange corner cases (for example, when one string is exponentially more complex than the other).
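
In the same standard notation (again, not taken from the announcement itself), the 1998 proposal and the max-type quantity it is compared against are

  E_0(x, y) = \min \{ |p| : V(p, y) = x \text{ and } V(p, x) = y \},
  E(x, y) = \max \{ K(x \mid y), K(y \mid x) \},

and the 1998 paper shows that these two coincide up to an additive logarithmic term.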

In this talk, I will briefly discuss applications of algorithmic probability and the algorithmic information distance to machine learning. Then I will prove the coding theorem and its approximate bidirectional variant. Finally, I will discuss recent results.
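
For reference, the coding theorem mentioned here is, in its standard form, the statement that prefix complexity and discrete algorithmic probability agree up to an additive constant:

  K(x) = -\log \mathbf{m}(x) + O(1).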

Venue: Pokrovsky Bulvar 11, room R306.
