Dean — Ivan Arzhantsev
First Deputy Dean — Tamara Voznesenskaya
Deputy Dean for Research and International Relations — Sergei Obiedkov
Deputy Dean for Finance and Administration — Irina Plisetskaya
Phone: +7 (495) 772-95-90, ext. 12332
Moscow, 3 Kochnovsky Proezd (near metro station 'Aeroport').
The faculty trains software developers and researchers. The programme was designed on the basis of the experience of leading American and European universities, such as Stanford University (U.S.) and EPFL (Switzerland). The design also drew on the School of Data Analysis, one of the strongest postgraduate schools in computer science in Russia. A wide range of elective courses allows each student to build an individual educational path. Learning at the faculty is based on practice and projects.
Baranov A., Derkach D., Filatov A. et al.
Journal of Physics: Conference Series. 2017. Vol. 934. No. 1. P. 12050-12054.
Gelfand M. S., Kaznadzey A., Shelyakin P.
Biology Direct. 2017.
Gurvich V., Nhan Bao H.
Discrete Applied Mathematics. 2018.
Konushin A., Nikitin M., Konushin V.
Computer Optics. 2017.
Arzhantsev I., Romaskevich E.
Proceedings of the American Mathematical Society. 2017. Vol. 145. No. 5. P. 1865-1879.
Theory of Computing Systems. 2017. Vol. 61. No. 4. P. 1440-1450.
I will motivate the talk by reviewing some state-of-the-art models for problems such as matrix factorisation for link prediction and tweet clustering. Then I will review the classes of distributions that can be strung together in networks to generate discrete data. This yields a rich class of models that, in its simplest form, covers Poisson matrix factorisation, latent Dirichlet allocation, and stochastic block models, and, more generally, complex hierarchical models on network and text data. The distributions covered include so-called non-parametric distributions such as the Gamma process. Accompanying these is a set of collapsing and augmentation techniques used to derive fast Gibbs samplers for many models in this class. To complete this picture of turning complex network models into fast Gibbs samplers, I will illustrate our recent methods for matrix factorisation with side information (e.g., GloVe word embeddings), applied to link prediction, for instance on citation networks.
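To make the augmentation idea in the abstract concrete, below is a minimal sketch (not taken from the talk) of a Gibbs sampler for gamma-Poisson matrix factorisation in Python/NumPy. Each observed count is split into per-component latent counts via a multinomial draw, after which the Gamma priors are conjugate and the factor updates have closed form. The function name `gibbs_pmf` and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def gibbs_pmf(X, K=5, a=1.0, b=1.0, n_iter=100, seed=0):
    """Gibbs sampler for gamma-Poisson matrix factorisation:
    X[i, j] ~ Poisson(theta[i] @ beta[:, j]),
    theta[i, k] ~ Gamma(a, rate=b),  beta[k, j] ~ Gamma(a, rate=b).
    """
    rng = np.random.default_rng(seed)
    I, J = X.shape
    theta = rng.gamma(a, 1.0 / b, size=(I, K))
    beta = rng.gamma(a, 1.0 / b, size=(K, J))
    for _ in range(n_iter):
        # Augmentation: split each observed count X[i, j] into K latent
        # counts S[i, j, :] with a multinomial whose weights are
        # proportional to theta[i, k] * beta[k, j].
        S = np.zeros((I, J, K))
        for i in range(I):
            for j in range(J):
                if X[i, j] > 0:
                    p = theta[i] * beta[:, j]
                    S[i, j] = rng.multinomial(X[i, j], p / p.sum())
        # Conjugacy: given the latent counts, each factor entry has a
        # closed-form Gamma conditional.
        theta = rng.gamma(a + S.sum(axis=1),
                          1.0 / (b + beta.sum(axis=1)))
        beta = rng.gamma(a + S.sum(axis=0).T,
                         1.0 / (b + theta.sum(axis=0))[:, None])
    return theta, beta
```

The same augment-then-conjugate pattern underlies fast Gibbs samplers for the broader model class the abstract describes; a production sampler would vectorise the inner loops over the nonzero entries of X.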
Venue: Moscow, 3 Kochnovsky Proezd, room 317, 18:10
Everyone interested is welcome to attend.
If you need a pass to HSE, please contact firstname.lastname@example.org