Faculty of Computer Science
Contacts

Dean — Ivan Arzhantsev

First Deputy Dean — Tamara Voznesenskaya

Deputy Dean for Research and International Relations — Sergei Obiedkov

Deputy Dean for Methodical and Educational Work — Ilya Samonenko

Deputy Dean for Development, Finance and Administration — Irina Plisetskaya

Phone: +7 (495) 772-95-90, ext. 12332

computerscience@hse.ru

125319, Moscow, 3 Kochnovsky Proezd (near metro station 'Aeroport'). 

Events
Feb 22 – Feb 23
Registration is open
Mar 21 – Mar 23
Papers Submission Deadline: 15 January 2019 
Jun 12 – Jun 14
submission: Friday, 01 February 2019, notification: Friday, 15 February 2019 
Aug 26 – Aug 30
Registration and Poster Submission deadline — April 1, 2019 
Article
Ontology-Mediated Queries: Combined Complexity and Succinctness of Rewritings via Circuit Complexity

Bienvenu M., Kikot S., Kontchakov R. et al.

Journal of the ACM. 2018. Vol. 65. No. 5. P. 28:1-28:51.

Article
Randomized Block Cubic Newton Method
In press

Doikov Nikita, Richtarik P.

Proceedings of Machine Learning Research. 2018. No. 80. P. 1290-1298.

Article
Particle-identification techniques and performance at LHCb in Run 2
In press

Hushchyn M., Chekalina V.

Nuclear Instruments and Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. 2018. P. 1-2.

Article
Observational evidence in favor of scale free evolution of sunspot groups

Shapoval A., Le Mouël J., Shnirman M. et al.

Astronomy and Astrophysics. 2018. Vol. 618. P. A183-1-A183-13.

Colloquium: Learning on networks of distributions for discrete data. Speaker: Wray Buntine, Monash University

Event ended

I will motivate the talk by reviewing some state-of-the-art models for problems such as link prediction and tweet clustering via matrix factorisation. Then I will review the classes of distributions that can be strung together in networks to generate discrete data. This allows a rich class of models that, in its simplest form, covers things like Poisson matrix factorisation, latent Dirichlet allocation, and stochastic block models, but, more generally, covers complex hierarchical models on network and text data. The distributions covered include so-called non-parametric distributions such as the Gamma process. Accompanying these is a set of collapsing and augmentation techniques that are used to derive fast Gibbs samplers for many models in this class. To complete this picture of turning complex network models into fast Gibbs samplers, I will illustrate our recent methods for matrix factorisation with side information (e.g., GloVe word embeddings), applied to link prediction, for instance, on citation networks.
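For readers unfamiliar with the augmentation trick the abstract mentions, here is a minimal sketch of Gibbs sampling for Gamma-Poisson matrix factorisation. This is a standard textbook formulation, not necessarily the speaker's exact model; the function name and hyperparameters are illustrative:

```python
import numpy as np

def gibbs_poisson_mf(X, K=3, a=1.0, b=1.0, n_iter=200, seed=0):
    """Gamma-Poisson matrix factorisation via an augmented Gibbs sampler.

    Assumed model (standard form, illustrative only):
        X[i, j] ~ Poisson(sum_k W[i, k] * H[k, j])
        W, H    ~ Gamma(shape=a, rate=b) entrywise
    Each observed count is augmented into K per-component latent counts,
    which restores Gamma conjugacy and gives closed-form conditionals.
    """
    rng = np.random.default_rng(seed)
    I, J = X.shape
    W = rng.gamma(a, 1.0 / b, size=(I, K))
    H = rng.gamma(a, 1.0 / b, size=(K, J))
    for _ in range(n_iter):
        # 1) Allocate each count X[i, j] across the K components
        #    (multinomial augmentation step).
        rates = W[:, :, None] * H[None, :, :]          # shape (I, K, J)
        probs = rates / rates.sum(axis=1, keepdims=True)
        Z = np.zeros((I, K, J))
        for i in range(I):
            for j in range(J):
                Z[i, :, j] = rng.multinomial(X[i, j], probs[i, :, j])
        # 2) Conjugate Gamma updates given the allocated counts
        #    (numpy's gamma takes a scale, i.e. 1/rate).
        W = rng.gamma(a + Z.sum(axis=2), 1.0 / (b + H.sum(axis=1)))
        H = rng.gamma(a + Z.sum(axis=0), 1.0 / (b + W.sum(axis=0))[:, None])
    return W, H
```

The augmentation in step 1 is what makes step 2 conjugate: conditioned on the per-component counts, each factor entry has a Gamma posterior, so the whole sampler needs no Metropolis corrections.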

Venue:
Moscow, 3 Kochnovsky Proezd, room 317, 18:10

Everyone interested is welcome to attend.

If you need a pass to HSE, please contact computerscience@hse.ru