Dean — Ivan Arzhantsev
First Deputy Dean — Tamara Voznesenskaya
Deputy Dean for Research and International Relations — Sergei Obiedkov
Deputy Dean for Development, Finance and Administration — Irina Plisetskaya
Phone: +7 (495) 772-95-90, ext. 12332
125319, Moscow, 3 Kochnovsky Proezd (near metro station 'Aeroport').
The faculty trains developers and researchers. Its programme draws on the experience of leading American and European universities, such as Stanford University (US) and EPFL (Switzerland), as well as on the School of Data Analysis, one of the strongest postgraduate schools in computer science in Russia. A wide range of elective courses lets each student build an individual educational path, and learning at the faculty is grounded in practice and project work.
Vladimir Spokoiny (HSE, Russia/Weierstrass Institute, Germany)
12, 19, 26 February, 18:10–21:00, room 205 (3 Kochnovsky Proezd)
13, 20, 27 February, 18:10–21:00, room 509 (3 Kochnovsky Proezd)
A pass to the HSE building can be ordered at: email@example.com
This course introduces the main notions, approaches, and methods of nonparametric statistics. The main topics include smoothing and regularization, model selection and parameter tuning, structural inference, efficiency and rate efficiency, local and sieve parametric approaches. The study is mainly limited to regression and density models. The topics of this course form an essential basis for working with complex data structures using modern statistical tools.
Course structure: lectures, seminars, exam.
Prerequisites: probability theory, linear algebra, mathematical analysis.
Ridge regression, roughness penalty, penalized maximum likelihood estimation, impact of regularization, modeling bias, complexity, bias-variance trade-off.
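The regularization topic above can be illustrated with ridge regression, which trades a small modeling bias for a large reduction in variance. A minimal sketch, assuming synthetic data and an illustrative penalty value (the function name `ridge` and all data are hypothetical, not from the course materials):

```python
import numpy as np

def ridge(X, y, lam):
    """Return argmin_w ||y - Xw||^2 + lam * ||w||^2 via the normal equations."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Illustrative synthetic regression data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
w_true = np.array([1.0, -2.0, 0.0, 0.5, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=50)

w_ols = ridge(X, y, lam=0.0)      # ordinary least squares (no penalty)
w_pen = ridge(X, y, lam=100.0)    # heavy penalty shrinks the coefficients
print(np.linalg.norm(w_pen) < np.linalg.norm(w_ols))  # shrinkage effect
```

Increasing `lam` shrinks the coefficient norm toward zero, which is exactly the bias-variance trade-off the topic refers to.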
Problem of model choice. Penalized model selection. Akaike criterion. Stein unbiased risk estimation. Parameter tuning by cross-validation.
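Parameter tuning by cross-validation can be sketched as follows: split the data into folds, fit on all but one fold, score on the held-out fold, and pick the penalty with the smallest average error. The fold count, candidate grid, and data below are illustrative assumptions:

```python
import numpy as np

def ridge_fit(X, y, lam):
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def cv_error(X, y, lam, k=5):
    """Average held-out squared error of ridge regression over k folds."""
    idx = np.arange(len(y))
    err = 0.0
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        w = ridge_fit(X[train], y[train], lam)
        err += np.sum((y[fold] - X[fold] @ w) ** 2)
    return err / len(y)

# Illustrative synthetic data and candidate grid.
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 10))
y = X @ rng.normal(size=10) + 0.5 * rng.normal(size=80)
grid = [0.01, 0.1, 1.0, 10.0, 100.0]
best = min(grid, key=lambda lam: cv_error(X, y, lam))
print("selected lambda:", best)
```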
Ordered model selection and multiple testing. Smallest accepted rule. Parameter tuning by the propagation condition and multiplicity correction.
Sequence space model. Wavelet decomposition. Nonlinear wavelet estimation. Hard and soft thresholding.
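Hard and soft thresholding of coefficients (e.g. in a wavelet sequence space model) admit one-line implementations. A minimal sketch with an illustrative coefficient vector:

```python
import numpy as np

def hard_threshold(x, t):
    # Keep coefficients whose magnitude exceeds t, zero out the rest.
    return np.where(np.abs(x) > t, x, 0.0)

def soft_threshold(x, t):
    # Shrink every coefficient toward zero by t, clipping at zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

coeffs = np.array([3.0, -0.5, 1.2, -2.5, 0.1])
print(hard_threshold(coeffs, 1.0))  # values: [3.0, 0.0, 1.2, -2.5, 0.0]
print(soft_threshold(coeffs, 1.0))  # values: [2.0, 0.0, 0.2, -1.5, 0.0]
```

Hard thresholding keeps surviving coefficients untouched; soft thresholding additionally shrinks them by the threshold, which yields a continuous estimator.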
Nadaraya-Watson estimator, higher-order kernels. Rate of estimation. Reduction to the sequence space model. Wavelet density estimation.
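The Nadaraya-Watson estimator is a kernel-weighted average of the observed responses. A minimal sketch with a Gaussian kernel; the bandwidth and data are illustrative assumptions, not prescribed by the course:

```python
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, h):
    """Kernel regression: weighted average of y_train, Gaussian weights of bandwidth h."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

# Illustrative noisy observations of a smooth function.
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 2 * np.pi, 100))
y = np.sin(x) + 0.1 * rng.normal(size=100)

xq = np.linspace(0.5, 2 * np.pi - 0.5, 20)   # interior query points
yhat = nadaraya_watson(xq, x, y, h=0.3)
print("max error vs. true curve:", np.max(np.abs(yhat - np.sin(xq))))
```

The bandwidth `h` plays the role of the smoothing parameter whose choice the earlier model-selection topics address.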