


Mini-course: Advanced statistical methods. Vladimir Spokoiny (HSE, Russia/Weierstrass Institute, Germany)



Course Schedule:

12, 19, 26 February 18:10 – 21:00, room 205 (3, Kochnovsky Proezd)
13, 20, 27 February 18:10 – 21:00, room 509 (3, Kochnovsky Proezd)

A pass to the HSE building can be ordered at kkorshunova@hse.ru.

Short Course Description

This course introduces the main notions, approaches, and methods of nonparametric statistics. The main topics include smoothing and regularization, model selection and parameter tuning, structural inference, efficiency and rate efficiency, local and sieve parametric approaches. The study is mainly limited to regression and density models. The topics of this course form an essential basis for working with complex data structures using modern statistical tools.

Course structure: lectures, seminars, exam.


Prerequisites: probability theory, linear algebra, mathematical analysis.

Nonparametric Regression

  • Regression models: design, errors, and response function.
  • Projection estimation. The case of orthogonal design.
  • Bias, variance, risk of estimation, rate and accuracy, smoothness classes.
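The bias-variance trade-off behind these bullet points can be seen in a small simulation. The sketch below (not course material; the sine regression function, cosine basis, sample size, and noise level are all illustrative choices) fits projection estimators of growing dimension on an equispaced design:

```python
import math
import random

random.seed(0)

# simulated regression model Y_i = f(x_i) + eps_i on an equispaced design
n = 200
xs = [(i + 0.5) / n for i in range(n)]

def f(x):
    return math.sin(2 * math.pi * x)

sigma = 0.3
ys = [f(x) + random.gauss(0, sigma) for x in xs]

def phi(j, x):
    # cosine basis; exactly orthonormal on this equispaced design (DCT-II points)
    return 1.0 if j == 0 else math.sqrt(2) * math.cos(math.pi * j * x)

def projection_estimate(m):
    # empirical coefficients theta_j = n^{-1} sum_i Y_i phi_j(x_i), j < m
    theta = [sum(y * phi(j, x) for x, y in zip(xs, ys)) / n for j in range(m)]
    return lambda x: sum(t * phi(j, x) for j, t in enumerate(theta))

def risk(fhat):
    # mean squared error against the true response function at the design points
    return sum((fhat(x) - f(x)) ** 2 for x in xs) / n

# too few terms: squared bias dominates; too many: variance (~ sigma^2 * m / n)
risks = {m: risk(projection_estimate(m)) for m in (1, 10, 50)}
```

Under these assumptions the risk is large both for very small m (bias) and very large m (variance), and smallest in between, which is exactly the trade-off the bullets describe.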

Regularization and roughness penalty

Ridge regression, roughness penalty, penalized maximum likelihood estimation, impact of regularization, modeling bias, complexity, bias-variance trade-off.
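In an orthonormal basis the penalized least-squares problem has a closed form: minimizing the residual sum of squares plus a penalty lam * sum_j w_j * theta_j^2 shrinks each empirical coefficient by the factor 1 / (1 + lam * w_j). A sketch with weights w_j = j^4, mimicking a second-derivative roughness penalty (all simulation constants are illustrative):

```python
import math
import random

random.seed(1)

# noisy observations of f(x) = sin(2*pi*x) on an equispaced design
n = 200
xs = [(i + 0.5) / n for i in range(n)]

def f(x):
    return math.sin(2 * math.pi * x)

ys = [f(x) + random.gauss(0, 0.3) for x in xs]

def phi(j, x):
    return 1.0 if j == 0 else math.sqrt(2) * math.cos(math.pi * j * x)

J = 40  # number of basis coefficients kept before penalization
theta = [sum(y * phi(j, x) for x, y in zip(xs, ys)) / n for j in range(J)]

def ridge_estimate(lam):
    # penalized LS with roughness weights j^4: coefficient j is shrunk
    # by 1 / (1 + lam * j^4); lam = 0 recovers plain least squares
    shrunk = [t / (1.0 + lam * j ** 4) for j, t in enumerate(theta)]
    return lambda x: sum(t * phi(j, x) for j, t in enumerate(shrunk))

def risk(fhat):
    return sum((fhat(x) - f(x)) ** 2 for x in xs) / n

# no penalty: rough fit (variance); heavy penalty: oversmoothing (modeling bias)
risks = {lam: risk(ridge_estimate(lam)) for lam in (0.0, 1e-3, 10.0)}
```

A moderate penalty beats both extremes here, illustrating how regularization trades modeling bias against complexity.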

Model selection by SURE and AIC, cross-validation

Problem of model choice. Penalized model selection. Akaike information criterion (AIC). Stein's unbiased risk estimate (SURE). Parameter tuning by cross-validation.
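For the projection estimator in an orthogonal design, the risk can be estimated unbiasedly up to a constant: subtract the explained energy and add a 2*sigma^2*m/n complexity penalty (this coincides with Mallows' C_p / SURE in this setting). A sketch assuming the noise level sigma is known; all simulation constants are illustrative:

```python
import math
import random

random.seed(2)

n = 200
sigma = 0.3  # assumed known here
xs = [(i + 0.5) / n for i in range(n)]

def f(x):
    return math.sin(2 * math.pi * x)

ys = [f(x) + random.gauss(0, sigma) for x in xs]

def phi(j, x):
    return 1.0 if j == 0 else math.sqrt(2) * math.cos(math.pi * j * x)

J = 40
theta = [sum(y * phi(j, x) for x, y in zip(xs, ys)) / n for j in range(J)]

def risk_estimate(m):
    # unbiased estimate (up to a constant) of the risk of the m-term
    # projection estimator: -sum_{j<m} theta_j^2 + 2 * sigma^2 * m / n
    return -sum(t * t for t in theta[:m]) + 2 * sigma ** 2 * m / n

# data-driven model choice: minimize the estimated risk over the dimension m
best_m = min(range(1, J + 1), key=risk_estimate)
```

When sigma is unknown, cross-validation plays the same role: the model dimension is tuned by minimizing held-out prediction error instead of the analytic risk estimate.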

Model selection by the smallest accepted rule

Ordered model selection and multiple testing. The smallest accepted rule. Parameter tuning by the propagation condition and multiplicity correction.
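One simplified reading of the smallest-accepted rule (a sketch only; the propagation-condition calibration taught in the course is more involved): walk through the ordered models and accept model m when none of the left-out empirical coefficients is significant under a multiplicity-corrected critical value, then select the smallest accepted m. All simulation constants below are illustrative:

```python
import math
import random

random.seed(3)

n = 200
sigma = 0.3  # assumed known here
xs = [(i + 0.5) / n for i in range(n)]

def f(x):
    return math.sin(2 * math.pi * x)

ys = [f(x) + random.gauss(0, sigma) for x in xs]

def phi(j, x):
    return 1.0 if j == 0 else math.sqrt(2) * math.cos(math.pi * j * x)

J = 40
theta = [sum(y * phi(j, x) for x, y in zip(xs, ys)) / n for j in range(J)]

# critical value with a Bonferroni-type correction over the J tests;
# each empirical coefficient is roughly N(true value, sigma^2 / n)
crit = sigma * math.sqrt(2 * math.log(J)) / math.sqrt(n)

def accepted(m):
    # model m is accepted if no left-out coefficient is significant
    return all(abs(t) <= crit for t in theta[m:])

# the smallest accepted model (J itself is accepted vacuously)
m_hat = next(m for m in range(1, J + 1) if accepted(m))
```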

Wavelet methods: hard and soft thresholding

Sequence space model. Wavelet decomposition. Nonlinear wavelet estimation. Hard and soft thresholding.
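Hard and soft thresholding are coordinatewise rules applied to the empirical wavelet (or any sequence-space) coefficients. A minimal sketch, with the classical universal threshold sigma * sqrt(2 log n) and illustrative coefficient values:

```python
import math

def hard_threshold(x, t):
    # keep a coefficient unchanged if it is large, kill it otherwise
    return x if abs(x) > t else 0.0

def soft_threshold(x, t):
    # shrink every coefficient towards zero by t, killing the small ones
    return math.copysign(max(abs(x) - t, 0.0), x)

# universal threshold for n coefficients with noise level sigma
sigma, n = 0.1, 1024
t_univ = sigma * math.sqrt(2 * math.log(n))

noisy = [1.2, -0.8, 0.05, -0.03, 0.4]
denoised_hard = [hard_threshold(c, t_univ) for c in noisy]
denoised_soft = [soft_threshold(c, t_univ) for c in noisy]
```

Hard thresholding is a keep-or-kill rule; soft thresholding additionally shrinks the surviving coefficients, which makes the resulting estimator continuous in the data.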

Density model: Kernel and projection methods

Nadaraya-Watson estimator, high-order kernels. Rate of estimation. Reduction to sequence space model. Wavelet density estimation.
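The Nadaraya-Watson estimator is a locally weighted average of the responses, with weights given by a kernel scaled by a bandwidth h. A sketch with a Gaussian kernel (the regression function x^2, the bandwidth, and the noise level are illustrative choices):

```python
import math
import random

random.seed(4)

def gauss_kernel(u):
    return math.exp(-0.5 * u * u)

def nadaraya_watson(x, xs, ys, h):
    # locally weighted average: weights decay with the distance |x - x_i| / h
    weights = [gauss_kernel((x - xi) / h) for xi in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

# noisy sample from f(x) = x^2 on a uniform random design
n = 300
xs = [random.random() for _ in range(n)]
ys = [x * x + random.gauss(0, 0.1) for x in xs]

fit = nadaraya_watson(0.5, xs, ys, h=0.1)  # estimates f(0.5) = 0.25
```

Choosing the bandwidth h faces the same bias-variance trade-off as before: small h gives a rough, high-variance fit, large h oversmooths; higher-order kernels reduce the bias for smoother regression functions.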

List of References Required or Recommended for the Course

Course materials/Textbooks:

  1. Wasserman, L. All of Nonparametric Statistics. Springer, 2006.
  2. Rohde, C. A. Introductory Statistical Inference with the Likelihood Function. Springer International Publishing, 2014.
  3. Tsybakov, A. B. Introduction to Nonparametric Estimation. Springer, 2009.
  4. Spokoiny, V. and Dickhaus, T. Basics of Modern Mathematical Statistics. Springer, 2015.

Additional sources:

  1. Boucheron, S., Bousquet, O. and Lugosi, G. "Theory of Classification: a Survey of Some Recent Advances." ESAIM: Probability and Statistics, 2005, Vol. 9, P. 323-375.
  2. Bartlett, P. L., Boucheron, S. and Lugosi, G. "Model Selection and Error Estimation." Machine Learning, 2002, Vol. 48, P. 85-113.