
Mathematical Foundations of Prediction Theory


In this course we present mathematical methods for solving various machine learning problems (classification, regression, data mining, clustering, etc.). These methods are used to solve a wide variety of applied problems in many domains. The lectures also cover the auxiliary material needed to construct machine learning algorithms, drawn from optimization theory, algebra, mathematical statistics, discrete mathematics, and other fields.

Assistants:
Dmitry Vetrov,
Dmitry Kropotov,
Michael Figurnov,
Sergey Bartunov,
Alexandr Kirillov

Course program

  • Introduction to machine learning. Examples of applied tasks. Main problems in machine learning theory.
  • Optimal prediction rules and classifiers. Bayesian classifiers.
  • Estimation of generalization ability. Cross validation. Overfitting problem.
  • Theoretical bounds for generalization ability. Vapnik-Chervonenkis theory. Bias-variance decomposition.
  • ROC analysis.
  • Basic machine learning algorithms: k-NN and Fisher's linear discriminant.
  • Linear regression. Least squares algorithm (a short sketch follows this list).
  • Support Vector Machine.
  • Cluster analysis. K-means and the EM algorithm for Gaussian mixtures.
  • Dimensionality reduction. Principal Component Analysis and Multidimensional Scaling.
  • Survival analysis.
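
As an illustration of one of the listed topics, here is a minimal sketch of the least squares algorithm for linear regression in Python. It is not taken from the course materials; the synthetic data and all variable names are assumptions made for this example only.

# Minimal illustrative sketch: ordinary least squares for linear regression.
# The data are synthetic and generated for this example only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression problem: y = 2x + 1 + noise.
X = rng.uniform(-1, 1, size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.standard_normal(100)

# Add a bias column and solve the least squares problem
# min_w ||X_design w - y||^2 with a numerically stable routine.
X_design = np.hstack([X, np.ones((X.shape[0], 1))])
w, *_ = np.linalg.lstsq(X_design, y, rcond=None)

print("estimated slope and intercept:", w)  # close to [2.0, 1.0]

In practice the normal equations w = (X^T X)^{-1} X^T y give the same solution, but a least squares solver based on an orthogonal decomposition is preferred for numerical stability.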

 
