HDI Lab Seminar: Nonasymptotic Analysis of Stochastic Gradient Descent with the Richardson–Romberg Extrapolation
The next seminar will take place this Thursday, February 20, at 16:20. The speaker is Marina Sheshukova (HSE University and Skoltech).
Nonasymptotic Analysis of Stochastic Gradient Descent with the Richardson–Romberg Extrapolation
We address the problem of solving strongly convex and smooth minimization problems using the stochastic gradient descent (SGD) algorithm with a constant step size. Previous works suggested combining the Polyak-Ruppert averaging procedure with the Richardson-Romberg extrapolation technique to reduce the asymptotic bias of SGD at the expense of a mild increase in variance. We significantly extend previous results by providing an expansion of the mean-squared error of the resulting estimator with respect to the number of iterations n. More precisely, we show that the mean-squared error can be decomposed into the sum of two terms: a leading one of order O(n^{−1/2}) with explicit dependence on a minimax-optimal asymptotic covariance matrix, and a second-order term of order O(n^{−3/4}), where the exponent 3/4 is the best known. We also extend this result to the p-th moment bound while keeping the optimal scaling of the remainders with respect to n. Our analysis relies on the properties of the SGD iterates viewed as a time-homogeneous Markov chain. In particular, we establish that this chain is geometrically ergodic with respect to a suitably defined weighted Wasserstein semimetric.
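To illustrate the setup discussed in the abstract, here is a minimal sketch (not the authors' code) of constant-step SGD with Polyak-Ruppert averaging and Richardson-Romberg extrapolation on a synthetic strongly convex quadratic; the problem, step sizes, and noise model are illustrative assumptions, and the extrapolated estimator is the standard combination 2*avg(gamma) - avg(2*gamma) that cancels the first-order bias in the step size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative strongly convex quadratic f(theta) = 0.5 * theta^T A theta - b^T theta,
# with zero-mean Gaussian noise added to the gradient (assumed noise model).
d = 5
A = np.diag(np.linspace(1.0, 3.0, d))   # eigenvalues in [1, 3]: strong convexity and smoothness
b = rng.normal(size=d)
theta_star = np.linalg.solve(A, b)      # exact minimizer, used only to report errors

def stoch_grad(theta, noise_scale=1.0):
    return A @ theta - b + noise_scale * rng.normal(size=d)

def averaged_sgd(step, n_iters):
    """Constant-step SGD with Polyak-Ruppert averaging of the iterates."""
    theta = np.zeros(d)
    avg = np.zeros(d)
    for k in range(1, n_iters + 1):
        theta -= step * stoch_grad(theta)
        avg += (theta - avg) / k         # running average of the iterates
    return avg

n, gamma = 50_000, 0.05
bar_gamma = averaged_sgd(gamma, n)        # averaged iterates with step gamma
bar_2gamma = averaged_sgd(2 * gamma, n)   # averaged iterates with step 2 * gamma
theta_rr = 2 * bar_gamma - bar_2gamma     # Richardson-Romberg extrapolation

print("error, averaged SGD (step gamma):", np.linalg.norm(bar_gamma - theta_star))
print("error, RR-extrapolated estimator:", np.linalg.norm(theta_rr - theta_star))
```

In this sketch the averaged iterate at a fixed step size carries an asymptotic bias proportional to the step, so combining the runs at gamma and 2*gamma removes that leading bias term, at the cost of a mild increase in variance, as described in the abstract.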
The talk is based on the paper https://arxiv.org/pdf/2410.05106 (ICLR 2025).