2021-07-21-chewi21a.md

---
title: Optimal dimension dependence of the Metropolis-Adjusted Langevin Algorithm
abstract: 'Conventional wisdom in the sampling literature, backed by a popular diffusion
  scaling limit, suggests that the mixing time of the Metropolis-Adjusted Langevin
  Algorithm (MALA) scales as $O(d^{1/3})$, where $d$ is the dimension. However, the
  diffusion scaling limit requires stringent assumptions on the target distribution
  and is asymptotic in nature. In contrast, the best known non-asymptotic mixing time
  bound for MALA on the class of log-smooth and strongly log-concave distributions
  is $O(d)$. In this work, we establish that the mixing time of MALA on this class
  of target distributions is $\tilde\Theta(d^{1/2})$ under a warm start. Our upper
  bound proof introduces a new technique based on a projection characterization of
  the Metropolis adjustment, which reduces the study of MALA to the well-studied discretization
  analysis of the Langevin SDE and bypasses direct computation of the acceptance probability.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: chewi21a
month: 0
tex_title: Optimal dimension dependence of the Metropolis-Adjusted Langevin Algorithm
firstpage: 1260
lastpage: 1300
page: 1260-1300
order: 1260
cycles: false
bibtex_author: Chewi, Sinho and Lu, Chen and Ahn, Kwangjun and Cheng, Xiang and Gouic,
  Thibaut Le and Rigollet, Philippe
author:
- given: Sinho
  family: Chewi
- given: Chen
  family: Lu
- given: Kwangjun
  family: Ahn
- given: Xiang
  family: Cheng
- given: Thibaut Le
  family: Gouic
- given: Philippe
  family: Rigollet
date: 2021-07-21
container-title: Proceedings of Thirty Fourth Conference on Learning Theory
volume: '134'
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 21
---