2021-07-21-ding21a.md

File metadata and controls

52 lines (52 loc) · 1.98 KB
---
title: Random Coordinate Langevin Monte Carlo
abstract: 'Langevin Monte Carlo (LMC) is a popular Markov chain Monte Carlo sampling method. One drawback is that it requires the computation of the full gradient at each iteration, an expensive operation if the dimension of the problem is high. We propose a new sampling method: Random Coordinate LMC (RC-LMC). At each iteration, a single coordinate is randomly selected to be updated by a multiple of the partial derivative along this direction plus noise, while all other coordinates remain untouched. We investigate the total complexity of RC-LMC and compare it with the classical LMC for log-concave probability distributions. We show that when the gradient of the log-density is Lipschitz, RC-LMC is less expensive than the classical LMC if the log-density is highly skewed for high dimensional problems. Further, when both the gradient and the Hessian of the log-density are Lipschitz, RC-LMC is always cheaper than the classical LMC, by a factor proportional to the square root of the problem dimension. In the latter case, we use an example to demonstrate that our estimate of complexity is sharp with respect to the dimension.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: ding21a
month: 0
tex_title: Random Coordinate Langevin Monte Carlo
firstpage: 1683
lastpage: 1710
page: 1683-1710
order: 1683
cycles: false
bibtex_author: Ding, Zhiyan and Li, Qin and Lu, Jianfeng and Wright, Stephen J
author:
- given: Zhiyan
  family: Ding
- given: Qin
  family: Li
- given: Jianfeng
  family: Lu
- given: Stephen J
  family: Wright
date: 2021-07-21
address:
container-title: Proceedings of Thirty Fourth Conference on Learning Theory
volume: 134
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 21
pdf:
extras:
---
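
The abstract describes the per-iteration RC-LMC update: pick one coordinate uniformly at random, move it by a multiple of the partial derivative of the log-density along that direction plus noise, and leave all other coordinates untouched. Below is a minimal illustrative sketch in Python of one way such an update could look. The function name `rc_lmc`, the signature `partial_log_p(x, i)`, and the specific step-size and noise scaling (`sqrt(2*h)`) are assumptions for illustration only, not the authors' implementation or the exact scheme analyzed in the paper.

```python
import numpy as np


def rc_lmc(partial_log_p, x0, step_size, n_iters, rng=None):
    """Illustrative sketch of a random-coordinate Langevin-type chain.

    partial_log_p(x, i) is assumed to return the i-th partial derivative of
    the log-density at x, so each iteration only needs a single partial
    derivative rather than the full gradient.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    d = x.size
    samples = np.empty((n_iters, d))
    for k in range(n_iters):
        i = rng.integers(d)  # select one coordinate uniformly at random
        # Langevin-style step restricted to coordinate i; all other
        # coordinates remain untouched (assumed step/noise scaling).
        x[i] += step_size * partial_log_p(x, i) \
            + np.sqrt(2.0 * step_size) * rng.standard_normal()
        samples[k] = x
    return samples


if __name__ == "__main__":
    # Toy check: standard 2-D Gaussian target, where d/dx_i log p(x) = -x_i.
    chain = rc_lmc(lambda x, i: -x[i], x0=np.zeros(2),
                   step_size=0.05, n_iters=20000)
    print("sample mean:", chain.mean(axis=0))
    print("sample variance:", chain.var(axis=0))
```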