| Field | Value |
|---|---|
| title | Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles |
| abstract | Hamilton and Moitra (2021) showed that, in certain regimes, it is not possible to accelerate Riemannian gradient descent in the hyperbolic plane if we restrict ourselves to algorithms which make queries in a (large) bounded domain and which receive gradients and function values corrupted by a (small) amount of noise. We show that acceleration remains unachievable for any deterministic algorithm which receives exact gradient and function-value information (unbounded queries, no noise). Our results hold for a large class of Hadamard manifolds including hyperbolic spaces and the symmetric space $\mathrm{SL}(n, \mathbb{R})/\mathrm{SO}(n)$ of positive definite $n \times n$ matrices of determinant one. |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | criscitiello22a |
| month | 0 |
| tex_title | Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles |
| firstpage | 496 |
| lastpage | 542 |
| page | 496-542 |
| order | 496 |
| cycles | false |
| bibtex_author | Criscitiello, Christopher and Boumal, Nicolas |
| author | Criscitiello, Christopher; Boumal, Nicolas |
| date | 2022-06-28 |
| address | |
| container-title | Proceedings of Thirty Fifth Conference on Learning Theory |
| volume | 178 |
| genre | inproceedings |
| issued | 2022-06-28 |
| extras | |
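
For reference, these fields map onto a citation in the usual way. Below is a minimal BibTeX sketch assembled only from the values above; the entry type, key layout, and field selection follow common PMLR practice and are assumptions, not an official export.

```bibtex
@InProceedings{criscitiello22a,
  title     = {Negative curvature obstructs acceleration for strongly geodesically convex optimization, even with exact first-order oracles},
  author    = {Criscitiello, Christopher and Boumal, Nicolas},
  booktitle = {Proceedings of Thirty Fifth Conference on Learning Theory},
  pages     = {496--542},
  year      = {2022},
  volume    = {178},
  series    = {Proceedings of Machine Learning Research},
  publisher = {PMLR}
}
```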