2021-07-01-alimisis21a.md

---
title: Communication-Efficient Distributed Optimization with Quantized Preconditioners
abstract: 'We investigate fast and communication-efficient algorithms for the classic problem of minimizing a sum of strongly convex and smooth functions that are distributed among $n$ different nodes, which can communicate using a limited number of bits. Most previous communication-efficient approaches for this problem are limited to first-order optimization, and therefore have \emph{linear} dependence on the condition number in their communication complexity. We show that this dependence is not inherent: communication-efficient methods can in fact have sublinear dependence on the condition number. For this, we design and analyze the first communication-efficient distributed variants of preconditioned gradient descent for Generalized Linear Models, and for Newton’s method. Our results rely on a new technique for quantizing both the preconditioner and the descent direction at each step of the algorithms, while controlling their convergence rate. We also validate our findings experimentally, showing faster convergence and reduced communication relative to previous methods.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: alimisis21a
month: 0
tex_title: Communication-Efficient Distributed Optimization with Quantized Preconditioners
firstpage: 196
lastpage: 206
page: 196-206
order: 196
cycles: false
bibtex_author: Alimisis, Foivos and Davies, Peter and Alistarh, Dan
author:
- given: Foivos
  family: Alimisis
- given: Peter
  family: Davies
- given: Dan
  family: Alistarh
date: 2021-07-01
address:
container-title: Proceedings of the 38th International Conference on Machine Learning
volume: '139'
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 1
pdf:
extras:
---
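
The abstract above describes quantizing both the preconditioner and the descent direction at every iteration of a preconditioned method. Below is a minimal, single-machine sketch of that general idea for a least-squares Generalized Linear Model: the uniform deterministic quantizer, the function names, and the synthetic-data setup are illustrative assumptions and are not taken from the paper or its code.

```python
import numpy as np

def quantize(v, step=1e-2):
    # Uniform deterministic quantizer (illustrative stand-in for the paper's
    # quantization scheme): round every entry to a grid of width `step`.
    return step * np.round(np.asarray(v) / step)

def local_grad_hess(X, y, w):
    # Gradient and Hessian of the local least-squares loss, a simple GLM instance.
    m = X.shape[0]
    r = X @ w - y
    return X.T @ r / m, X.T @ X / m

def quantized_precond_gd(node_data, w0, iters=50, step=1e-2, lam=1e-6):
    # Each node transmits a quantized local gradient and a quantized local Hessian
    # (the preconditioner); the coordinator averages them, solves for the
    # preconditioned direction, quantizes it, and applies the update.
    w = w0.copy()
    for _ in range(iters):
        locals_ = [local_grad_hess(X, y, w) for X, y in node_data]
        g = np.mean([quantize(g_i, step) for g_i, _ in locals_], axis=0)
        H = np.mean([quantize(H_i, step) for _, H_i in locals_], axis=0) + lam * np.eye(len(w))
        d = quantize(np.linalg.solve(H, g), step)  # quantized descent direction
        w = w - d
    return w

# Tiny usage example: synthetic least-squares data split across 4 nodes.
rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
node_data = []
for _ in range(4):
    X = rng.normal(size=(100, 5))
    node_data.append((X, X @ w_true + 0.01 * rng.normal(size=100)))
w_hat = quantized_precond_gd(node_data, np.zeros(5))
print("parameter error:", np.linalg.norm(w_hat - w_true))
```

This sketch only mirrors the round structure implied by the abstract (quantize local preconditioners and gradients, then quantize the resulting direction); the paper's actual methods, quantizer, and bit-budget analysis differ and should be consulted directly.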