2022-06-28-li22b.md

File metadata and controls

48 lines (48 loc) · 1.69 KB
---
title: Statistical Estimation and Online Inference via Local SGD
abstract: 'We analyze the novel Local SGD in Federated Learning, a multi-round estimation procedure that uses intermittent communication to improve communication efficiency. Under a $2{+}\delta$ moment condition on stochastic gradients, we first establish a {\it functional central limit theorem} showing that the averaged iterates of Local SGD converge weakly to a rescaled Brownian motion. We then provide two iterative inference methods: the {\it plug-in} and the {\it random scaling}. Random scaling constructs an asymptotically pivotal statistic for inference by using the information along the whole Local SGD path. Both methods are communication efficient and applicable to online data. Our results show that Local SGD simultaneously achieves both statistical efficiency and communication efficiency.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: li22b
month: 0
tex_title: Statistical Estimation and Online Inference via Local SGD
firstpage: 1613
lastpage: 1661
page: 1613-1661
order: 1613
cycles: false
bibtex_author: Li, Xiang and Liang, Jiadong and Chang, Xiangyu and Zhang, Zhihua
author:
- given: Xiang
  family: Li
- given: Jiadong
  family: Liang
- given: Xiangyu
  family: Chang
- given: Zhihua
  family: Zhang
date: 2022-06-28
address:
container-title: Proceedings of Thirty Fifth Conference on Learning Theory
volume: '178'
genre: inproceedings
issued:
  date-parts:
  - 2022
  - 6
  - 28
pdf:
extras:
---
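The abstract describes Local SGD: each worker runs several local SGD steps and the iterates are averaged only at intermittent communication rounds. A minimal sketch of that loop, on a toy quadratic objective, might look as follows; the `local_sgd` helper, its parameters, and the Gaussian gradient-noise model are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def local_sgd(grad, x0, workers=4, rounds=10, local_steps=5, lr=0.1, seed=0):
    """Sketch of Local SGD: local steps on each worker, periodic averaging.

    `grad` is the population gradient; Gaussian noise stands in for the
    stochasticity of per-sample gradients (an assumption for this demo).
    """
    rng = np.random.default_rng(seed)
    # One copy of the parameter vector per worker.
    x = np.tile(np.asarray(x0, dtype=float), (workers, 1))
    for _ in range(rounds):
        for w in range(workers):
            for _ in range(local_steps):
                noise = rng.normal(scale=0.1, size=x[w].shape)
                x[w] -= lr * (grad(x[w]) + noise)  # local stochastic step
        # Intermittent communication: replace every iterate by the average.
        x[:] = x.mean(axis=0)
    return x[0]

# Toy quadratic f(x) = ||x - 1||^2 / 2, whose minimizer is x = (1, 1).
est = local_sgd(lambda x: x - 1.0, x0=[0.0, 0.0])
```

Averaging only once per round (rather than after every step) is what reduces communication; the paper's functional CLT concerns the trajectory of these averaged iterates.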