2021-07-01-asi21a.md

File metadata and controls

57 lines (57 loc) · 2.15 KB
---
title: Private Adaptive Gradient Methods for Convex Optimization
abstract: We study adaptive methods for differentially private convex optimization, proposing and analyzing differentially private variants of a Stochastic Gradient Descent (SGD) algorithm with adaptive stepsizes, as well as the AdaGrad algorithm. We provide upper bounds on the regret of both algorithms and show that the bounds are (worst-case) optimal. As a consequence of our development, we show that our private versions of AdaGrad outperform adaptive SGD, which in turn outperforms traditional SGD in scenarios with non-isotropic gradients where (non-private) AdaGrad provably outperforms SGD. The major challenge is that the isotropic noise typically added for privacy dominates the signal in gradient geometry for high-dimensional problems; approaches to this that effectively optimize over lower-dimensional subspaces simply ignore the actual problems that varying gradient geometries introduce. In contrast, we study non-isotropic clipping and noise addition, developing a principled theoretical approach; the consequent procedures also enjoy significantly stronger empirical performance than prior approaches.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: asi21a
month: 0
tex_title: Private Adaptive Gradient Methods for Convex Optimization
firstpage: 383
lastpage: 392
page: 383-392
order: 383
cycles: false
bibtex_author: Asi, Hilal and Duchi, John and Fallah, Alireza and Javidbakht, Omid and Talwar, Kunal
author:
- given: Hilal
  family: Asi
- given: John
  family: Duchi
- given: Alireza
  family: Fallah
- given: Omid
  family: Javidbakht
- given: Kunal
  family: Talwar
date: 2021-07-01
address:
container-title: Proceedings of the 38th International Conference on Machine Learning
volume: 139
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 1
pdf:
extras:
---
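As a quick illustration of the clip-then-perturb pattern the abstract refers to, the sketch below shows a generic differentially private AdaGrad-style update: per-sample L2 clipping, averaging, isotropic Gaussian noise, then a per-coordinate adaptive step. This is a minimal sketch of the standard baseline, not the paper's non-isotropic clipping and noise procedure; the names (`dp_adagrad_step`, `clip_per_sample`), default parameters, and noise calibration are hypothetical and omit the actual privacy accounting.

```python
# Illustrative sketch only (hypothetical names/parameters), not the paper's algorithm.
import numpy as np

def clip_per_sample(grads, clip_norm):
    """Rescale each per-sample gradient to have L2 norm at most clip_norm."""
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    return grads * scale

def dp_adagrad_step(w, grads, state, rng, lr=0.1, clip_norm=1.0, noise_multiplier=1.0):
    """One private step: clip, average, add isotropic Gaussian noise,
    then take an AdaGrad-style per-coordinate step."""
    n = grads.shape[0]
    g = clip_per_sample(grads, clip_norm).mean(axis=0)
    g = g + rng.normal(0.0, noise_multiplier * clip_norm / n, size=w.shape)
    state = state + g ** 2                      # AdaGrad accumulator of squared gradients
    w = w - lr * g / (np.sqrt(state) + 1e-8)    # per-coordinate step size
    return w, state

# Toy usage on a least-squares objective with random data (hypothetical setup).
rng = np.random.default_rng(0)
X, y = rng.normal(size=(256, 10)), rng.normal(size=256)
w, state = np.zeros(10), np.zeros(10)
for _ in range(100):
    per_sample_grads = (X @ w - y)[:, None] * X   # gradient of 0.5 * (x_i^T w - y_i)^2
    w, state = dp_adagrad_step(w, per_sample_grads, state, rng)
```

The isotropic noise and uniform clipping threshold above are exactly the ingredients the abstract identifies as problematic in high dimensions; the paper's non-isotropic clipping and noise addition would replace them with geometry-aware, coordinate-dependent versions.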