2019-06-25-kamath19a.md

---
abstract: We present novel, computationally efficient, and differentially private algorithms for two fundamental high-dimensional learning problems: learning a multivariate Gaussian and learning a product distribution over the Boolean hypercube in total variation distance. The sample complexity of our algorithms nearly matches the sample complexity of the optimal non-private learners for these tasks in a wide range of parameters, showing that privacy comes essentially for free for these problems. In particular, in contrast to previous approaches, our algorithm for learning Gaussians does not require strong a priori bounds on the range of the parameters. Our algorithms introduce a novel technical approach to reducing the sensitivity of the estimation procedure that we call recursive private preconditioning.
section: contributed
title: Privately Learning High-Dimensional Distributions
layout: inproceedings
series: Proceedings of Machine Learning Research
id: kamath19a
month: 0
tex_title: Privately Learning High-Dimensional Distributions
firstpage: 1853
lastpage: 1902
page: 1853-1902
order: 1853
cycles: false
bibtex_author: Kamath, Gautam and Li, Jerry and Singhal, Vikrant and Ullman, Jonathan
author:
- given: Gautam
  family: Kamath
- given: Jerry
  family: Li
- given: Vikrant
  family: Singhal
- given: Jonathan
  family: Ullman
date: 2019-06-25
address:
publisher: PMLR
container-title: Proceedings of the Thirty-Second Conference on Learning Theory
volume: 99
genre: inproceedings
issued:
  date-parts:
  - 2019
  - 6
  - 25
pdf:
extras:
---