Breaking The Dimension Dependence in Sparse Distribution Estimation under Communication Constraints
We consider the problem of estimating a $d$-dimensional $s$-sparse discrete distribution from its samples observed under a $b$-bit communication constraint. The best-known previous result on the $\ell_2$ estimation error for this problem is $O\left( \frac{s\log\left( {d}/{s}\right)}{n2^b}\right)$. Surprisingly, we show that when the sample size $n$ exceeds a minimum threshold $n^*(s, d, b)$, we can achieve an $\ell_2$ estimation error of $O\left( \frac{s}{n2^b}\right)$. This implies that when $n>n^*(s, d, b)$, the convergence rate does not depend on the ambient dimension $d$ and is the same as if the support of the distribution were known beforehand.
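As a quick numeric sanity check on the gap between the two rates, here is a minimal Python sketch evaluating both bounds at an illustrative parameter setting. The function names `classical_rate` and `dimension_free_rate` are ours, not the paper's, and the constants hidden in the $O(\cdot)$ notation are ignored.

```python
import math

def classical_rate(s, d, b, n):
    """Best-known previous l2 rate: O(s * log(d/s) / (n * 2^b))."""
    return s * math.log(d / s) / (n * 2**b)

def dimension_free_rate(s, b, n):
    """Rate achieved once n exceeds n*(s, d, b): O(s / (n * 2^b))."""
    return s / (n * 2**b)

s, d, b, n = 100, 10**6, 1, 10**7
print(classical_rate(s, d, b, n))    # ~4.6e-5
print(dimension_free_rate(s, b, n))  # ~5.0e-6
# The gap is exactly the log(d/s) factor, here log(10^4) ~ 9.21.
```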
We next ask the question: ``what is the minimum $n^*(s, d, b)$ that allows dimension-free convergence?''. To upper bound $n^*(s, d, b)$, we develop novel localization schemes to accurately and efficiently localize the unknown support. For the non-interactive setting, we show that $n^*(s, d, b) = O\left( \min \left( {d^2\log^2 d}/{2^b}, {s^4\log^2 d}/{2^b}\right) \right)$. Moreover, we connect the problem with non-adaptive group testing and obtain a polynomial-time estimation scheme when $n = \tilde{\Omega}\left({s^4\log^4 d}/{2^b}\right)$. This group-testing-based scheme is adaptive to the sparsity parameter $s$, and hence can be applied without knowing it. For the interactive setting, we propose a novel tree-based estimation scheme and show that the minimum sample size needed to achieve dimension-free convergence can be further reduced to $n^*(s, d, b) = \tilde{O}\left( {s^2\log^2 d}/{2^b} \right)$.
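To give a flavor of the group-testing connection, the following is a toy sketch of support localization using a standard random non-adaptive pooling design with the classical COMP decoder. This illustrates the general technique only, not the paper's actual scheme: it assumes noiseless tests and ignores both the $b$-bit quantization and the sampling noise that the paper handles.

```python
import numpy as np

rng = np.random.default_rng(0)
d, s, num_tests = 200, 5, 150   # O(s log d) tests suffice in theory

support = rng.choice(d, size=s, replace=False)
x = np.zeros(d, dtype=bool)
x[support] = True               # indicator of the unknown support

# Bernoulli pooling design: each test includes each item w.p. 1/s.
A = rng.random((num_tests, d)) < 1.0 / s

# A test is positive iff it pools at least one support element.
y = np.any(A & x, axis=1)

# COMP decoding: rule out every item that appears in a negative test;
# the survivors are the estimated support.
ruled_out = np.any(A[~y], axis=0)
estimated_support = np.flatnonzero(~ruled_out)
print("true:     ", sorted(support.tolist()))
print("estimated:", estimated_support.tolist())
```

With enough tests the survivors coincide with the true support with high probability; roughly speaking, restricting the subsequent estimation to the localized support is what removes the $\log(d/s)$ factor from the rate.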
@inproceedings{chen21a,
  title     = {Breaking The Dimension Dependence in Sparse Distribution Estimation under Communication Constraints},
  author    = {Chen, Wei-Ning and Kairouz, Peter and Ozgur, Ayfer},
  booktitle = {Proceedings of Thirty Fourth Conference on Learning Theory},
  series    = {Proceedings of Machine Learning Research},
  volume    = {134},
  pages     = {1028--1059},
  year      = {2021},
  month     = jul,
  publisher = {PMLR},
  issn      = {2640-3498}
}