2021-07-21-hsu21a.md

---
title: On the Approximation Power of Two-Layer Networks of Random ReLUs
abstract: 'This paper considers the following question: how well can depth-two
  ReLU networks with randomly initialized bottom-level weights represent smooth
  functions? We give near-matching upper- and lower-bounds for $L_2$-approximation
  in terms of the Lipschitz constant, the desired accuracy, and the dimension of
  the problem, as well as similar results in terms of Sobolev norms. Our positive
  results employ tools from harmonic analysis and ridgelet representation theory,
  while our lower-bounds are based on (robust versions of) dimensionality arguments.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: hsu21a
month: 0
tex_title: "On the Approximation Power of Two-Layer Networks of Random ReLUs"
firstpage: 2423
lastpage: 2461
page: 2423-2461
order: 2423
cycles: false
bibtex_author: Hsu, Daniel and Sanford, Clayton H and Servedio, Rocco and Vlatakis-Gkaragkounis,
  Emmanouil Vasileios
author:
- given: Daniel
  family: Hsu
- given: Clayton H
  family: Sanford
- given: Rocco
  family: Servedio
- given: Emmanouil Vasileios
  family: Vlatakis-Gkaragkounis
date: 2021-07-21
address:
container-title: Proceedings of Thirty Fourth Conference on Learning Theory
volume: 134
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 21
pdf:
extras:
---
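
For reference, a BibTeX entry assembled from the fields above would look roughly like the sketch below. Every field value is taken from the front matter; the entry-key layout and field ordering follow common PMLR output conventions and are assumptions, not copied from the rendered proceedings page.

```bibtex
% Sketch of the BibTeX entry implied by this front matter (field
% ordering and formatting are assumed, not taken from PMLR output).
@InProceedings{hsu21a,
  title     = {On the Approximation Power of Two-Layer Networks of Random ReLUs},
  author    = {Hsu, Daniel and Sanford, Clayton H and Servedio, Rocco and Vlatakis-Gkaragkounis, Emmanouil Vasileios},
  booktitle = {Proceedings of Thirty Fourth Conference on Learning Theory},
  pages     = {2423--2461},
  year      = {2021},
  volume    = {134},
  series    = {Proceedings of Machine Learning Research},
  publisher = {PMLR}
}
```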