2022-06-28-blanchard22b.md

File metadata and controls

47 lines (47 loc) · 1.97 KB
---
title: "Universal Online Learning: an Optimistically Universal Learning Rule"
abstract: >
  We study the subject of universal online learning with non-i.i.d. processes
  for bounded losses. The notion of universally consistent learning was defined
  by Hanneke in an effort to study learning theory under minimal assumptions,
  where the objective is to obtain low long-run average loss for any target
  function. We are interested in characterizing processes for which learning is
  possible and whether there exist learning rules guaranteed to be universally
  consistent given the only assumption that such learning is possible. The case
  of unbounded losses is very restrictive, since the learnable processes almost
  surely have to visit a finite number of points and, as a result, simple
  memorization is optimistically universal. We focus on the bounded setting and
  give a complete characterization of the processes admitting strong and weak
  universal learning. We further show that the k-nearest neighbor algorithm
  (kNN) is not optimistically universal and present a novel variant of 1NN
  which is optimistically universal for general input and value spaces in both
  strong and weak settings. This closes all the COLT 2021 open problems posed
  on universal online learning.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: blanchard22b
month: 0
tex_title: "Universal Online Learning: an Optimistically Universal Learning Rule"
firstpage: 1077
lastpage: 1125
page: 1077-1125
order: 1077
cycles: false
bibtex_author: Blanchard, Moise
author:
  - given: Moise
    family: Blanchard
date: 2022-06-28
address:
container-title: Proceedings of Thirty Fifth Conference on Learning Theory
volume: '178'
genre: inproceedings
issued:
  date-parts:
    - 2022
    - 6
    - 28
pdf:
extras: []
---