---
title: Consistent Hierarchical Classification with A Generalized Metric
abstract: 'In multi-class hierarchical classification, a natural evaluation metric
  is the tree distance loss, which takes the value of two labels’ distance on the
  pre-defined tree hierarchy. This metric is motivated by the fact that its Bayes-optimal
  solution is the deepest label on the tree whose induced superclass (the subtree
  rooted at it) includes the true label with probability at least $\frac{1}{2}$.
  However, it can hardly accommodate the risk sensitivity of different tasks, since
  its accuracy requirement for induced superclasses is fixed at $\frac{1}{2}$. In
  this paper, we first introduce a new evaluation metric that generalizes the tree
  distance loss, whose solution’s accuracy constraint $\frac{1+c}{2}$ can be controlled
  by a penalty value $c$ tailored to different tasks: a higher $c$ places more emphasis
  on prediction accuracy, while a lower one emphasizes specificity. Then, we propose
  a novel class of consistent surrogate losses based on an intuitive presentation
  of our generalized metric and its regret, which is compatible with various binary
  losses. Finally, we theoretically derive regret transfer bounds for our proposed
  surrogates and empirically validate their usefulness on benchmark datasets.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: cao24a
month: 0
tex_title: Consistent Hierarchical Classification with A Generalized Metric
firstpage: 4825
lastpage: 4833
page: 4825-4833
order: 4825
cycles: false
bibtex_author: Cao, Yuzhou and Feng, Lei and An, Bo
author:
- given: Yuzhou
  family: Cao
- given: Lei
  family: Feng
- given: Bo
  family: An
date: 2024-04-18
address:
container-title: Proceedings of The 27th International Conference on Artificial Intelligence
  and Statistics
volume: '238'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 4
  - 18
pdf:
extras: []
---
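The decision rule described in the abstract — predict the deepest label on the hierarchy whose induced superclass (subtree) contains the true label with probability at least $\frac{1+c}{2}$ — can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the tree encoding (`children` dict), the function names, and the depth-first tie-breaking are all assumptions.

```python
def subtree_leaves(children, node):
    """Collect the leaf labels in the subtree rooted at `node`.

    `children` maps each internal node to its list of children;
    nodes absent from the map (or mapped to an empty list) are leaves.
    """
    if not children.get(node):
        return [node]
    leaves = []
    for child in children[node]:
        leaves.extend(subtree_leaves(children, child))
    return leaves


def generalized_optimal_label(children, root, posterior, c=0.0):
    """Return the deepest node whose induced superclass contains the
    true label with probability at least (1 + c) / 2, per the
    generalized metric in the abstract.

    `posterior` maps leaf labels to their class probabilities.
    With c = 0 this recovers the tree distance loss's Bayes-optimal
    rule (threshold 1/2); a higher c forces coarser, safer predictions.
    """
    threshold = (1.0 + c) / 2.0
    best, best_depth = root, 0  # the root always qualifies (mass 1)
    stack = [(root, 0)]
    while stack:
        node, depth = stack.pop()
        mass = sum(posterior.get(leaf, 0.0)
                   for leaf in subtree_leaves(children, node))
        if mass >= threshold and depth > best_depth:
            best, best_depth = node, depth
        for child in children.get(node, []):
            stack.append((child, depth + 1))
    return best
```

For example, with a hierarchy `root -> {A, b}`, `A -> {a1, a2}` and posterior `{a1: 0.4, a2: 0.3, b: 0.3}`, setting `c=0.0` predicts the superclass `A` (mass 0.7 clears the 0.5 threshold, while no leaf does), whereas the stricter `c=0.6` (threshold 0.8) retreats to `root` — illustrating the accuracy/specificity trade-off the penalty controls.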