| field | value |
|---|---|
| title | HPOD: Hyperparameter Optimization for Unsupervised Outlier Detection |
| software | |
| openreview | pypP5uaHxg |
| abstract | Given an unsupervised outlier detection (OD) algorithm, how can we optimize its hyperparameter(s) (HP) on a new dataset, without using any labels? In this work, we address this challenging problem of hyperparameter optimization for unsupervised OD, and propose the first continuous HP search method, called HPOD. It capitalizes on the prior performance of a large collection of HPs on existing OD benchmark datasets, and transfers this information to enable HP evaluation on a new dataset without labels. HPOD also adapts a prominent, originally supervised, sampling paradigm to efficiently identify promising HPs in iterations. Extensive experiments show that HPOD works with both deep (e.g., Robust AutoEncoder (RAE)) and shallow (e.g., Local Outlier Factor (LOF) and Isolation Forest (iForest)) algorithms, on both discrete and continuous HP spaces. HPOD outperforms a wide range of diverse baselines, with 37% improvement on average over the minimal-loss HPs of RAE, and 58% and 66% improvement on average over the default HPs of LOF and iForest, respectively. |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | zhao24a |
| month | 0 |
| tex_title | HPOD: Hyperparameter Optimization for Unsupervised Outlier Detection |
| firstpage | 2/1 |
| lastpage | 24 |
| page | 2/1-24 |
| order | 2 |
| cycles | false |
| bibtex_author | Zhao, Yue and Akoglu, Leman |
| author | |
| date | 2024-10-09 |
| address | |
| container-title | Proceedings of the Third International Conference on Automated Machine Learning |
| volume | 256 |
| genre | inproceedings |
| issued | |
| extras | |
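The abstract describes two ingredients: transferring HP performance measured on labeled benchmark datasets to score HPs on a new, unlabeled dataset, and iteratively sampling promising HPs. The sketch below is a minimal, hypothetical illustration of that transfer idea only, not HPOD's actual method: the surrogate model, meta-features, HP grid, and synthetic performance data are all illustrative stand-ins.

```python
# Hypothetical sketch of transfer-based HP evaluation for unsupervised OD.
# A surrogate trained on (meta-features, HP) -> performance pairs from labeled
# benchmark datasets scores candidate HPs on a new dataset without labels.
# All names and the surrogate choice are illustrative, not HPOD's design.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# --- Meta-training: benchmark datasets with known per-HP OD performance ---
n_benchmarks, n_hps = 20, 50
meta_features = rng.normal(size=(n_benchmarks, 5))  # per-dataset descriptors
hp_grid = rng.uniform(0, 1, size=(n_hps, 2))        # candidate HP configs (scaled)

# Synthetic stand-in for performance (e.g., AUC) of each HP on each benchmark,
# measured using the benchmarks' ground-truth labels.
perf = rng.uniform(0.5, 1.0, size=(n_benchmarks, n_hps))

# One training row per (dataset, HP) pair: [meta-features, HP] -> performance.
X_train = np.array([np.concatenate([m, h])
                    for m in meta_features for h in hp_grid])
y_train = perf.ravel()

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(X_train, y_train)

# --- New dataset: only meta-features are available, no labels ---
new_meta = rng.normal(size=5)
X_new = np.array([np.concatenate([new_meta, h]) for h in hp_grid])

# Score every candidate HP with the surrogate and pick the best predicted one
# (a greedy stand-in for the iterative sampling the abstract mentions).
scores = surrogate.predict(X_new)
best_hp = hp_grid[int(np.argmax(scores))]
print("selected HP configuration:", best_hp)
```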