---
title: 'HPOD: Hyperparameter Optimization for Unsupervised Outlier Detection'
software: 
openreview: pypP5uaHxg
abstract: 'Given an unsupervised outlier detection (OD) algorithm, how can we optimize its hyperparameter(s) (HP) on a new dataset, without using any labels? In this work, we address this challenging hyperparameter optimization for unsupervised OD problem, and propose the first continuous HP search method called HPOD. It capitalizes on the prior performance of a large collection of HPs on existing OD benchmark datasets, and transfers this information to enable HP evaluation on a new dataset without labels. Also, HPOD adapts a prominent, (originally) supervised, sampling paradigm to efficiently identify promising HPs in iterations. Extensive experiments show that HPOD works for both deep (e.g., Robust AutoEncoder (RAE)) and shallow (e.g., Local Outlier Factor (LOF) and Isolation Forest (iForest)) algorithms on discrete and continuous HP spaces. HPOD outperforms a wide range of diverse baselines with 37% improvement on average over the minimal loss HPs of RAE, and 58% and 66% improvement on average over the default HPs of LOF and iForest.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: zhao24a
month: 0
tex_title: 'HPOD: Hyperparameter Optimization for Unsupervised Outlier Detection'
firstpage: 2/1
lastpage: 24
page: 2/1-24
order: 2
cycles: false
bibtex_author: Zhao, Yue and Akoglu, Leman
author:
- given: Yue
  family: Zhao
- given: Leman
  family: Akoglu
date: 2024-10-09
address: 
container-title: Proceedings of the Third International Conference on Automated Machine Learning
volume: '256'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 10
  - 9
pdf: 
extras: 
---