2024-11-25-lin24a.md

---
title: 'A LUPI distillation-based approach: Application to predicting Proximal Junctional Kyphosis'
abstract: We propose a learning algorithm called XGBoost+, a modified version of the extreme gradient boosting algorithm (XGBoost). The new algorithm utilizes privileged information (PI), data collected after inference time. XGBoost+ incorporates PI into a distillation framework for XGBoost. We also evaluate our proposed method on a real-world clinical dataset about Proximal Junctional Kyphosis (PJK). Our approach outperforms vanilla XGBoost, SVM, and SVM+ on various datasets. Our approach showcases the advantage of using privileged information to improve the performance of machine learning models in healthcare, where data after inference time can be leveraged to build better models.
openreview: LvEdt6YqbT
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: lin24a
month: 0
tex_title: 'A {LUPI} distillation-based approach: Application to predicting Proximal Junctional Kyphosis'
cycles: false
bibtex_author: Lin, Yun Chao and Clark-Sevilla, Andrea and Ravindranath, Rohith and Hassan, Fthimnir and Reyes, Justin and Lombardi, Joseph and Lenke, Lawrence G. and Salleb-Aouissi, Ansaf
author:
- given: Yun Chao
  family: Lin
- given: Andrea
  family: Clark-Sevilla
- given: Rohith
  family: Ravindranath
- given: Fthimnir
  family: Hassan
- given: Justin
  family: Reyes
- given: Joseph
  family: Lombardi
- given: Lawrence G.
  family: Lenke
- given: Ansaf
  family: Salleb-Aouissi
date: 2024-11-25
address:
container-title: Proceedings of the 9th Machine Learning for Healthcare Conference
volume: '252'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 11
  - 25
pdf:
extras:
---