2024-07-24-xu24a.md

File metadata and controls

62 lines (62 loc) · 2.2 KB
---
title: 'From Basic to Extra Features: Hypergraph Transformer Pretrain-then-Finetuning for Balanced Clinical Predictions on EHR'
abstract: 'Electronic Health Records (EHRs) contain rich patient information and are crucial for clinical research and practice. In recent years, deep learning models have been applied to EHRs, but they often rely on massive features, which may not be readily available for all patients. We propose \ours{}\footnote{Short for \textbf{H}ypergraph \textbf{T}ransformer \textbf{P}retrain-then-Finetuning with \textbf{S}moo\textbf{t}hness-induced regularization \textbf{a}nd \textbf{R}eweighting.}, which leverages hypergraph structures with a pretrain-then-finetune framework for modeling EHR data, enabling seamless integration of additional features. Additionally, we design two techniques, namely (1) \emph{Smoothness-inducing Regularization} and (2) \emph{Group-balanced Reweighting}, to enhance the model’s robustness during finetuning. Through experiments conducted on two real EHR datasets, we demonstrate that \ours{} consistently outperforms va'
year: '2024'
volume: '248'
publisher: PMLR
series: Proceedings of Machine Learning Research
layout: inproceedings
issn: 2640-3498
id: xu24a
month: 0
tex_title: 'From Basic to Extra Features: Hypergraph Transformer Pretrain-then-Finetuning for Balanced Clinical Predictions on EHR'
firstpage: 182
lastpage: 197
page: 182-197
order: 182
cycles: false
bibtex_author: Xu, Ran and Lu, Yiwen and Liu, Chang and Chen, Yong and Sun, Yan and Hu, Xiao and Ho, Joyce C and Yang, Carl
author:
- given: Ran
  family: Xu
- given: Yiwen
  family: Lu
- given: Chang
  family: Liu
- given: Yong
  family: Chen
- given: Yan
  family: Sun
- given: Xiao
  family: Hu
- given: Joyce C
  family: Ho
- given: Carl
  family: Yang
date: 2024-07-24
container-title: Proceedings of the fifth Conference on Health, Inference, and Learning
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 7
  - 24
---