| Field | Value |
|---|---|
| title | Mitigating Covariate Shift in Misspecified Regression with Applications to Reinforcement Learning |
| section | Original Papers |
| abstract | A pervasive phenomenon in machine learning applications is \emph{distribution shift}, where training and deployment conditions for a machine learning model differ. As distribution shift typically results in a degradation in performance, much attention has been devoted to algorithmic interventions that mitigate these detrimental effects. This paper studies the effect of distribution shift in the presence of model misspecification, specifically focusing on |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | amortila24a |
| month | 0 |
| tex_title | Mitigating Covariate Shift in Misspecified Regression with Applications to Reinforcement Learning |
| firstpage | 130 |
| lastpage | 160 |
| page | 130-160 |
| order | 130 |
| cycles | false |
| bibtex_author | Amortila, Philip and Cao, Tongyi and Krishnamurthy, Akshay |
| author | |
| date | 2024-06-30 |
| address | |
| container-title | Proceedings of Thirty Seventh Conference on Learning Theory |
| volume | 247 |
| genre | inproceedings |
| issued | |
| extras | |