---
title: Permutation Weighting
abstract: A commonly applied approach for estimating causal effects from observational data is to apply weights which render treatments independent of observed pre-treatment covariates. Recently emphasis has been placed on deriving balancing weights which explicitly target this independence condition. In this work we introduce permutation weighting, a method for estimating balancing weights using a standard binary classifier (regardless of cardinality of treatment). A large class of probabilistic classifiers may be used in this method; the choice of loss for the classifier implies the particular definition of balance. We bound bias and variance in terms of the excess risk of the classifier, show that these disappear asymptotically, and demonstrate that our classification problem directly minimizes imbalance. Additionally, hyper-parameter tuning and model selection can be performed with standard cross-validation methods. Empirical evaluations indicate that permutation weighting provides favorable performance in comparison to existing methods.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: arbour21a
month: 0
tex_title: Permutation Weighting
firstpage: 331
lastpage: 341
page: 331-341
order: 331
cycles: false
bibtex_author: Arbour, David and Dimmery, Drew and Sondhi, Arjun
author:
- given: David
  family: Arbour
- given: Drew
  family: Dimmery
- given: Arjun
  family: Sondhi
date: 2021-07-01
address:
container-title: Proceedings of the 38th International Conference on Machine Learning
volume: 139
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 1
pdf:
extras:
---
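
The abstract describes estimating balancing weights by training a binary classifier to distinguish observed (treatment, covariate) pairs from pairs in which the treatments have been randomly permuted against the covariates, with the classifier's odds serving as the weights. Below is a minimal illustrative sketch of that idea, assuming NumPy and scikit-learn; the helper name `permutation_weights`, the seed handling, and the choice of logistic regression are assumptions for illustration, not code from the paper or this repository.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def permutation_weights(a, X, seed=0):
    """Illustrative sketch (hypothetical helper, not from the paper's code).

    a : array of shape (n,) with treatment assignments
    X : array of shape (n, d) with pre-treatment covariates
    Returns weights approximating p(a) p(x) / p(a, x) for the observed sample.
    """
    rng = np.random.default_rng(seed)
    n = len(a)

    # Label observed (treatment, covariate) pairs 1 and pairs with the
    # treatments permuted against the covariates 0; the permuted pairs
    # approximate draws from the product of the marginal distributions.
    a_perm = rng.permutation(a)
    Z_obs = np.column_stack([a, X])
    Z_perm = np.column_stack([a_perm, X])
    Z = np.vstack([Z_obs, Z_perm])
    y = np.concatenate([np.ones(n), np.zeros(n)])

    # Any probabilistic binary classifier could stand in here; per the
    # abstract, the classifier's loss determines the balance condition.
    clf = LogisticRegression(max_iter=1000).fit(Z, y)
    eta = clf.predict_proba(Z_obs)[:, 1]  # P(pair is "observed")

    # The odds of "permuted" vs. "observed" recover the density ratio
    # used as a balancing weight.
    return (1.0 - eta) / eta
```

As the abstract notes, hyper-parameter tuning and model selection for the classifier in such a sketch can be carried out with ordinary cross-validation on the binary classification task.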