This repository provides code for the method presented in our DAGM GCPR 2023 paper "Beyond Debiasing: Actively Steering Feature Selection via Loss Regularization". To get started, take a look at our example network and the corresponding Jupyter notebook.
If you are only interested in the implementation of the feature steering part of the loss, you can find it in `feat_steering_loss(...)` of `regression_network.py`.
Our method generalizes from debiasing to the encouragement and discouragement of arbitrary features. That is, it not only aims at removing the influence of undesired features / biases but also at increasing the influence of features that are known to be well-established from domain knowledge.
If you use our method, please cite:
@inproceedings{Blunk23:FS,
author = {Jan Blunk and Niklas Penzel and Paul Bodesheim and Joachim Denzler},
booktitle = {DAGM German Conference on Pattern Recognition (DAGM-GCPR)},
title = {Beyond Debiasing: Actively Steering Feature Selection via Loss Regularization},
year = {2023},
}
Installation with pip, Python, and PyTorch 2.0+:

```shell
git clone https://git.inf-cv.uni-jena.de/blunk/beyond-debiasing.git
cd beyond-debiasing
pip install -r requirements.txt
```
Before running these commands, create an environment with pip and Python (Anaconda environment / Python virtual environment). We recommend installing PyTorch with CUDA support. Afterwards, you can install all remaining packages via pip as described above.
Since our method relies on loss regularization, it is very simple to add to your own networks: you only need to modify your loss function. To help with that, we provide an example network and a Jupyter notebook with example code.
You can find the implementation of the feature steering part of the loss in `feat_steering_loss(...)` of `regression_network.py`, which is where all the magic of our method takes place.
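To illustrate the general idea of modifying the loss, here is a minimal sketch of a regularized objective. All names here are ours, not the repository's API, and plain input gradients stand in for the attribution methods used in the paper (contextual decomposition, CMI):

```python
import torch

def steering_penalty(attributions, discouraged, encouraged, weight):
    # Hypothetical regularizer: penalize the model's reliance on
    # discouraged features and reward reliance on encouraged ones.
    dis = attributions[:, discouraged].abs().mean()
    enc = attributions[:, encouraged].abs().mean()
    return weight * (dis - enc)

# Toy setup: input gradients as a stand-in feature attribution
model = torch.nn.Sequential(
    torch.nn.Linear(4, 8), torch.nn.ReLU(), torch.nn.Linear(8, 1)
)
x = torch.randn(16, 4, requires_grad=True)
y = torch.randn(16, 1)

pred = model(x)
task_loss = torch.nn.functional.mse_loss(pred, y)
# create_graph=True so the penalty itself is differentiable
attr = torch.autograd.grad(pred.sum(), x, create_graph=True)[0]
loss = task_loss + steering_penalty(attr, discouraged=[3], encouraged=[0], weight=0.1)
loss.backward()  # gradients now reflect both objectives
```

The point is only that the steering term is added to the task loss; the exact attribution and regularization used by our method are implemented in `feat_steering_loss(...)`.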
- Installation:
  - `requirements.txt`: List of required packages for installation with pip
- Feature attribution:
  - `contextual_decomposition.py`: Wrapper for contextual decomposition
  - `mixed_cmi_estimator.py`: Python port of the CMIh estimator of the conditional mutual information
- Redundant regression dataset:
  - `algebra.py`: Generation of random orthogonal matrices
  - `make_regression.py`: An adapted version of scikit-learn's `make_regression(...)`, where the coefficients are standard-uniform
  - `regression_dataset.py`: Generation of the redundant regression dataset
  - `dataset_utils.py`: Creation of torch datasets from numpy arrays
  - `tensor_utils.py`: Some helpful functions for dealing with tensors
- Example:
  - `feature_steering_example.ipynb`: Example for generating the dataset and creating and training the network, with detailed comments
  - `regression_network.py`: Neural network (PyTorch) used in the example notebook
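As a sketch of how numpy data such as the generated regression dataset can be wrapped into a torch dataset, here is one common approach (the array names are placeholders; `dataset_utils.py` may differ in details):

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset, DataLoader

# Toy stand-in for the generated redundant regression data
features = np.random.rand(100, 5).astype(np.float32)
targets = np.random.rand(100, 1).astype(np.float32)

# Wrap the numpy arrays into a torch dataset and iterate in batches
dataset = TensorDataset(torch.from_numpy(features), torch.from_numpy(targets))
loader = DataLoader(dataset, batch_size=16, shuffle=True)
xb, yb = next(iter(loader))
```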
With `mixed_cmi_estimator.py`, this repository includes a Python implementation of the hybrid CMI estimator CMIh presented by Zan et al. The authors' original R implementation can be found here.
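For intuition on what such an estimator computes, here is a minimal plug-in estimate of the conditional mutual information I(X; Y | Z) for purely discrete samples. This is a toy version written for illustration only; CMIh is considerably more involved because it handles mixed discrete-continuous data:

```python
import numpy as np

def conditional_mutual_information(x, y, z):
    """Plug-in estimate of I(X; Y | Z) in nats for discrete samples.

    Illustrative toy only: unlike CMIh, it cannot handle
    continuous or mixed discrete-continuous variables.
    """
    x, y, z = map(np.asarray, (x, y, z))
    cmi = 0.0
    for zv in np.unique(z):
        mask = z == zv
        pz = mask.mean()
        xs, ys = x[mask], y[mask]
        for xv in np.unique(xs):
            for yv in np.unique(ys):
                pxy = np.mean((xs == xv) & (ys == yv))  # P(x, y | z)
                px = np.mean(xs == xv)                  # P(x | z)
                py = np.mean(ys == yv)                  # P(y | z)
                if pxy > 0:
                    cmi += pz * pxy * np.log(pxy / (px * py))
    return cmi
```

For example, if Y is a copy of X, the estimate equals the conditional entropy of X; if Y is constant, it is zero.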
This repository is released under the CC BY 4.0 license, which permits both academic and commercial use. If you need any support, please open an issue or contact Jan Blunk.