Code for our method for actively steering the features learned by a neural network presented in our DAGM GCPR 2023 paper "Beyond Debiasing: Actively Steering Feature Selection via Loss Regularization".


Beyond Debiasing: Actively Steering Feature Selection via Loss Regularization

Overview

This repository provides code to use the method presented in our DAGM GCPR 2023 paper "Beyond Debiasing: Actively Steering Feature Selection via Loss Regularization". If you want to get started, take a look at our example network and the corresponding Jupyter notebook.

If you are only interested in the implementation of the feature steering part of the loss, you can find it in feat_steering_loss(...) of regression_network.py.

By measuring the feature usage, we can steer the model towards (not) using features that are specifically (un-)desired.

Our method generalizes from debiasing to the encouragement and discouragement of arbitrary features. That is, it not only aims at removing the influence of undesired features / biases but also at increasing the influence of features that are known to be well-established from domain knowledge.

If you use our method, please cite:

@inproceedings{Blunk23:FS,
author = {Jan Blunk and Niklas Penzel and Paul Bodesheim and Joachim Denzler},
booktitle = {DAGM German Conference on Pattern Recognition (DAGM-GCPR)},
title = {Beyond Debiasing: Actively Steering Feature Selection via Loss Regularization},
year = {2023},
}

Installation

Requires Python and PyTorch 2.0+; install the dependencies with pip:

git clone https://git.inf-cv.uni-jena.de/blunk/beyond-debiasing.git
cd beyond-debiasing
pip install -r requirements.txt

First, create a Python environment (an Anaconda environment or a Python virtual environment). We recommend installing PyTorch with CUDA support. Afterwards, you can install all remaining packages via pip as described above.
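As a minimal sketch of that setup using Python's built-in venv module (an Anaconda environment works just as well; see pytorch.org for the pip command matching your CUDA version):

```shell
# Create and activate a fresh virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install PyTorch first (pick the CUDA variant from pytorch.org),
# then the remaining requirements
pip install torch
pip install -r requirements.txt
```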

Usage in Python

Since our method relies on loss regularization, it is simple to add to your own networks: you only need to modify your loss function. To help with that, we provide an exemplary network and a Jupyter notebook with example code.

You can find the implementation of the feature steering part of the loss in feat_steering_loss(...) of regression_network.py, which is where all the magic of our method takes place.
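To illustrate the general idea of steering via loss regularization (not the exact formulation used in feat_steering_loss(...); all names and the attribution measure here are hypothetical), a regularized loss can reward attribution mass on encouraged features and penalize it on discouraged ones:

```python
import torch

def steering_regularized_loss(task_loss, attributions,
                              encourage_idx, discourage_idx, lam=0.1):
    """Hypothetical sketch: add a steering term to the task loss.

    attributions: 1-D tensor of non-negative per-feature usage scores.
    The regularizer grows with usage of discouraged features and shrinks
    with usage of encouraged ones, so minimizing it steers the model.
    """
    reg = attributions[discourage_idx].sum() - attributions[encourage_idx].sum()
    return task_loss + lam * reg

# Toy usage: 4 input features, encourage feature 0, discourage feature 3.
task_loss = torch.tensor(1.0)
attributions = torch.tensor([0.1, 0.2, 0.3, 0.4])
total = steering_regularized_loss(task_loss, attributions, [0], [3], lam=0.5)
# total = 1.0 + 0.5 * (0.4 - 0.1) = 1.15
```

Because the steering term is just added to the task loss, it drops into any existing training loop; the method in this repository differs in how feature usage is measured (see the CMI estimator below).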

Repository

With mixed_cmi_estimator.py, this repository includes a Python implementation of CMIh, the hybrid conditional mutual information (CMI) estimator presented by Zan et al. The authors' original R implementation can be found here.

License and Support

This repository is released under the CC BY 4.0 license, which permits both academic and commercial use. If you need support, please open an issue or contact Jan Blunk.
