TorchUncertainty is a package designed to help you leverage uncertainty quantification techniques and make your deep neural networks more reliable. It aims to be collaborative and to include as many methods as possible, so reach out to add yours!
🚧 TorchUncertainty is in early development 🚧 - expect changes, but reach out and contribute if you are interested in the project! Please raise an issue if you encounter any bugs or difficulties.
This package provides a multi-level API, including:
- ready-to-train baselines on research datasets, such as ImageNet and CIFAR
- deep learning baselines available for training on your datasets
- pretrained weights for these baselines on ImageNet and CIFAR (work in progress 🚧)
- layers available for use in your networks
- scikit-learn style post-processing methods such as Temperature Scaling
See the Reference page or the API reference for a more exhaustive list of the implemented methods, datasets, metrics, etc.
The package can be installed from PyPI:

```sh
pip install torch-uncertainty
```
Then, install the desired PyTorch version in your environment.
If you aim to contribute (thank you!), have a look at the contribution page.
Please find the documentation at torch-uncertainty.github.io.
A quickstart is available at torch-uncertainty.github.io/quickstart.
To date, the following deep learning baselines have been implemented:
- Deep Ensembles (a concept sketch follows this list)
- MC-Dropout
- BatchEnsemble
- Masksembles
- MIMO
- Packed-Ensembles (see blog post)
- Bayesian Neural Networks 🚧 Work in progress 🚧
- Deep Evidential Regression
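For intuition, here is a minimal plain-PyTorch sketch of the Deep Ensembles idea: average the softmax outputs of several independently trained networks and use their disagreement as the uncertainty signal. This illustrates the concept only and is not the TorchUncertainty API; `deep_ensemble_predict` is a hypothetical helper.

```python
import torch
from torch import nn

def deep_ensemble_predict(models: list[nn.Module], x: torch.Tensor) -> torch.Tensor:
    # Deep Ensembles: average the class probabilities of several
    # independently trained copies of the same architecture. The
    # spread between members' predictions estimates the uncertainty.
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for model in models])
    return probs.mean(dim=0)  # shape: (batch, num_classes)
```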
To date, the following post-processing methods have been implemented:
- Temperature, Vector, & Matrix scaling (temperature scaling is sketched below)
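Temperature scaling, for instance, learns a single scalar T on a held-out validation set and divides the test-time logits by it before the softmax. The snippet below is a minimal plain-PyTorch sketch of the fitting step, for illustration only (the package's scikit-learn style classes are the supported interface); `fit_temperature` is a hypothetical helper.

```python
import torch
from torch import nn, optim

def fit_temperature(logits: torch.Tensor, labels: torch.Tensor) -> float:
    # Fit a scalar temperature T on validation logits by minimizing
    # the negative log-likelihood of softmax(logits / T).
    log_t = torch.zeros(1, requires_grad=True)  # optimize log(T) so T > 0
    optimizer = optim.LBFGS([log_t], lr=0.1, max_iter=50)
    nll = nn.CrossEntropyLoss()

    def closure():
        optimizer.zero_grad()
        loss = nll(logits / log_t.exp(), labels)
        loss.backward()
        return loss

    optimizer.step(closure)
    return log_t.exp().item()  # divide test logits by T before the softmax
```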
We provide the following tutorials in our documentation:
- From a Vanilla Classifier to a Packed-Ensemble
- Training a Bayesian Neural Network in 3 minutes
- Improve Top-label Calibration with Temperature Scaling
- Deep Evidential Regression on a Toy Example
- Training a LeNet with Monte-Carlo Dropout (the inference-time idea is sketched after this list)
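As a taste of the Monte-Carlo Dropout tutorial's subject, the sketch below shows the inference-time idea in plain PyTorch: keep dropout layers active at test time and average several stochastic forward passes. It is a concept sketch under that assumption, not the tutorial's code; `mc_dropout_predict` is a hypothetical helper.

```python
import torch
from torch import nn

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, num_samples: int = 20) -> torch.Tensor:
    # MC-Dropout: keep dropout stochastic at inference and average
    # several forward passes; the variance across passes estimates
    # the predictive uncertainty.
    model.eval()
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.train()  # re-enable dropout only
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(num_samples)])
    return probs.mean(dim=0)  # shape: (batch, num_classes)
```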
You may find many papers on modern uncertainty estimation techniques in the Awesome Uncertainty in Deep Learning repository.
This package also contains the official implementation of Packed-Ensembles.
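At its core, Packed-Ensembles packs several ensemble members into a single network using grouped convolutions, so the whole ensemble runs in one forward pass. The toy layer below illustrates that idea only: it omits the paper's α (width) and γ (groups) hyperparameters and is not the package's optimized implementation; `NaivePackedLinear` is a hypothetical name.

```python
import torch
from torch import nn

class NaivePackedLinear(nn.Module):
    # Toy illustration of the Packed-Ensembles trick: M independent
    # linear members packed into one grouped 1x1 convolution, so the
    # whole ensemble is evaluated in a single forward pass.
    def __init__(self, in_features: int, out_features: int, num_estimators: int):
        super().__init__()
        self.num_estimators = num_estimators
        self.conv = nn.Conv1d(
            in_channels=in_features * num_estimators,
            out_channels=out_features * num_estimators,
            kernel_size=1,
            groups=num_estimators,  # one group per ensemble member
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        batch = x.shape[0]
        # Feed a copy of the input to every member: (batch, M * in_features, 1).
        x = x.repeat(1, self.num_estimators).unsqueeze(-1)
        out = self.conv(x).squeeze(-1)  # (batch, M * out_features)
        # One prediction per member: (batch, M, out_features).
        return out.view(batch, self.num_estimators, -1)
```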
If you find the corresponding models interesting, please consider citing our paper:
```bibtex
@inproceedings{laurent2023packed,
    title={Packed-Ensembles for Efficient Uncertainty Estimation},
    author={Laurent, Olivier and Lafage, Adrien and Tartaglione, Enzo and Daniel, Geoffrey and Martinez, Jean-Marc and Bursuc, Andrei and Franchi, Gianni},
    booktitle={ICLR},
    year={2023}
}
```