# GIB: Gated Information Bottleneck for Generalization in Sequential Environments

PyTorch implementation of the Gated Information Bottleneck (GIB) for Generalization in Sequential Environments (https://arxiv.org/abs/2110.06057).

## Abstract

Deep neural networks suffer from poor generalization to unseen environments when the underlying data distribution is different from that in the training set. By learning minimum sufficient representations from training data, the information bottleneck (IB) approach has demonstrated its effectiveness in improving generalization in different AI applications. In this work, we propose a new neural network-based IB approach, termed gated information bottleneck (GIB), that dynamically drops spurious correlations and progressively selects the most task-relevant features across different environments via a trainable soft mask (on raw features). GIB enjoys a simple and tractable objective, without any variational approximation or distributional assumption. We empirically demonstrate the superiority of GIB over other popular neural network-based IB approaches in adversarial robustness and out-of-distribution (OOD) detection. Meanwhile, we also establish the connection between IB theory and invariant causal representation learning, and observe that GIB demonstrates appealing performance when different environments arrive sequentially, a more practical scenario where invariant risk minimization (IRM) fails.
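The gate described above is a trainable soft mask applied to raw input features. Below is a minimal, self-contained PyTorch sketch of that idea; it is not the code in this repository, and the class names, the sigmoid parameterization of the mask, and the simple `lambda_sparsity` penalty (standing in for the paper's information-bottleneck objective) are all illustrative assumptions.

```python
# Illustrative sketch only (not the GIB package): a trainable soft mask on
# raw features, plus a stand-in sparsity penalty in place of the paper's
# information-bottleneck objective.
import torch
import torch.nn as nn


class SoftGate(nn.Module):
    """Element-wise trainable gate that softly masks input features."""

    def __init__(self, num_features: int):
        super().__init__()
        # One trainable logit per raw feature; sigmoid keeps the mask in (0, 1).
        self.logits = nn.Parameter(torch.zeros(num_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        mask = torch.sigmoid(self.logits)  # soft mask in (0, 1)
        return x * mask                    # near-zero entries drop their features


class GatedClassifier(nn.Module):
    """Gate on raw features followed by an ordinary classifier."""

    def __init__(self, num_features: int, num_classes: int):
        super().__init__()
        self.gate = SoftGate(num_features)
        self.net = nn.Sequential(
            nn.Linear(num_features, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(self.gate(x))


def gib_style_loss(model: GatedClassifier, x, y, lambda_sparsity: float = 1e-3):
    """Task loss plus an illustrative penalty pushing the mask toward sparsity."""
    task_loss = nn.functional.cross_entropy(model(x), y)
    mask = torch.sigmoid(model.gate.logits)
    return task_loss + lambda_sparsity * mask.sum()
```

In this sketch, features whose mask entries shrink toward zero are effectively dropped, which mirrors the mechanism the abstract describes for discarding spurious correlations; the actual GIB objective is formulated differently and, as stated above, needs no variational approximation or distributional assumption.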

## Illustration of the Deterministic Gate

Slides of the presentation at ICDM 2021

## Install Instructions

```bash
pip install GIB
```

## Reference

```bibtex
@inproceedings{alesiani2021gated,
  title={Gated Information Bottleneck for Generalization in Sequential Environments},
  author={Alesiani, Francesco and Yu, Shujian and Yu, Xi},
  booktitle={2021 IEEE International Conference on Data Mining (ICDM)},
  pages={1--10},
  year={2021},
  organization={IEEE}
}
```