# DE-CROP: Data-efficient Certified Robustness for Pretrained Classifiers (WACV 2023) - Official Implementation

Paper: https://arxiv.org/pdf/2210.08929.pdf
Project webpage: https://sites.google.com/view/decrop
- Pre-train the denoiser using the $L_{lc}$, $L_{cs}$, and $L_{mmd}$ losses: `pretrain_denoiser_decrop.sh`
- Train the denoiser with the additional $L_{dd}$ loss and a Gradient Reversal Layer, initialized from the pre-trained weights of step 1: `train_denoiser_decrop.sh`
- Certify the denoiser: `certify_denoiser_decrop.sh`
- Visualize results: `python code/graph_decrop.py`
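For reference, the Gradient Reversal Layer used in step 2 acts as the identity on the forward pass and negates (and optionally scales) the gradient on the backward pass. Below is a minimal, framework-free sketch of that idea; the class name, the `lam` scaling factor, and the list-based interface are illustrative assumptions, not the repo's actual implementation (which would typically subclass `torch.autograd.Function`).

```python
class GradientReversal:
    """Hypothetical sketch of a Gradient Reversal Layer (GRL).

    Forward: identity. Backward: incoming gradients are multiplied by
    -lam, so the layers upstream of the GRL are updated in the direction
    that *maximizes* the downstream (e.g. domain-discriminator) loss.
    """

    def __init__(self, lam: float = 1.0):
        # lam is an assumed scaling hyperparameter for the reversed gradient
        self.lam = lam

    def forward(self, x):
        # Identity on the forward pass: activations are unchanged
        return x

    def backward(self, grad_output):
        # Flip the sign (and scale) of every gradient component
        return [-self.lam * g for g in grad_output]
```

Usage: with `lam=0.5`, a backward gradient of `[0.2, -0.4]` becomes `[-0.1, 0.2]`, while the forward pass leaves inputs untouched.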
If you use this code, please cite our work as:

```
@inproceedings{nayak2023_DECROP,
  title={DE-CROP: Data-efficient Certified Robustness for Pretrained Classifiers},
  author={Nayak, G. K. and Rawal, R. and Chakraborty, A.},
  booktitle={IEEE Winter Conference on Applications of Computer Vision},
  year={2023}
}
```
This repo is adapted from Salman et al. (2020) and uses utility functions from the torchattacks library to generate boundary and interpolated samples.
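The interpolated samples mentioned above are, in spirit, convex combinations of a clean sample and a corresponding boundary sample. The helper below is a hypothetical illustration of that operation on plain Python lists; the function name, the `alpha` parameter, and the list interface are assumptions for illustration, not the repo's API.

```python
def interpolate_sample(clean, boundary, alpha):
    """Hypothetical sketch: convex combination of a clean sample and a
    boundary sample, element-wise.

    alpha=0 returns the clean sample; alpha=1 returns the boundary sample.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    # Element-wise convex combination of the two feature vectors
    return [(1.0 - alpha) * c + alpha * b for c, b in zip(clean, boundary)]
```

For example, interpolating `[0.0, 2.0]` and `[1.0, 0.0]` at `alpha=0.5` yields `[0.5, 1.0]`.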