
DE-CROP: Data-efficient Certified Robustness for Pretrained Classifiers (WACV 2023) - Official Implementation

Paper Link: https://arxiv.org/pdf/2210.08929.pdf

Project Webpage: https://sites.google.com/view/decrop


Method Overview

[Figure: DE-CROP technique overview]


Training & Evaluation

  1. Pre-train the denoiser using the $L_{lc}$, $L_{cs}$, and $L_{mmd}$ losses: pretrain_denoiser_decrop.sh
  2. Train the denoiser with the additional $L_{dd}$ loss and a Gradient Reversal Layer (initialized with the pre-trained weights from step 1): train_denoiser_decrop.sh
  3. Certify the denoiser: certify_denoiser_decrop.sh
  4. Visualize the results: python code/graph_decrop.py
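Step 3 above certifies the denoiser + pretrained classifier pipeline via randomized smoothing (this repo adapts the denoised-smoothing setup of Salman et al.). As a rough sketch of what certification computes, not the repo's actual code: the smoothed classifier takes a majority vote over Gaussian perturbations, and the certified $\ell_2$ radius is $R = \sigma \,\Phi^{-1}(\underline{p_A})$, where $\underline{p_A}$ lower-bounds the top-class probability under noise. The `denoise_then_classify` callable below is a hypothetical stand-in for the denoiser followed by the pretrained classifier.

```python
import numpy as np
from scipy.stats import norm

def certified_radius(sigma, p_a):
    """Certified L2 radius from randomized smoothing:
    R = sigma * Phi^{-1}(p_A), where p_A lower-bounds the probability
    of the top class under Gaussian noise N(0, sigma^2)."""
    if p_a <= 0.5:
        return 0.0  # the smoothed classifier abstains
    return sigma * norm.ppf(p_a)

def smoothed_predict(denoise_then_classify, x, sigma, n=1000, rng=None):
    """Majority vote of the base pipeline over Gaussian perturbations of x.
    `denoise_then_classify` is a hypothetical callable mapping a noisy
    input to a class label (denoiser followed by the frozen classifier)."""
    rng = np.random.default_rng() if rng is None else rng
    votes = {}
    for _ in range(n):
        noisy = x + rng.normal(0.0, sigma, size=np.shape(x))
        label = denoise_then_classify(noisy)
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Example: with sigma = 0.25 and a lower bound p_A = 0.9,
# the certified radius is 0.25 * Phi^{-1}(0.9) ~ 0.32.
print(round(certified_radius(0.25, 0.9), 2))
```

The higher the top-class probability the denoiser preserves under noise, the larger the certified radius, which is why the denoiser-training losses in steps 1–2 matter for certification.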

Citation

If you use this code, please cite our work as:

@inproceedings{nayak2023_DECROP,
    title={DE-CROP: Data-efficient Certified Robustness for Pretrained Classifiers},
    author={Nayak, G. K. and Rawal, R. and Chakraborty, A.},
    booktitle={IEEE Winter Conference on Applications of Computer Vision (WACV)},
    year={2023}
}

Acknowledgements

This repo is adapted from Salman et al. (2020) and uses utility functions from the torchattacks library to generate boundary and interpolated samples.
