DANCE: Universal Domain Adaptation through Self-Supervision

This repository provides code for the paper Universal Domain Adaptation through Self-Supervision (DANCE). Please visit our project page for a quick overview, or read the full paper.

Environment

Python 3.6.9, PyTorch 1.2.0, torchvision 0.4, and NVIDIA Apex. See requirement.txt for the full dependency list. We used the NVIDIA Apex library for memory-efficient, high-speed training.
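
A minimal setup sketch, assuming a fresh conda environment (the environment name below is illustrative; requirement.txt in this repository is the authoritative dependency list, and Apex should be built following the official NVIDIA instructions):

conda create -n dance python=3.6.9
conda activate dance
pip install -r requirement.txt
## Apex is installed from source; see https://github.com/NVIDIA/apex for build options
git clone https://github.com/NVIDIA/apex
cd apex && pip install -v --no-cache-dir ./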

Data Preparation

  • Office Dataset
  • OfficeHome Dataset
  • VisDA

Prepare the datasets in the data directory as follows.

./data/amazon/images/ ## Office
./data/Real/ ## OfficeHome
./data/visda_train/ ## VisDA synthetic images
./data/visda_val/ ## VisDA real images
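
For example, after downloading and extracting the datasets, they can be placed (or symlinked) under data like this; the /path/to/... sources are placeholders for wherever you extracted the archives:

mkdir -p data/amazon
ln -s /path/to/office31/amazon/images data/amazon/images    ## Office
ln -s /path/to/officehome/Real data/Real                     ## OfficeHome
ln -s /path/to/visda/train data/visda_train                  ## VisDA synthetic images
ln -s /path/to/visda/validation data/visda_val               ## VisDA real images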

Prepare the image lists.

unzip txt.zip

The file lists must be located in ./txt.
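
If you need to build lists for your own data, each line in this style of list file typically pairs an image path with an integer class label (this format is an assumption; check the unzipped files in ./txt for the exact convention used here):

./data/amazon/images/back_pack/frame_0001.jpg 0
./data/amazon/images/bike/frame_0001.jpg 1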

Train

All training scripts are stored in the script directory.

Example: Open Set Domain Adaptation on Office.

sh script/run_office_obda.sh $gpu-id configs/office-train-config_ODA.yaml
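
For instance, to run on GPU 0 (the first argument selects the GPU, the second the experiment config):

sh script/run_office_obda.sh 0 configs/office-train-config_ODA.yaml

The other scripts in script/ should follow the same pattern, each paired with its corresponding config in configs/.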

Reference

This repository is contributed by Kuniaki Saito. If you use this code or its derivatives, please consider citing:

@inproceedings{saito2020dance,
  title={Universal Domain Adaptation through Self-Supervision},
  author={Saito, Kuniaki and Kim, Donghyun and Sclaroff, Stan and Saenko, Kate},
  booktitle={NeurIPS},
  year={2020}
}
