
Dynamic Temperature Knowledge Distillation

The paper is available at https://arxiv.org/abs/2404.12711.

The code is built on mdistiller.

Framework & Performance

Different teachers distilled into ResNet8

Different teachers distilled into MobileNetV2

TODO

  • Update the code that records the temperature during training.
  • Update the analysis code.
  • Release other network models (such as ResNetXXX).

Installation

Environments:

  • Python 3.8
  • PyTorch 1.7.0

Install the package:

sudo pip3 install -r requirements.txt
sudo python3 setup.py develop

For more details, please refer to https://github.com/megvii-research/mdistiller.

CIFAR-100
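
Training on CIFAR-100 follows the standard mdistiller workflow. The command below is a minimal sketch: tools/train.py and the --cfg flag come from mdistiller, while the DTKD config path configs/cifar100/dtkd/res32x4_res8x4.yaml is an assumed example and may differ from the actual file names in this repository.

# assumed example: distill a ResNet32x4 teacher into a ResNet8x4 student with DTKD
python3 tools/train.py --cfg configs/cifar100/dtkd/res32x4_res8x4.yaml

For evaluation and the full list of teacher-student pairs, follow the corresponding instructions in the mdistiller repository.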

Acknowledgement

  • Sincere gratitude to the contributors of mdistiller for their distinguished efforts.

Contact

YuKang Wei: [email protected]
