tanyapohn/pytorch-cosine-annealing-with-warmup
Cosine Annealing with Warm up for PyTorch

Example

```python
model = ...
# the lr passed to the optimizer is the *minimum* learning rate for this scheduler
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9, weight_decay=1e-5)
scheduler = CosineAnnealingWarmUpRestarts(optimizer, T_0=250, T_mult=2, eta_max=0.1, T_up=50)
for epoch in range(n_epoch):
    train()
    valid()
    scheduler.step()
```
  • case1 : CosineAnnealingWarmUpRestarts(optimizer, T_0=150, T_mult=1, eta_max=0.1, T_up=10, gamma=0.5) (see plot: example1)
  • case2 : CosineAnnealingWarmUpRestarts(optimizer, T_0=50, T_mult=2, eta_max=0.1, T_up=10, gamma=0.5) (see plot: example2)
  • case3 : CosineAnnealingWarmUpRestarts(optimizer, T_0=100, T_mult=1, eta_max=0.1, T_up=10, gamma=0.5) (see plot: example3)
  • case4 : CosineAnnealingWarmUpRestarts(optimizer, T_0=250, T_mult=1, eta_max=0.1, T_up=50) (see plot: example4)
  • case5 : CosineAnnealingWarmUpRestarts(optimizer, T_0=250, T_mult=2, eta_max=0.1, T_up=50) (see plot: example5)
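With T_mult > 1 each cycle is longer than the previous one, and with gamma < 1 each cycle peaks at a lower maximum lr. The restart boundaries and per-cycle peaks for a configuration like case2 can be computed directly; a small illustrative helper (not part of the repository):

```python
def cycle_boundaries(T_0, T_mult, eta_max, gamma, n_cycles):
    """Return (start_step, cycle_length, peak_lr) for each restart cycle."""
    out, start, length, peak = [], 0, T_0, eta_max
    for _ in range(n_cycles):
        out.append((start, length, peak))
        start += length       # next cycle begins where this one ends
        length *= T_mult      # cycle length grows by T_mult
        peak *= gamma         # peak lr shrinks by gamma
    return out

# case2: T_0=50, T_mult=2, eta_max=0.1, gamma=0.5
for start, length, peak in cycle_boundaries(50, 2, 0.1, 0.5, 3):
    print(f"restart at step {start}: length {length}, peak lr {peak}")
```

For case2 this gives restarts at steps 0, 50, and 150, with peak learning rates 0.1, 0.05, and 0.025.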

Arguments

  • T_0 : number of steps in the first cosine-annealing cycle
  • T_mult : multiplier applied to the cycle length at each restart
  • eta_max : maximum learning rate
  • T_up : number of warmup iterations
  • gamma : decay factor applied to the maximum learning rate each cycle
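Putting the arguments together: within each cycle the lr ramps linearly from the optimizer's base lr up to the cycle's peak over T_up steps, then cosine-anneals back down over the remaining steps; each restart multiplies the cycle length by T_mult and the peak by gamma. A minimal sketch of that schedule as a standalone function (the repository implements it as an lr-scheduler class, so details may differ):

```python
import math

def lr_at(step, base_lr, eta_max, T_0, T_mult=1, T_up=0, gamma=1.0):
    """Learning rate at a global step under the schedule described above (sketch)."""
    T_i, peak, t = T_0, eta_max, step
    # Skip past completed cycles to find the position inside the current one.
    while t >= T_i:
        t -= T_i
        T_i *= T_mult
        peak *= gamma
    if t < T_up:
        # Linear warmup from base_lr up to the current cycle's peak.
        return base_lr + (peak - base_lr) * t / T_up
    # Cosine annealing from the peak back down to base_lr.
    progress = (t - T_up) / (T_i - T_up)
    return base_lr + (peak - base_lr) * (1 + math.cos(math.pi * progress)) / 2
```

For the settings in the example above (base lr 0.001, eta_max=0.1, T_0=250, T_up=50), this starts at 0.001, reaches 0.1 at step 50, and anneals back toward 0.001 by the end of the cycle.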
