Change Log
Feature
- Support `GCSAM` optimizer. (#343, #344)
    - Gradient Centralized Sharpness Aware Minimization
    - you can use it from the `SAM` optimizer by setting `use_gc=True` (see the sketch after this list).
- Support `LookSAM` optimizer. (#343, #344)
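A minimal sketch of the GCSAM usage path described above, assuming `SAM` takes the base optimizer class, forwards `lr` to it, and exposes the usual `first_step()`/`second_step()` interface; the `use_gc=True` flag is the new option from #343/#344.

```python
import torch
from torch import nn
from pytorch_optimizer import SAM

# Toy model and batch, purely for illustration.
model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))

# use_gc=True enables gradient centralization inside SAM (the GCSAM behavior);
# the base_optimizer/lr handling is assumed to follow the reference SAM interface.
optimizer = SAM(model.parameters(), base_optimizer=torch.optim.AdamW, lr=1e-3, use_gc=True)

# Standard two-step SAM update: perturb toward the sharpness direction, then descend.
criterion(model(x), y).backward()
optimizer.first_step(zero_grad=True)

criterion(model(x), y).backward()
optimizer.second_step(zero_grad=True)
```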
Update
- Support alternative precision training for the `Shampoo` optimizer. (#339)
- Add more features to and tune the `Ranger25` optimizer. (#340)
    - `AGC` + `Lookahead` variants
    - change the default beta1 and beta2 to 0.95 and 0.98, respectively (see the example after this list)
- Skip adding the `Lookahead` wrapper for `Ranger*` optimizers in `create_optimizer()`, since they already include it (sketched below). (#340)
- Improved optimizer visualization. (#345)
- Rename `pytorch_optimizer.optimizer.gc` to `pytorch_optimizer.optimizer.gradient_centralization` to avoid a possible conflict with the Python built-in `gc` module (import example below). (#349)
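A sketch of the new `Ranger25` defaults mentioned above; the top-level import and the `betas` keyword are assumptions based on the library's usual torch-style optimizer interface.

```python
from torch import nn
from pytorch_optimizer import Ranger25

model = nn.Linear(10, 2)

# With this release, betas should default to (0.95, 0.98).
optimizer = Ranger25(model.parameters(), lr=1e-3)

# Pass betas explicitly to override the new defaults.
custom = Ranger25(model.parameters(), lr=1e-3, betas=(0.9, 0.999))
```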
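A sketch of the `create_optimizer()` behavior change, assuming it takes the model, an optimizer name, and a `use_lookahead` flag (the exact signature is an assumption):

```python
from torch import nn
from pytorch_optimizer import create_optimizer

model = nn.Linear(10, 2)

# For Ranger-family optimizers, which already include Lookahead internally,
# the Lookahead wrapper is now skipped even when use_lookahead=True,
# so the returned optimizer is not double-wrapped.
optimizer = create_optimizer(model, 'ranger', lr=1e-3, use_lookahead=True)
```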
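The module rename affects the import path; an illustrative import against the new path, where the function name `centralize_gradient` is an assumption about the module's public API:

```python
import torch

# Old path (pre-#349): pytorch_optimizer.optimizer.gc
from pytorch_optimizer.optimizer.gradient_centralization import centralize_gradient

grad = torch.randn(4, 8)
centralize_gradient(grad)  # assumed to centralize the gradient by subtracting its mean
```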
Bug
Docs
- Update the visualizations. (#340)
Contributions
Thanks to @AidinHamedi