pytorch-optimizer v3.0.0
Change Log
The major version is updated! (v2.12.0 -> v3.0.0) (#164)
Many optimizers, learning rate schedulers, and objective functions are included in pytorch-optimizer.
Currently, pytorch-optimizer supports 67 optimizers (+ bitsandbytes), 11 lr schedulers, and 13 loss functions, and has reached about 40K to 50K downloads per month (peak: 75K downloads per month)!
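For readers new to the package, here is a minimal sketch of the typical entry point, using the long-standing top-level optimizer exports and the `load_optimizer` helper (hyperparameters are illustrative only):

```python
from torch import nn

from pytorch_optimizer import AdamP, load_optimizer

model = nn.Linear(10, 2)

# option 1: import an optimizer class directly from the package
optimizer = AdamP(model.parameters(), lr=1e-3, weight_decay=1e-2)

# option 2: resolve an optimizer class by name and instantiate it yourself
optimizer_cls = load_optimizer(optimizer='adamp')
optimizer = optimizer_cls(model.parameters(), lr=1e-3)
```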
The reason for updating the major version from v2 to v3 is that I think it's a good time to ship the recent implementations (the last update was about 7 months ago) and to pivot to new concepts like training utilities while maintaining the original features (e.g. optimizers).
Also, richer test cases, benchmarks, and examples are on the list!
Finally, thanks for using pytorch-optimizer, and feel free to make any requests :)
Feature
- Implement `REX` lr scheduler. (#217, #222)
- Implement `Aida` optimizer. (#220, #221)
- Implement `WSAM` optimizer. (#213, #216)
- Implement `GaLore` optimizer. (#224, #228)
- Implement `Adalite` optimizer. (#225, #229)
- Implement `bSAM` optimizer. (#212, #233)
- Implement `Schedule-Free` optimizer. (#230, #233) (see the usage sketch after this list)
- Implement `EMCMC`. (#231, #233)
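As referenced above, a minimal sketch of how the new Schedule-Free optimizer might be used. The export name `ScheduleFreeAdamW` and the `train()` / `eval()` mode switches follow the upstream schedule-free convention and are assumptions here, not confirmed package API:

```python
import torch
from torch import nn

from pytorch_optimizer import ScheduleFreeAdamW  # assumed export name

model = nn.Linear(10, 2)
criterion = nn.MSELoss()

# schedule-free optimizers are intended to be used without an lr scheduler
optimizer = ScheduleFreeAdamW(model.parameters(), lr=1e-3)

inputs, targets = torch.randn(32, 10), torch.randn(32, 2)  # toy batch

optimizer.train()  # assumed: schedule-free convention, switch before training steps
for _ in range(10):
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()

optimizer.eval()  # assumed: switch before validation / checkpointing
```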
Fix
- Fix `SRMM` to allow operation beyond `memory_length`. (#227)
Dependency
- Drop `Python 3.7` support officially. (#221) Please check the README.
- Update `bitsandbytes` to `0.43.0`. (#228) (see the loading sketch after this list)
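A hedged sketch of how bitsandbytes-backed optimizers are reached through pytorch-optimizer; the `bnb_` name prefix is an assumption, and `bitsandbytes` must be installed for this to work:

```python
from torch import nn

from pytorch_optimizer import load_optimizer

model = nn.Linear(10, 2)

# assumed naming convention: bitsandbytes optimizers exposed under a 'bnb_' prefix
optimizer_cls = load_optimizer(optimizer='bnb_adamw8bit')
optimizer = optimizer_cls(model.parameters(), lr=1e-3)
```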
Docs
- Add missing parameters in `Ranger21` optimizer document. (#214, #215)
- Fix `WSAM` optimizer paper link. (#219)
Contributions
Diff
- from the previous major version: 2.0.0...3.0.0
- from the previous version: 2.12.0...3.0.0