pytorch-optimizer v2.11.2
Change Log
Feature
- Implement DAdaptLion optimizer (#203)
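A minimal usage sketch for the new DAdaptLion optimizer, assuming it is exported at the package top level like the library's other optimizers; `lr=1.0` follows the usual D-Adaptation convention of letting the method adapt the step size, and the exact defaults may differ.

```python
# Hedged sketch: DAdaptLion usage, assuming top-level export and a standard
# torch.optim-style interface. Check the library docs for the real defaults.
import torch
from pytorch_optimizer import DAdaptLion

model = torch.nn.Linear(10, 2)
optimizer = DAdaptLion(model.parameters(), lr=1.0)  # lr scales the adapted step

x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
loss = torch.nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```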
Fix
- Fix Lookahead optimizer (#200, #201, #202)
  - When using PyTorch Lightning, which expects your optimiser to be a subclass of `Optimizer` (see the first sketch after this list).
- Fix default `rectify` to `False` in `AdaBelief` optimizer (#203); see the second sketch after this list.
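A short sketch of what the Lookahead fix addresses: PyTorch Lightning rejects optimizers that are not `torch.optim.Optimizer` subclasses, and with this release the wrapper is meant to pass that check. The `k` and `alpha` arguments are the standard Lookahead hyperparameters and are assumed here, not taken from this changelog.

```python
# Hedged sketch: wrapping a base optimizer with Lookahead and verifying the
# isinstance check that PyTorch Lightning relies on. Argument names follow the
# original Lookahead paper; confirm the exact signature in the library docs.
import torch
from pytorch_optimizer import Lookahead

model = torch.nn.Linear(10, 2)
base = torch.optim.AdamW(model.parameters(), lr=1e-3)
optimizer = Lookahead(base, k=5, alpha=0.5)

# Lightning's compatibility check boils down to this test, which the fixed
# wrapper is intended to satisfy.
assert isinstance(optimizer, torch.optim.Optimizer)
```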
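And a sketch of the `rectify` change: the flag toggles AdaBelief's RAdam-style rectified update, and since the default is now `False`, users who relied on rectification can pass it explicitly. Only the `rectify` name comes from the changelog; the other arguments are illustrative.

```python
# Hedged sketch: opting back into rectification explicitly after the default
# changed to False in this release. Other arguments are illustrative only.
import torch
from pytorch_optimizer import AdaBelief

model = torch.nn.Linear(10, 2)
optimizer = AdaBelief(model.parameters(), lr=1e-3, rectify=True)
```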
Test
- Add `DynamicLossScaler` test case
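For context on the new test, a conceptual illustration (not the library's actual `DynamicLossScaler` API) of dynamic loss scaling: the loss is scaled up before the backward pass, the scale is halved when gradients overflow, and it is grown again after a stretch of stable steps.

```python
# Conceptual toy only; this is NOT the library's DynamicLossScaler implementation.
import torch

class SimpleDynamicLossScaler:
    """Halve the scale on gradient overflow, grow it after a run of stable steps."""

    def __init__(self, init_scale=2.0 ** 16, growth_interval=2000):
        self.scale = init_scale
        self.growth_interval = growth_interval
        self.good_steps = 0

    def step(self, loss, optimizer, model):
        (loss * self.scale).backward()
        grads = [p.grad for p in model.parameters() if p.grad is not None]
        if any(not torch.isfinite(g).all() for g in grads):
            self.scale /= 2.0      # overflow: shrink the scale and skip this update
            self.good_steps = 0
        else:
            for g in grads:
                g /= self.scale    # unscale gradients before stepping
            optimizer.step()
            self.good_steps += 1
            if self.good_steps % self.growth_interval == 0:
                self.scale *= 2.0  # long stable stretch: try a larger scale
        optimizer.zero_grad()

model = torch.nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = SimpleDynamicLossScaler()
scaler.step(model(torch.randn(2, 4)).sum(), opt, model)
```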
Docs
- Highlight the code blocks
- Fix pepy badges
Contributions
Thanks to @georg-wolflein