
pytorch-optimizer v3.0.0

@kozistr released this 21 May 09:02 · eda736f

Change Log

The major version is updated! (v2.12.0 -> v3.0.0) (#164)

Many optimizers, learning-rate schedulers, and loss functions ship with pytorch-optimizer.
Currently, it supports 67 optimizers (+ bitsandbytes), 11 LR schedulers, and 13 loss functions, and has reached about 40 ~ 50K downloads / month (peak: 75K downloads / month)!
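For orientation, here is a minimal sketch of pulling one of the bundled optimizers by name. The `load_optimizer` helper and the `adamp` key follow the library's documented usage, but treat the exact call shapes as assumptions rather than a spec:

```python
import torch
from pytorch_optimizer import load_optimizer

model = torch.nn.Linear(4, 1)  # toy model

# load_optimizer() resolves an optimizer class from its string name.
optimizer = load_optimizer(optimizer='adamp')(model.parameters(), lr=1e-3)

loss = model(torch.randn(8, 4)).pow(2).mean()
loss.backward()
optimizer.step()
```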

The major version is bumped from v2 to v3 because it's a good time to ship the recent implementations (the last release was about 7 months ago) and to pivot toward new concepts like training utilities, while maintaining the original features (e.g. the optimizers).
Richer test cases, benchmarks, and examples are also on the roadmap!

Finally, thanks for using pytorch-optimizer, and feel free to make any requests :)

Feature

Fix

  • Fix SRMM to allow operation beyond memory_length. (#227) See the sketch below.
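A hedged sketch of the fixed behavior: run SRMM for more steps than its memory_length window. The parameter name comes from the changelog entry above; the constructor defaults are assumptions:

```python
import torch
from pytorch_optimizer import SRMM

model = torch.nn.Linear(4, 1)

# memory_length bounds SRMM's averaging window; prior to #227,
# stepping past it could break.
optimizer = SRMM(model.parameters(), lr=1e-2, memory_length=100)

for _ in range(150):  # deliberately more steps than memory_length
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).pow(2).mean()
    loss.backward()
    optimizer.step()
```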

Dependency

  • Drop Python 3.7 support officially. (#221)
  • Update bitsandbytes to 0.43.0. (#228) See the loading sketch below.
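With the updated dependency, the bitsandbytes-backed optimizers should be loadable through the same helper. The `bnb_` prefix below follows the library's documented convention, but verify it against the README; bitsandbytes itself must be installed:

```python
from pytorch_optimizer import load_optimizer

# bitsandbytes optimizers are exposed under a 'bnb_' prefix
# (assumed naming; requires the bitsandbytes package).
optimizer_class = load_optimizer(optimizer='bnb_adamw8bit')
```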

Docs

  • Add missing parameters in Ranger21 optimizer document. (#214, #215)
  • Fix WSAM optimizer paper link. (#219)

Contributions

Thanks to @sdbds and @i404788.
