Merge pull request #285 from kozistr/release/v3.2.0
[Release] v3.2.0
kozistr authored Oct 28, 2024 · 2 parents 1a4896f + 225014f · commit a59f2e1
Showing 2 changed files with 3 additions and 2 deletions.
docs/changelogs/v3.2.0.md (1 addition, 1 deletion)
```diff
@@ -10,7 +10,7 @@
 * `torchao_adamw8bit`, `torchao_adamw4bit`, `torchao_adamwfp8`.
 * Support a module-name-level (e.g. `LayerNorm`) weight decay exclusion for `get_optimizer_parameters`. (#282, #283)
 * Implement `CPUOffloadOptimizer`, which offloads optimizer to CPU for single-GPU training. (#284)
-* Support a regex-based filter for searching names of optimizers, lr schedulers, and loss functions. (#285)
+* Support a regex-based filter for searching names of optimizers, lr schedulers, and loss functions.

 ### Bug
```
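The regex-based name search mentioned in the changelog can be pictured with a short, self-contained sketch. This is an illustration only, not the library's actual API: the `filter_names` helper and the sample name list are hypothetical, and pytorch_optimizer's real entry points (such as `get_supported_optimizers`) may expose the filter differently.

```python
import re
from typing import List


def filter_names(names: List[str], pattern: str) -> List[str]:
    """Return the names matching the given regular expression.

    Hypothetical helper sketching the idea of a regex-based filter
    over a registry of names; not pytorch_optimizer's public API.
    """
    regex = re.compile(pattern)
    return [name for name in names if regex.search(name)]


# A made-up sample of optimizer names for demonstration.
optimizer_names = ["adamw", "adamp", "lamb", "lion", "adan", "sgdw"]
print(filter_names(optimizer_names, r"^adam"))  # prints ['adamw', 'adamp']
```

The same helper would work unchanged for lr scheduler and loss function names, since it only operates on strings.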
pyproject.toml (2 additions, 1 deletion)
```diff
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "pytorch_optimizer"
-version = "3.1.2"
+version = "3.2.0"
 description = "optimizer & lr scheduler & objective function collections in PyTorch"
 license = "Apache-2.0"
 authors = ["kozistr <[email protected]>"]
@@ -35,6 +35,7 @@ classifiers = [
     "Programming Language :: Python :: 3.10",
     "Programming Language :: Python :: 3.11",
     "Programming Language :: Python :: 3.12",
+    "Programming Language :: Python :: 3.13",
     "Operating System :: OS Independent",
     "Topic :: Scientific/Engineering",
     "Topic :: Scientific/Engineering :: Artificial Intelligence",
```
