
Bump torch from 2.3.1 to 2.4.0 #255

Closed · wants to merge 1 commit
Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Jul 25, 2024

Bumps torch from 2.3.1 to 2.4.0.

Release notes

Sourced from torch's releases.

PyTorch 2.4: Python 3.12, AOTInductor freezing, libuv backend for TCPStore

PyTorch 2.4 Release Notes

Highlights

We are excited to announce the release of PyTorch® 2.4! PyTorch 2.4 adds support for the latest version of Python (3.12) for torch.compile. AOTInductor freezing gives developers running AOTInductor more performance-based optimizations by allowing the serialization of MKLDNN weights. In addition, a new default TCPStore server backend built on libuv has been introduced, which should significantly reduce initialization times for users running large-scale jobs. Finally, a new Python Custom Operator API makes it easier than before to integrate custom kernels into PyTorch, especially for torch.compile.

This release is composed of 3661 commits and 475 contributors since PyTorch 2.3. We want to sincerely thank our dedicated community for your contributions. As always, we encourage you to try these out and report any issues as we improve 2.4.

Tracked Regressions

Subproc exception with torch.compile and onnxruntime-training

There is a reported issue with torch.compile when the onnxruntime-training library is installed. The issue will be fixed in v2.4.1. As a local workaround, set the environment variable TORCHINDUCTOR_WORKER_START=fork before executing the script.
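As a sketch of that local workaround, the variable can also be set from inside Python rather than on the command line, provided this happens before torch is imported (the commented-out torch import is illustrative):

```python
import os

# Workaround from the 2.4 release notes: make torch.compile's
# TorchInductor subprocess workers start via fork instead of spawn,
# avoiding the reported crash when onnxruntime-training is installed.
# This must run before importing torch so the workers see it.
os.environ["TORCHINDUCTOR_WORKER_START"] = "fork"

# import torch  # safe to import and call torch.compile after this point
```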

cu118 wheels will not work with pre-cuda12 drivers

It was also reported that the new version of Triton uses CUDA features that are incompatible with pre-CUDA 12 drivers. In this case, the workaround is to set TRITON_PTXAS_PATH manually as follows (adapt the path to the local installation):

TRITON_PTXAS_PATH=/usr/local/lib/python3.10/site-packages/torch/bin/ptxas  python script.py

Backwards Incompatible Changes

Python frontend

Default ThreadPool size to number of physical cores (#125963)

Changed the default number of threads used for intra-op parallelism from the number of logical cores to the number of physical cores.
... (truncated)
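For workloads that depended on the old logical-core default for intra-op parallelism, the thread count can be pinned explicitly. A minimal sketch (the torch calls are commented out so the snippet stands alone without PyTorch installed):

```python
import os

# PyTorch 2.4 defaults intra-op threads to the number of *physical*
# cores. To restore the pre-2.4 behavior, pin the count to the number
# of logical cores, e.g. via OMP_NUM_THREADS before torch is imported.
logical_cores = os.cpu_count()
os.environ["OMP_NUM_THREADS"] = str(logical_cores)

# import torch
# torch.set_num_threads(logical_cores)  # equivalent explicit override
```

Setting the environment variable before import matters because the thread pool is sized when the backend initializes.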

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [torch](https://github.com/pytorch/pytorch) from 2.3.1 to 2.4.0.
- [Release notes](https://github.com/pytorch/pytorch/releases)
- [Changelog](https://github.com/pytorch/pytorch/blob/main/RELEASE.md)
- [Commits](pytorch/pytorch@v2.3.1...v2.4.0)

---
updated-dependencies:
- dependency-name: torch
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Jul 25, 2024

🦙 MegaLinter status: ✅ SUCCESS

| Descriptor | Linter | Files | Fixed | Errors | Elapsed time |
| --- | --- | --- | --- | --- | --- |
| ✅ ACTION | actionlint | 4 | | 0 | 0.24s |
| ✅ COPYPASTE | jscpd | yes | | no | 12.58s |
| ✅ JSON | prettier | 2 | 0 | 0 | 0.44s |
| ✅ JSON | v8r | 2 | | 0 | 3.26s |
| ✅ MARKDOWN | markdownlint | 1 | 0 | 0 | 0.38s |
| ✅ MARKDOWN | markdown-link-check | 1 | | 0 | 1.04s |
| ✅ MARKDOWN | markdown-table-formatter | 1 | 0 | 0 | 0.39s |
| ✅ PYTHON | bandit | 231 | | 0 | 6.81s |
| ✅ PYTHON | black | 231 | 0 | 0 | 4.86s |
| ✅ PYTHON | flake8 | 231 | | 0 | 3.08s |
| ✅ PYTHON | isort | 231 | 0 | 0 | 0.77s |
| ✅ PYTHON | mypy | 231 | | 0 | 6.44s |
| ✅ PYTHON | pylint | 231 | | 0 | 59.69s |
| ✅ PYTHON | ruff | 231 | 0 | 0 | 0.07s |
| ✅ REPOSITORY | checkov | yes | | no | 16.62s |
| ✅ REPOSITORY | gitleaks | yes | | no | 2.43s |
| ✅ REPOSITORY | git_diff | yes | | no | 0.02s |
| ✅ REPOSITORY | grype | yes | | no | 19.69s |
| ✅ REPOSITORY | secretlint | yes | | no | 4.97s |
| ✅ REPOSITORY | trivy-sbom | yes | | no | 1.4s |
| ✅ REPOSITORY | trufflehog | yes | | no | 21.2s |
| ✅ YAML | prettier | 8 | 0 | 0 | 0.7s |
| ✅ YAML | v8r | 8 | | 0 | 8.21s |
| ✅ YAML | yamllint | 8 | | 0 | 0.34s |

See detailed report in MegaLinter reports

MegaLinter is graciously provided by OX Security


Test Results

3 files ±0 · 3 suites ±0 · ⏱️ 55m 17s +40s
415 tests ±0: 415 ✅ ±0 · 0 💤 ±0 · 0 ❌ ±0
1 245 runs ±0: 1 047 ✅ ±0 · 198 💤 ±0 · 0 ❌ ±0

Results for commit ad6be6e. ± Comparison against base commit 47d9d37.

@ISAS-Admin ISAS-Admin closed this Jul 25, 2024
@ISAS-Admin ISAS-Admin deleted the dependabot/pip/torch-2.4.0 branch July 25, 2024 22:15
Contributor Author

dependabot bot commented on behalf of github Jul 25, 2024

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version. You can also ignore all major, minor, or patch releases for a dependency by adding an ignore condition with the desired update_types to your config file.

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.
