
docker.io/runpod/worker-vllm:stable-cuda12.1.0 #18003

Closed
linkdao opened this issue Jul 3, 2024 · 3 comments

Comments


linkdao commented Jul 3, 2024

IMAGE SYNC

Contributor

github-actions bot commented Jul 3, 2024

Hi @linkdao,
Thanks for your feedback!
We will follow up as soon as possible.

Instructions for interacting with me using comments are available here.
If you have questions or suggestions related to my behavior, please file an issue against the gh-ci-bot repository.

Contributor

github-actions bot commented Jul 3, 2024

The image docker.io/runpod/worker-vllm:stable-cuda12.1.0 is not on the whitelist, so it cannot be synced or accessed.
You can add it to the whitelist.

Contributor

github-actions bot commented Jul 3, 2024

Image sync failed; please see the details.
If you have any questions, reply /auto-cc to summon help.

github-actions bot closed this as not planned on Jul 3, 2024.