System Info

The transformers version in the container image is 4.44.2.

Running Xinference with Docker?

Yes.

Version info

xinference v1.0.0, xinference v0.16.3

The command used to start Xinference

docker run --name xinference -d -p 9997:9997 --restart always -e XINFERENCE_HOME=/data -e XINFERENCE_MODEL_SRC=modelscope -v xinference_data:/data --gpus all xprobe/xinference:latest xinference-local -H 0.0.0.0

Reproduction

Startup fails with:

Server error. 500 - [address=0.0.0.0:44775, pid=952] cannot import name 'Qwen2VLForConditionalGeneration' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)

Expected behavior

xinference v1.0.0 and earlier require transformers 4.45 or later to start the qwen2-VL-Instruct model. Please upgrade the transformers version in the container image to 4.45 or above; 4.46 has been verified to work.
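The failure above can be detected before launch. A minimal sketch (no third-party dependencies; the helper names `version_tuple` and `supports_qwen2_vl` are illustrative, not part of xinference) of checking whether an installed transformers version is new enough, given that `Qwen2VLForConditionalGeneration` requires transformers >= 4.45:

```python
def version_tuple(v: str) -> tuple:
    """Parse a dotted version string like '4.44.2' into a comparable tuple.

    Non-numeric parts (e.g. 'dev0') are dropped for simplicity.
    """
    return tuple(int(part) for part in v.split(".") if part.isdigit())


def supports_qwen2_vl(transformers_version: str) -> bool:
    """True if this transformers version should provide
    Qwen2VLForConditionalGeneration (added in 4.45)."""
    return version_tuple(transformers_version) >= (4, 45)


# The image currently ships 4.44.2, which is too old;
# 4.46 is new enough.
print(supports_qwen2_vl("4.44.2"))  # → False
print(supports_qwen2_vl("4.46.0"))  # → True
```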
PRs are welcome to fix this:
inference/xinference/deploy/docker/requirements.txt
Line 28 in f2b22bb
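A possible shape for that change (a sketch only — the current contents of line 28 are not shown in this issue, so the exact existing pin is an assumption) would be to raise the transformers requirement in that file:

```
transformers>=4.45.0
```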