
Before downloading a model, warn the user about model parameter restrictions, so the model is not unusable after download. #1784

Closed
nikelius opened this issue Jul 4, 2024 · 2 comments
nikelius commented Jul 4, 2024

Is your feature request related to a problem? Please describe

Environment: Linux, CPU mode (no GPU, no CUDA)
Problem: after a model finishes downloading, it still cannot be used (it does not appear in the UI and cannot be deleted), with errors like the following:
ValueError: [address=0.0.0.0:41647, pid=166121] AWQ is only available on GPU
ValueError: [address=0.0.0.0:42784, pid=165747] Only 8-bit quantization is supported if it is not linux system or cuda device

Describe the solution you'd like

Suggestions:

  • Before a model is downloaded, strengthen the hints about model parameter restrictions, so users do not choose invalid parameters and end up with a download they cannot use.
  • Show tasks that are currently downloading, and let the user cancel or delete them at any time.
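The first suggestion could be a simple pre-download compatibility check. The sketch below is hypothetical (it is not the actual Xinference API; `check_model_compat` and its parameters are made up for illustration), but it encodes the two restrictions from the error messages above: AWQ requires a CUDA GPU, and non-Linux or non-CUDA hosts only support 8-bit quantization.

```python
# Hypothetical sketch, not the real Xinference API: validate the chosen
# quantization against the local hardware *before* starting a download.

def check_model_compat(quantization: str, has_cuda: bool, is_linux: bool) -> list[str]:
    """Return a list of human-readable warnings; an empty list means compatible."""
    warnings = []
    # Mirrors the first error above: AWQ kernels require a CUDA GPU.
    if quantization.lower() == "awq" and not has_cuda:
        warnings.append("AWQ is only available on GPU; this host has no CUDA device.")
    # Mirrors the second error: without Linux + CUDA, only 8-bit is supported.
    if quantization.endswith("-bit") and quantization != "8-bit" and not (is_linux and has_cuda):
        warnings.append("Only 8-bit quantization is supported without Linux and a CUDA device.")
    return warnings

# Example: a CPU-only Linux host (like the one in this report) asking for AWQ weights.
for w in check_model_compat("awq", has_cuda=False, is_linux=True):
    print("WARNING:", w)
```

If the returned list is non-empty, the UI could show the warnings and ask for confirmation before the download starts, instead of failing after it finishes.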


@XprobeBot XprobeBot added the gpu label Jul 4, 2024
@XprobeBot XprobeBot added this to the v0.12.4 milestone Jul 4, 2024
nikelius commented Jul 4, 2024

A related problem: after downloading internvl-chat, it shows:
RuntimeError: [address=0.0.0.0:36213, pid=173672] Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx

@XprobeBot XprobeBot modified the milestones: v0.13.0, v0.13.1 Jul 5, 2024
@ChengjieLi28
Not planned.

@ChengjieLi28 closed this as not planned Jul 10, 2024