
Consider providing vGPU guest drivers removing the need to build and host one's own #41

Open
frittentheke opened this issue Jun 28, 2024 · 0 comments

@frittentheke

When using vGPUs, it is currently necessary to manually download the vGPU guest drivers and build a custom driver image.
While the steps are documented at https://docs.nvidia.com/datacenter/cloud-native/gpu-operator/latest/install-gpu-operator-vgpu.html#using-nvidia-vgpu, this creates considerable friction when getting started with vGPUs on Kubernetes, and it is a recurring burden for every driver update.
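For context, the manual workflow from the linked documentation looks roughly like the sketch below. Repository location, build arguments, OS directory, driver file name, and registry are placeholders or may differ between releases; this is an illustration of the friction, not an exact recipe.

```shell
# Sketch of the manual vGPU driver image build (names/versions are placeholders).

# 1. Clone NVIDIA's driver container sources and pick the target OS directory.
git clone https://github.com/NVIDIA/gpu-driver-container.git
cd gpu-driver-container/ubuntu22.04

# 2. Copy the vGPU guest driver .run file, downloaded manually from the
#    NVIDIA Licensing Portal, into the build context.
cp ~/Downloads/NVIDIA-Linux-x86_64-<version>-grid.run drivers/

# 3. Build and push the image to a *private* registry (publishing it
#    publicly would violate the vGPU EULA, as noted below).
docker build \
  --build-arg DRIVER_TYPE=vgpu \
  --build-arg DRIVER_VERSION=<version> \
  -t my-registry.example.com/driver:<version>-ubuntu22.04 .
docker push my-registry.example.com/driver:<version>-ubuntu22.04
```

Every one of these steps has to be repeated for each new driver release, which is exactly the burden described above.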

I understand there are licensing restrictions (EULA) that do not allow hosting vGPU guest drivers publicly:

Uploading the NVIDIA vGPU driver to a publicly available repository or otherwise publicly sharing the driver is a violation of the NVIDIA vGPU EULA.

But the driver by itself does not do much without an individual client configuration token. So I am wondering whether there really is no path to also providing current vGPU drivers in the gpu-driver-container images?
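To illustrate that point: per the GPU Operator documentation, the per-customer licensing material is supplied separately from the driver image, via a ConfigMap containing the client configuration token. The namespace and file names below follow the documented convention but may vary with the deployment.

```shell
# Sketch: the licensing token is cluster-side configuration, not part of
# the driver image. client_configuration_token.tok is the per-customer
# token generated on the NVIDIA Licensing Portal; gridd.conf holds the
# licensing settings.
kubectl create configmap licensing-config \
  -n gpu-operator \
  --from-file=gridd.conf \
  --from-file=client_configuration_token.tok
```

Since an image without this token cannot check out a license anyway, shipping the driver binaries publicly would not hand anyone a usable, licensed vGPU setup.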
