stable-diffusion-nvidia-docker

Tesla GPUs need a vGPU license to pass through to Docker

Open • angelmankel opened this issue on Oct 24 '22 • 7 comments

Hi, I recently got two Tesla P40 GPUs which I was hoping to use with this. From my understanding, the Tesla P40s need a vGPU license in order to pass through via WSL. I am using my Tesla cards locally for other applications as well, and I basically use this machine as a graphics/machine-learning server running Windows 11, so I don't really want to install Linux on the PC itself.

Do you see any easy way to run this without Docker? Hopefully I'm wrong about the licensing. I tried to export the container and run the scripts locally, but I honestly don't know what I'm doing with that and didn't make much progress.
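For reference, this is roughly what I attempted when trying to run it outside Docker (the entry-point name below is a guess on my part, I'm not sure what the actual script is called):

```shell
# Rough sketch of running the scripts outside Docker on Windows (PowerShell).
# Assumes the repo has a requirements.txt; the entry-point name is a placeholder.
git clone https://github.com/NickLucche/stable-diffusion-nvidia-docker.git
cd stable-diffusion-nvidia-docker
python -m venv .venv
.\.venv\Scripts\Activate.ps1
pip install -r requirements.txt
python main.py   # placeholder name, check the repo for the actual entry script
```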

angelmankel commented Oct 24 '22

Mmm, I don't know of any particular license requirement between the P40s and WSL, but I am not a Windows user; maybe someone else can clarify this. I think Windows 11 should have more integrated support for WSL. Can you use commands like nvidia-smi by opening up a console or PowerShell?
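Something like this, just as a generic sanity check (nothing specific to this repo):

```shell
# From PowerShell: does the Windows driver see the cards at all?
nvidia-smi -L

# Still from PowerShell, run the same check inside the default WSL distro:
wsl nvidia-smi
```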

NickLucche commented Oct 25 '22

According to the documentation, "Tesla GPUs aren't supported yet", so maybe it's not exactly a licensing issue after all. But from what I read, the Tesla P40s can't be put into WDDM mode without a license (I tried), and WDDM mode is required for WSL, unfortunately. https://docs.nvidia.com/cuda/wsl-user-guide/index.html [screenshot]
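For anyone else checking their own cards: the current driver model can be read (and, on boards that allow it, switched) with nvidia-smi, which is a quick way to see whether WDDM is even an option. This is only a diagnostic sketch, not a workaround for the licensing restriction:

```shell
# Show which driver model each GPU is using (WSL needs WDDM; TCC is not enough):
nvidia-smi --query-gpu=index,name,driver_model.current,driver_model.pending --format=csv

# Try to switch GPU 0 to WDDM (0 = WDDM, 1 = TCC); needs an admin prompt and a reboot.
# On the P40 this is the step that the licensing restriction blocks.
nvidia-smi -i 0 -dm 0
```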

When I run nvidia-smi in PowerShell, this is what I get: [screenshot]

If I run nvidia-smi in WSL, this is what I get: [screenshot]

angelmankel commented Oct 25 '22

Any chance you can run docker run.. from PowerShell? I am not aware of the changes introduced in Win11, sorry. I'll mark this as "help wanted" here.
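Roughly something like this, all from a PowerShell prompt (the CUDA base image is just a generic example for a sanity check; the repo image name and port are from memory, so double-check them against the README):

```shell
# Generic check that Docker Desktop can reach the GPU from PowerShell:
docker run --rm --gpus all nvidia/cuda:11.8.0-base-ubuntu22.04 nvidia-smi

# If that prints the GPU table, the repo image should start the same way,
# e.g. (image name and port from memory, verify against the README):
docker run --gpus all -p 7860:7860 nicklucche/stable-diffusion
```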

NickLucche commented Oct 28 '22

[Screenshot 2022-11-16 at 22:55:53] Tesla T4 works fine.

huotarih commented Nov 16 '22


[screenshot] Similar problem to the one above, but a different error, with a Tesla K80 and an RTX 3070.
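One way to narrow down which card is causing the error is to expose only one of them to the container at a time (the K80 is an older Kepler card that recent drivers have dropped, so it may be the culprit). A sketch, with the device index as a placeholder:

```shell
# List the GPUs with their indices/UUIDs:
nvidia-smi -L

# Expose only one device to the container (index 0 is a placeholder, use yours).
# Quoting shown for bash/WSL; PowerShell needs slightly different escaping.
docker run --rm --gpus '"device=0"' nvidia/cuda:11.8.0-base-ubuntu22.04 nvidia-smi
```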

Anonym0us33 commented Jun 20 '23

Wasn't this restriction removed relatively recently in the newer drivers? I remember being able to run 2 VMs on my RTX 2080.

vleeuwenmenno commented Jul 05 '23

I have the very same setup and would like to know as much as possible. Is there any way to get in contact?

helyxzion50943 commented Oct 13 '23