jiaqizhang123-stack
Is it necessary to train on a larger GPU? This code cannot be trained on a smaller GPU
OK, thank you very much
The PyInstaller I installed is already the latest version
Download the .whl file and install it using pip
`nvidia-smi` does not show any errors
> Hmm, if you freeze the following example
>
> ```python
> import torch
> print("torch.cuda.is_available:", torch.cuda.is_available())
> print("torch.cuda.device_count:", torch.cuda.device_count())
> ```
>
> does it work? Or does `cuda.is_available`...
When the two computers have the same driver version, `torch.cuda.is_available()` returns True. Why does the driver version affect the use of CUDA? Is this related to the incomplete package...
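
A minimal check that might help narrow this down (just a sketch; run it on both machines and compare the output): it prints the CUDA and cuDNN versions the installed torch wheel was built against, plus the basic device info the current driver exposes.

```python
import torch

# CUDA / cuDNN versions this torch build was compiled against
print("torch:", torch.__version__)
print("built with CUDA:", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())

# What the current machine/driver actually exposes
print("cuda available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```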
When I test on an RTX 4090, I have this problem again; checking the files inside the onedir build, there is no `libcudnn_ops_infer.so.8` library in it. On the GTX 1650, it...
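
If the frozen onedir folder is missing `libcudnn_ops_infer.so.8`, a sketch like the following can locate the cuDNN libraries that ship inside the installed torch package (assuming they live under `torch/lib`, which is how the CUDA wheels are usually laid out), so they can be added to the bundle, e.g. via PyInstaller's `--add-binary` option.

```python
import os
import torch

# The CUDA builds of torch normally ship their cuDNN libraries under torch/lib;
# verify this path on your machine before relying on it.
torch_lib_dir = os.path.join(os.path.dirname(torch.__file__), "lib")

for name in sorted(os.listdir(torch_lib_dir)):
    if name.startswith("libcudnn"):
        # Each printed path can be passed to PyInstaller, e.g.
        #   --add-binary "<path>:."   (Linux SRC:DEST syntax)
        print(os.path.join(torch_lib_dir, name))
```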

