stylegan2-pytorch
Working code for CPU inference?
Hi, I want to run inference on the CPU on Colab, but I think I will have issues with the .cu CUDA extensions since there is no nvcc available on the VM. Is there a way to modify the code so it runs on the CPU only, without nvcc?
The native PyTorch implementations of the operators will be used when the model is run on the CPU.
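For reference, here is a minimal CPU-only inference sketch that relies on those native fallbacks. It assumes the `Generator` class from this repo's `model.py` and a checkpoint with a `g_ema` key as in `generate.py`; the checkpoint path and the `size`/`latent_dim`/`n_mlp` values are placeholders you must match to your own checkpoint.

```python
# Minimal CPU inference sketch. The Generator signature and the "g_ema"
# checkpoint key follow this repo's model.py / generate.py; the checkpoint
# path and the size/latent_dim/n_mlp values below are placeholders.
import torch
from model import Generator

device = "cpu"
size, latent_dim, n_mlp = 256, 512, 8  # must match the trained model

g_ema = Generator(size, latent_dim, n_mlp).to(device)
ckpt = torch.load("checkpoint.pt", map_location=device)  # keep weights on CPU
g_ema.load_state_dict(ckpt["g_ema"])
g_ema.eval()

with torch.no_grad():
    z = torch.randn(4, latent_dim, device=device)
    images, _ = g_ema([z])  # generator returns (images, latents)
```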
Did you find a solution? I'm having the same problem here.
Have you tried changing the imports to the .py versions of the kernels? By default they are loaded from the CUDA extensions, which require nvcc and CUDA support.
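If your copy of the repo still builds the extensions unconditionally at import time, one way to follow that suggestion is to collect plain-PyTorch versions of the two ops in a separate module and point the import in `model.py` at it. The sketch below is only an approximation of what `op/fused_act.py` and `op/upfirdn2d.py` compute; the module name `op_cpu` and the simplified `upfirdn2d` are my own, so verify its output against the CUDA ops before trusting it.

```python
# op_cpu.py -- hypothetical plain-PyTorch stand-ins for the compiled CUDA ops.
# An approximation of the ops in op/fused_act.py and op/upfirdn2d.py; check
# its results against the originals before relying on it.
import torch
from torch import nn
from torch.nn import functional as F


def fused_leaky_relu(input, bias, negative_slope=0.2, scale=2 ** 0.5):
    # Bias add + leaky ReLU + rescale, which the GPU op does in one fused kernel.
    bias = bias.view(1, -1, *([1] * (input.ndim - 2)))
    return F.leaky_relu(input + bias, negative_slope) * scale


class FusedLeakyReLU(nn.Module):
    def __init__(self, channel, negative_slope=0.2, scale=2 ** 0.5):
        super().__init__()
        self.bias = nn.Parameter(torch.zeros(channel))
        self.negative_slope = negative_slope
        self.scale = scale

    def forward(self, input):
        return fused_leaky_relu(input, self.bias, self.negative_slope, self.scale)


def upfirdn2d(input, kernel, up=1, down=1, pad=(0, 0)):
    # Upsample by zero insertion, pad, FIR-filter, then downsample.
    batch, channel, in_h, in_w = input.shape
    out = input.reshape(-1, 1, in_h, 1, in_w, 1)

    # Insert (up - 1) zeros after every sample in both spatial dims.
    out = F.pad(out, [0, up - 1, 0, 0, 0, up - 1])
    out = out.reshape(-1, 1, in_h * up, in_w * up)

    # Pad the borders, then correlate with the flipped FIR kernel.
    out = F.pad(out, [pad[0], pad[1], pad[0], pad[1]])
    weight = torch.flip(kernel, [0, 1]).reshape(1, 1, *kernel.shape).to(out)
    out = F.conv2d(out, weight)

    # Keep every `down`-th sample.
    out = out[:, :, ::down, ::down]
    return out.reshape(batch, channel, out.shape[2], out.shape[3])
```

With a file like that in place, the import at the top of `model.py` (something like `from op import FusedLeakyReLU, fused_leaky_relu, upfirdn2d`) can be pointed at `op_cpu` instead, so nothing tries to compile the .cu extensions.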
Yes, it's working now! Thanks