
Working code for CPU inference?

Open JohnnyRacer opened this issue 2 years ago • 4 comments

Hi, I want to run inference on the CPU on Colab, but I think I will have issues with the .cu CUDA extensions since there is no nvcc available on the VM. Is there a way to modify the code so it runs on the CPU only, without nvcc?

JohnnyRacer avatar Mar 30 '22 09:03 JohnnyRacer

The native PyTorch implementations of the operators will be used when the model is run on the CPU.
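For reference, the pure-PyTorch path for the fused bias + leaky ReLU op can be sketched like this (a minimal sketch of the native fallback, assuming the repo's `fused_leaky_relu(input, bias, negative_slope, scale)` signature; not the compiled kernel itself):

```python
import torch
import torch.nn.functional as F

def fused_leaky_relu(input, bias, negative_slope=0.2, scale=2 ** 0.5):
    # Pure-PyTorch equivalent of the fused CUDA kernel: add the bias
    # (broadcast over the channel dimension), apply leaky ReLU, then
    # rescale the activations. Runs on CPU tensors without nvcc.
    rest_dim = [1] * (input.ndim - bias.ndim - 1)
    return (
        F.leaky_relu(
            input + bias.view(1, bias.shape[0], *rest_dim),
            negative_slope=negative_slope,
        )
        * scale
    )

x = torch.randn(2, 8, 4, 4)  # CPU tensor
b = torch.zeros(8)
y = fused_leaky_relu(x, b)
print(y.shape)  # torch.Size([2, 8, 4, 4])
```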

rosinality avatar Mar 31 '22 15:03 rosinality

Did you find a solution? I'm having the same problem here.

aniritafc avatar Sep 28 '23 15:09 aniritafc

> Did you find a solution? I'm having the same problem here.

Have you tried changing the imports to the .py versions of the kernels? By default they load the CUDA extensions, which require nvcc and CUDA support.
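To make the import swap concrete, here is a sketch of a CPU-only stand-in for the `upfirdn2d` extension, covering only the up=1, down=1 (blur) case (the function name and `pad` parameter are assumptions for illustration, not the repo's full op):

```python
import torch
import torch.nn.functional as F

def upfirdn2d_native_blur(input, kernel, pad):
    # Simplified pure-PyTorch stand-in for the upfirdn2d CUDA op for the
    # up=1, down=1 case: zero-pad the input, then convolve every channel
    # with the same 2D FIR kernel (depthwise, kernel flipped so conv2d's
    # cross-correlation becomes a true convolution).
    b, c, h, w = input.shape
    input = F.pad(input, [pad[0], pad[1], pad[0], pad[1]])
    weight = kernel.flip([0, 1])[None, None].repeat(c, 1, 1, 1)
    return F.conv2d(input, weight, groups=c)

x = torch.ones(1, 2, 4, 4)       # CPU tensor
k = torch.ones(3, 3) / 9         # 3x3 box blur kernel
y = upfirdn2d_native_blur(x, k, (1, 1))
print(y.shape)  # torch.Size([1, 2, 4, 4])
```

The same idea applies to the other ops: any call site that loads a .cu extension can instead import a .py implementation built from standard `torch.nn.functional` calls, which run on the CPU.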

JohnnyRacer avatar Sep 28 '23 18:09 JohnnyRacer

Yes, it's working now! Thanks

aniritafc avatar Sep 29 '23 13:09 aniritafc