Run i-Code-v3 on CPU to solve the GPU VRAM problem
I would appreciate support for running i-Code-v3 on CPU, either officially or by applying the following changes.
The following changes need to be applied:

In /core/common/utils.py, change np.int to np.int32 (np.int was removed in NumPy 1.24).

In the following files:
/core/models/model_module_infer.py
/core/models/ddim/ddim.py
/core/models/latent_diffusion/diffusion_unet.py

add this line:
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
and change ".cuda()" to ".to(device)".
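The suggested pattern can be sketched as follows. This is a minimal, self-contained example of the device-selection idiom, not the actual i-Code-v3 code; the model and tensor here are placeholders:

```python
import torch

# Pick the device once, then move modules/tensors with .to(device)
# instead of hard-coding .cuda(), so the same code runs on CPU-only machines.
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")

model = torch.nn.Linear(4, 2).to(device)   # was: model.cuda()
x = torch.randn(1, 4, device=device)       # was: torch.randn(1, 4).cuda()
y = model(x)
print(y.device.type)
```

With this change the GPU path is unchanged (`.to(device)` is equivalent to `.cuda()` when CUDA is available), and the CPU path no longer raises an error.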
By applying the above changes I was able to run i-Code-v3 on the CPU, but fp16 is not yet supported in CPU mode. I would appreciate support for running both fp32 and fp16 on CPU.
Thanks! We will work on adding fp16 support on CPU. Currently the model can run in fp16 on GPU but not in fp16 on CPU?
The code worked well in fp16 on GPU (RTX 3090).
On CPU the code does not work, because in the files I mentioned before ("model_module_infer.py", "ddim.py", and "diffusion_unet.py") variables are moved to the GPU with the ".cuda()" call. To fix this I suggested changing it to ".to(device)" and defining the device variable either by checking whether CUDA is available (falling back to CPU if not) or by letting the user choose cpu/cuda explicitly.
I have already made those changes locally, and the code ran on my CPU without problems in normal (fp32) mode. But when I switch to fp16 and run on CPU, the code fails.
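One likely cause, as a hedged guess: on CPU, many PyTorch kernels lack float16 implementations (depending on the PyTorch version), so calling `.half()` on the whole model can fail. A common CPU workaround is mixed precision via `torch.autocast` with bfloat16 instead of float16. A minimal sketch (again with a placeholder model, not the i-Code-v3 one):

```python
import torch

model = torch.nn.Linear(8, 8)
x = torch.randn(2, 8)

# Instead of model.half(), which may hit unimplemented fp16 CPU kernels,
# use autocast: supported ops (e.g. linear/matmul) run in bfloat16 on CPU,
# while the rest stay in fp32.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)

print(y.dtype)
```

Whether this gives a speedup for i-Code-v3 on CPU would need benchmarking; bfloat16 autocast mainly helps on CPUs with native bf16 support.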