aminesoulaymani
I got this bug when using the --single-gpu flag. I just added: opt.device = torch.cuda.current_device(). Then I got another bug about local_rank, so I just added: opt.distributed = False
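For anyone hitting the same thing, here is a minimal sketch of that workaround. The function name `patch_single_gpu_opts` is hypothetical (the comment above only sets the two attributes inline), and the guarded torch import is mine so it also runs on machines without CUDA:

```python
import argparse

# torch may be absent on some test machines; guard the import (my addition,
# the original workaround assumes torch is installed)
try:
    import torch
    HAVE_TORCH = True
except ImportError:
    HAVE_TORCH = False

def patch_single_gpu_opts(opt):
    """Fill in the attributes that the --single-gpu code path never sets."""
    # Use the current CUDA device when available, else fall back to CPU
    if HAVE_TORCH and torch.cuda.is_available():
        opt.device = torch.cuda.current_device()
    else:
        opt.device = "cpu"
    # Disable distributed mode so local_rank is never consulted
    opt.distributed = False
    return opt

opt = patch_single_gpu_opts(argparse.Namespace(single_gpu=True))
print(opt.distributed)  # → False
```

This just papers over missing initialization; the proper fix would be in the option parser itself.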
You forgot to update ppgan, I guess. Make sure you uninstalled the previous version (maybe located in the site-packages folder), or update it.
I got it to work on Windows, but the VRAM consumption is very weird, with a huge peak right after the generation process, and the results are not that good,...
This is so nice of you.
@Anandyyyt the videos shown are always cherry-picked; you have to carefully choose a good input picture, then hope for good results after a few hours of tweaking, true story....
Guys, use the AUTOMATIC1111 extension. I have a GeForce 3060 with 6GB VRAM and 32GB RAM: amazing results, no OOM at all. You don't even have to check the "low vram" setting in...