
After the upgrade, wanvideo long I2V InfiniteTalk always OOM

Open Jason-jiang-2024 opened this issue 2 months ago • 8 comments

After the upgrade, WanVideo long I2V InfiniteTalk always OOMs. The same settings were all working fine before.

(screenshot attached)

Jason-jiang-2024 avatar Nov 02 '25 05:11 Jason-jiang-2024

Noticed something funky going on with 1.3.8 too.

WAN 2.2, I2V, [fp8_e4m3fn_scaled_KJ]. Duration 8 seconds (129 frames), resolution 1024x896, 4 steps with speed LoRA, blocks to swap: 35. 24 GB VRAM, 64 GB RAM.

1.3.8: unusable. OOM on the first try; the second try never finishes, with 99% VRAM usage and 49% RAM usage. I gave up waiting for a single step to finish, and the GPU sat mostly idle.

1.3.6: OOM on the first try, 480 seconds for the second generation, 420 seconds for the third and subsequent generations. 90% VRAM usage, up to 94% RAM usage. GPU at 100%, dropping to 70% from time to time, probably during block swapping.

Is something going on with the block swap node?

Anyway, staying with 1.3.6 until this is hopefully fixed.

firsack avatar Nov 04 '25 21:11 firsack

Just git pulled and now I'm getting OOM in WanAnimate, which always worked fine... I rolled back and it's fine again.

protector131090 avatar Nov 05 '25 13:11 protector131090

I think I have the same problem. "wanvideo_WanAnimate_preprocess_example_02.json" worked fine some weeks ago, but after updating all nodes, the Nvidia driver (RTX 4090), and Windows, I'm getting OOM too; not enough VRAM anymore? ComfyUI is still the same version, v0.3.66. Is 1.3.6 working best? (I remember trying a version older than 1.3.8, with no luck.)

villevk avatar Nov 06 '25 18:11 villevk

Two things:

After any update that changes model code, torch.compile caches become invalid and there will be new recompiles. Currently there seems to be a bug, possibly Windows-specific, where these recompiles increase VRAM usage; I don't see this on Linux at all. Just queueing again until there are no more recompiles brings memory use back to normal. Recompiles mostly still happen when using unmerged LoRAs, so merging your LoRAs may help avoid the issue.
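If you want to confirm that recompiles are what you're seeing, here is a minimal sketch (assuming PyTorch 2.x with torch.compile) for making them visible, so you can tell when repeated queueing has stopped triggering new compiles:

```python
# Minimal sketch, assuming PyTorch 2.x: surface torch.compile recompile events
# so you can tell when repeated queueing has stopped triggering new compiles
# (and the VRAM spikes that come with them).
import torch

# Log a message every time Dynamo has to recompile a graph, e.g. because an
# unmerged LoRA changed the traced code paths.
torch._logging.set_logs(recompiles=True)
```

Equivalently, setting the environment variable `TORCH_LOGS=recompiles` before launching ComfyUI prints the same information without touching any code.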

If the compile issue persists (and in other cases as well), it can be helpful to clear the Triton caches.

To clear your Triton cache, delete the contents of the following (default) folders (a small helper script is sketched after them):

C:\Users\<username>\.triton

C:\Users\<username>\AppData\Local\Temp\torchinductor_<username>
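If you prefer not to delete them by hand, here is a minimal helper sketch, assuming the default Windows cache locations above. Run it with ComfyUI closed; the caches are simply rebuilt on the next compile.

```python
# Minimal sketch: clear the default Triton / torchinductor caches.
# Assumes the default Windows locations listed above; adjust the paths if you
# have relocated TEMP or the Triton cache.
import getpass
import shutil
import tempfile
from pathlib import Path

user = getpass.getuser()
cache_dirs = [
    Path.home() / ".triton",                                # Triton cache
    Path(tempfile.gettempdir()) / f"torchinductor_{user}",  # inductor cache
]

for cache_dir in cache_dirs:
    if cache_dir.exists():
        shutil.rmtree(cache_dir, ignore_errors=True)
        print(f"Cleared {cache_dir}")
    else:
        print(f"Not found: {cache_dir}")
```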

Another change, a more deliberate one, that may cause old workflows to OOM if they were tightly tuned to the available VRAM is also related to unmerged LoRAs: previously they were always fully offloaded (inefficient), but now they are part of the blocks themselves and obey block_swap instead. In practice this means you may have to swap 1-2 more blocks, depending on how many LoRAs you use and how big they are. Using lots of unmerged LoRAs is always going to be somewhat troublesome, though, especially if they are huge ones.
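As a rough back-of-the-envelope for the block_swap point above (all numbers here are made-up placeholders, not measurements from the wrapper):

```python
# Rough illustrative sketch, NOT measured values: estimate how many extra
# blocks to swap now that unmerged LoRA weights live inside the transformer
# blocks and follow block_swap instead of being fully offloaded.
import math

block_size_gb = 0.35   # assumed size of one transformer block at fp8
lora_total_gb = 0.6    # assumed combined size of the unmerged LoRAs

# The LoRA weights make each resident block a bit heavier, so to free the
# same amount of VRAM you swap roughly this many additional blocks:
extra_blocks = math.ceil(lora_total_gb / block_size_gb)
print(f"swap about {extra_blocks} more block(s)")  # -> 2 with these numbers
```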

kijai avatar Nov 06 '25 19:11 kijai

clearing cache helped, thanks

firsack avatar Nov 06 '25 22:11 firsack

After the update, it causes an OOM (Out of Memory) error on Ubuntu 24.04.1 LTS, RTX 3090, CUDA 12.8. The workflow is wan_animate_v2 and the video size is 480x832. It worked fine before the update.

peter4431 avatar Nov 07 '25 08:11 peter4431

Working now too! What I did: cleared the Triton cache, installed an older Nvidia driver (576.80, though I don't think this was the key), and updated all custom nodes in the wanvideo_WanAnimate_preprocess_example_02 workflow to nightly. Another workflow from Civitai still isn't working (it worked before, but it uses more LoRAs). Thanks, Kijai!

villevk avatar Nov 08 '25 13:11 villevk

> clearing cache helped, thanks

OK, at first I thought clearing the cache helped with 1.3.8. Besides clearing the cache I also updated the Nvidia drivers, and after that 1.3.8 worked just as well as 1.3.6. Then I updated to 1.3.9 and memory management was bad again. Now, no matter what I do (cache clearing, driver reinstall), I can't get 1.3.9 to work as well as 1.3.6, AND 1.3.8 is broken yet again!

staying on 1.3.6 - works great no matter what!

firsack avatar Nov 13 '25 01:11 firsack