Gazenvagin
```
Total VRAM 23028 MB, total RAM 65228 MB
pytorch version: 2.9.1+cu128
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
Enabled pinned memory 29352.0
working...
```
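For reference, the figures in that startup log can be reproduced with plain PyTorch; this is only a minimal sketch (assuming `psutil` is available for system RAM), not ComfyUI's actual startup code:

```python
import torch
import psutil  # assumption: used here only to read total system RAM

# Total VRAM of cuda:0 and total system RAM, in MB, as in the log above
total_vram_mb = torch.cuda.get_device_properties(0).total_memory / (1024 * 1024)
total_ram_mb = psutil.virtual_memory().total / (1024 * 1024)
print(f"Total VRAM {total_vram_mb:.0f} MB, total RAM {total_ram_mb:.0f} MB")

# PyTorch build and device name, e.g. "2.9.1+cu128" / "NVIDIA GeForce RTX 4090"
print(f"pytorch version: {torch.__version__}")
print(f"Device: cuda:0 {torch.cuda.get_device_name(0)}")

# "Enabled pinned memory" refers to page-locked host memory, which speeds up
# host-to-GPU transfers; a tensor can be allocated pinned like this:
staging = torch.empty(1024, pin_memory=True)
print("pinned:", staging.is_pinned())
```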
> Someone should post a sample workflow so devs can test and see if that problem persists.

[Wan 2.2 Animate (Rutube).json](https://github.com/user-attachments/files/23763323/Wan.2.2.Animate.Rutube.json)
[WAN 2.2-I2V-(Kijai Wrapper).json](https://github.com/user-attachments/files/23763326/WAN.2.2-I2V-.Kijai.Wrapper.json)