Validation is very slow during LoRA fine-tuning
I noticed that i2v inference takes more than 20 minutes during validation, but when I run cli_demo.py it only needs 6 minutes. I don't think that's normal; is there any solution for this?
The inference time depends on the model you are using, the GPU, and the resolution. Please provide these details so that I can give a better answer.
Hi @zRzRzRzRzRzRzR, thanks for your reply. The model I used is CogVideoX-5B, the resolution is 49x480x720 as recommended, and the GPU is an H100. However, I don't think this has anything to do with my settings.
During validation in training, inference may take longer because ZeRO-3 is used: the model parameters are sharded across ranks, so each layer's weights have to be gathered over the interconnect before its forward pass, which adds communication overhead that plain single-process inference with cli_demo.py does not pay.
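To make that overhead concrete, here is a toy, framework-free sketch (this is not DeepSpeed code; `NUM_RANKS`, `GATHER_LATENCY_S`, and the sleep-based "all-gather" are all illustrative stand-ins) of why ZeRO-3-style sharding slows down a forward pass: validation pays a per-layer gather cost on every step that ordinary inference with resident weights does not.

```python
import time

NUM_LAYERS = 4         # hypothetical layer count
NUM_RANKS = 8          # hypothetical number of GPUs holding one shard each
GATHER_LATENCY_S = 0.001  # stand-in for per-shard all-gather communication cost


def forward_dense(layers):
    """Plain inference: weights are already resident on the device,
    so there is no communication before each layer's forward pass."""
    for _ in layers:
        pass  # compute only


def forward_zero3(layers):
    """ZeRO-3-style inference: each layer's weights are sharded across
    ranks and must be gathered (simulated here as a sleep) before use,
    then released again to keep per-GPU memory low."""
    for _ in layers:
        time.sleep(GATHER_LATENCY_S * NUM_RANKS)  # simulated all-gather
        # compute with the gathered weights, then free the full copy


layers = list(range(NUM_LAYERS))

t0 = time.perf_counter()
forward_dense(layers)
dense = time.perf_counter() - t0

t0 = time.perf_counter()
forward_zero3(layers)
zero3 = time.perf_counter() - t0

print(zero3 > dense)  # the sharded path is strictly slower
```

The gap grows with the number of layers and ranks, which is consistent with validation inside a ZeRO-3 training run being several times slower than running the same model standalone.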