Computation requirements to train CogVideo
Hi,
First of all, great work on developing CogVideo. Could you please share how many GPUs were used and how long it took to train the model?
Thanks, Gaurav
We used 13 × 8 A100 GPUs to train the model. The two stages were trained for ~100k iterations in total, which took ~20 days.
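(For a rough sense of scale, that works out to about 13 × 8 × 20 × 24 ≈ 50,000 A100 GPU-hours in total.)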
How much GPU memory and how much time does inference take? I have an RTX 3090 Ti, but I ran into an OOM error.
Great work! Could you please help me with the issue below? How much GPU memory is required for inference? I tried it on a 15 GB GPU, but a runtime error occurred.
It takes around 25 GB of GPU memory to run inference with batch_size=1 (on our A100).
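As a quick sanity check before launching inference, you can compare your card's total VRAM against that ~25 GB figure. This is a generic PyTorch snippet, not part of the CogVideo codebase:

```python
import torch

# ~25 GB reported above for batch_size=1 inference on an A100
REQUIRED_GIB = 25

if not torch.cuda.is_available():
    raise SystemExit("No CUDA device found.")

props = torch.cuda.get_device_properties(0)
total_gib = props.total_memory / 1024 ** 3
print(f"{props.name}: {total_gib:.1f} GiB total VRAM")

if total_gib < REQUIRED_GIB:
    print("This card is likely to hit CUDA out-of-memory during inference.")
```

A 15 GB card or a 24 GB RTX 3090 Ti falls below that threshold, which is consistent with the OOM errors reported above.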
Hi, what's the runtime error you are seeing?