
How to run inference on multiple GPUs

Open GallonDeng opened this issue 1 year ago • 1 comments

How can I run inference on multiple GPUs, such as the RTX 4090, since the model needs much more than 24 GB of VRAM?

GallonDeng avatar Aug 31 '24 10:08 GallonDeng

Hi @AllenDun, thank you for your interest in our project.

There is currently no multi-GPU implementation. We are working on reducing the memory requirements.

rob-hen avatar Aug 31 '24 12:08 rob-hen
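
For context on what a multi-GPU implementation would involve: a common approach to fitting a large model across several 24 GB cards is naive model parallelism, where contiguous groups of layers are placed on different devices. This is a generic sketch of the partitioning step only, not StreamingT2V's code; the helper name is hypothetical:

```python
def partition_layers(num_layers, num_gpus):
    """Split layer indices into contiguous, near-equal groups,
    one group per GPU (naive model parallelism).
    Hypothetical helper for illustration; not part of StreamingT2V."""
    base, extra = divmod(num_layers, num_gpus)
    groups, start = [], 0
    for gpu in range(num_gpus):
        # Earlier GPUs absorb the remainder so group sizes differ by at most 1.
        size = base + (1 if gpu < extra else 0)
        groups.append(list(range(start, start + size)))
        start += size
    return groups

# e.g. 10 transformer blocks over 2 GPUs:
# partition_layers(10, 2) -> [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]]
```

In a framework such as PyTorch, each group would then be moved to its device (e.g. `cuda:0`, `cuda:1`) and activations transferred between devices at the group boundaries during the forward pass.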