StreamingT2V

How to run inference on multiple GPUs

Open · GallonDeng opened this issue 5 months ago · 1 comment

How can I run inference on multiple GPUs, such as RTX 4090s, since the model needs much more than 24 GB of VRAM?

GallonDeng · Aug 31 '24 10:08
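
One common pattern when a single 24 GB card cannot hold a whole pipeline is manual model parallelism: place different sub-models on different GPUs and move the activations between them by hand. The sketch below shows that pattern in plain PyTorch with placeholder `nn.Linear` stages; the stage names are assumptions for illustration, not the actual StreamingT2V components or API.

```python
# Minimal, self-contained sketch of splitting a two-stage pipeline across GPUs.
# stage_a / stage_b are placeholders, NOT StreamingT2V classes.
import torch
import torch.nn as nn

dev0, dev1 = torch.device("cuda:0"), torch.device("cuda:1")

# Stand-ins for the pipeline's sub-models (e.g. a text encoder vs. a video UNet).
stage_a = nn.Linear(1024, 1024).to(dev0)   # smaller component on GPU 0
stage_b = nn.Linear(1024, 1024).to(dev1)   # larger component on GPU 1

with torch.inference_mode():
    x = torch.randn(1, 1024, device=dev0)
    h = stage_a(x)     # runs on cuda:0
    h = h.to(dev1)     # hand the activations across to the second GPU
    out = stage_b(h)   # runs on cuda:1

print(out.device)  # cuda:1
```

If the repository's pipeline exposes a diffusers-style interface, CPU offloading (e.g. `pipe.enable_model_cpu_offload()`) is another way to fit inference on a single 24 GB card, at the cost of slower generation; whether that applies here depends on how the StreamingT2V inference script loads its models.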