Open-Sora
May I inquire whether the 4090 can participate in this project? For instance, in inference tasks?
I have the same problem
me too
me too, can it run inference on a 4090?
I'm using a single 3090; it can run 16x256x256 and 16x512x512 with batch size set to 1.
can you share your environments config?
I'm using a single 3090; it can run 16x256x256.
@tanghaom 16x512x512 doesn't seem to work, does it? Setting num_frames to 4 does run, but the results are terrible.
Just run the VAE step separately; 24 GB of VRAM is enough for the generation part. You can split the pipeline up: run the generation model on one GPU and the VAE on another; or run the VAE on the CPU; or unload the generation model and run the VAE in a separate pass.
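The split described above can be sketched roughly like this. Note the model classes and function names here are illustrative stand-ins, not Open-Sora's real API (the actual project uses an STDiT generator and a video VAE); the point is the two-stage device placement:

```python
import torch
import torch.nn as nn

# Illustrative stand-ins for the generation model and the VAE.
class DummyGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(8, 8)
    def forward(self, z):
        return self.proj(z)

class DummyVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.dec = nn.Linear(8, 16)
    def decode(self, latents):
        return self.dec(latents)

def generate_latents(model, noise, device):
    """Stage 1: run generation on one device, then move the model off
    and release VRAM before VAE decoding starts."""
    model.to(device)
    with torch.no_grad():
        latents = model(noise.to(device)).cpu()
    model.to("cpu")
    if torch.cuda.is_available():
        torch.cuda.empty_cache()  # free VRAM for the next stage
    return latents

def decode_latents(vae, latents, device):
    """Stage 2: decode the latents on a second device
    (another GPU, or the CPU as suggested above)."""
    vae.to(device)
    with torch.no_grad():
        return vae.decode(latents.to(device)).cpu()

gen_device = "cuda:0" if torch.cuda.is_available() else "cpu"
vae_device = "cpu"  # or e.g. "cuda:1" if a second card is available

latents = generate_latents(DummyGenerator(), torch.randn(1, 8), gen_device)
video = decode_latents(DummyVAE(), latents, vae_device)
print(video.shape)  # torch.Size([1, 16])
```

Because the latents are moved to CPU between the stages, the generator's weights never need to coexist in VRAM with the VAE, which is what makes 24 GB enough for each stage on its own.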
https://github.com/chaojie/ComfyUI-Open-Sora — you can use this project of mine; a 4090 can run 16x512x512.