jukebox
Would the 5b_lyrics model run on an RTX 3090?
The RTX 3090 has 24 GB of memory. Would that be enough to run the 5b model?
I don't think it will - not even two cards will help. It seems to be in the realm of the RTX 8000 exclusively; happy to be corrected here. https://github.com/openai/jukebox/issues/142
It runs (inference) on my 3090 without issue, haven't tried training though.
Keep in mind PyTorch hasn't been updated for the new CUDA version yet, so you have to compile it yourself.
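A quick way to check whether a given PyTorch build actually supports the 3090 (and so whether a source build is needed) is to compare the card's compute capability against the architectures the wheel was compiled for. This is a generic PyTorch sanity check, not a Jukebox-specific step; the pre-Ampere CUDA 10.x wheels lack `sm_86` kernels, which is what forces compiling from source.

```python
import torch

# Print the installed PyTorch build and the CUDA version it was built against.
print("PyTorch:", torch.__version__)
print("Built with CUDA:", torch.version.cuda)

if torch.cuda.is_available():
    # The RTX 3090 is Ampere: compute capability 8.6 -> "sm_86".
    major, minor = torch.cuda.get_device_capability(0)
    arch = f"sm_{major}{minor}"
    supported = torch.cuda.get_arch_list()  # e.g. ['sm_37', ..., 'sm_86']
    print(f"Device arch: {arch}; build supports: {supported}")
    if arch not in supported:
        print("This PyTorch build lacks kernels for your GPU - "
              "compile from source or install a newer CUDA wheel.")
```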
Good to know, thanks for sharing! And please let me know if training works, if you attempt it.
Also worth mentioning: my system memory (32 GB) can just barely handle it. During init it fills up completely and has to swap about 2 GB, but once everything is loaded that drops to about 20 GB of usage.
One more question @AeroScripts - how fast is it? For example, how long does it take to generate 20 seconds of music?
I can say that on the Google Colab version it takes about a minute of compute per second of audio when not upsampled, and about 3-5 hours to fully upsample 60 seconds for one file.
The RTX 3090 is still really good, but it won't be as fast as the GPUs Google is using.
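To answer the 20-second question using the Colab figures quoted above (roughly 1 minute per second of non-upsampled audio, and 3-5 hours to fully upsample 60 seconds), here is a small back-of-the-envelope estimator. The constants are just the thread's reported numbers, not benchmarks, and will vary by GPU.

```python
# Reported Colab figures from this thread (rough, not benchmarks):
SAMPLING_MIN_PER_AUDIO_SEC = 1.0      # top-level sampling: ~1 min per audio second
UPSAMPLE_HOURS_PER_60_SEC = (3.0, 5.0)  # full upsampling of a 60 s clip: ~3-5 h

def estimate_minutes(audio_seconds: float):
    """Return (sampling_min, upsample_min_low, upsample_min_high)."""
    sampling = audio_seconds * SAMPLING_MIN_PER_AUDIO_SEC
    # Scale the 3-5 h / 60 s figure linearly to the requested length.
    low = UPSAMPLE_HOURS_PER_60_SEC[0] * 60.0 * audio_seconds / 60.0
    high = UPSAMPLE_HOURS_PER_60_SEC[1] * 60.0 * audio_seconds / 60.0
    return sampling, low, high

# For the 20-second clip asked about earlier:
s, lo, hi = estimate_minutes(20)
print(f"~{s:.0f} min to sample, ~{lo:.0f}-{hi:.0f} min to upsample")
# → ~20 min to sample, ~60-100 min to upsample
```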
@Randy1435 @AeroScripts Have you managed to train a Small Prior using the RTX 3090?
I have been trying, but I get a cuDNN error (not an OOM). I'd really appreciate it if you could pass on a fix. Thanks!
Any update on this?