visual-chatgpt

Share GPU assumptions

sradc opened this issue 2 years ago · 2 comments

Hey, it looks like this assumes that there are 8 GPUs available. Are you able to provide a bit more info about that? (I.e. which GPUs do you run this on, and which would you recommend?)

(Maybe worth adding some info on this in the readme?)

sradc — Mar 10, 2023

Anything goes, so long as you have 70 GB of VRAM... lol

I found this fork to be super useful: it removes a lot of the models that would otherwise just give you an OOM, but you can still use some of them while chatting with ChatGPT and having it create images for you:

https://github.com/rupeshs/visual-chatgpt/tree/add-colab-support
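To sketch why trimming models helps (purely illustrative, not the repo's or the fork's actual API): each tool loads its own foundation model onto a CUDA device, so dropping tools until the rest fit in your free VRAM is roughly the idea. The tool names and VRAM figures below are assumptions.

```python
# Purely illustrative sketch (not the repo's actual API): keep only the subset
# of tools that fits in the free VRAM on one card, skipping the rest.
import torch

# Rough per-tool VRAM needs in GiB -- assumed numbers for illustration only.
TOOL_VRAM_GIB = {
    "ImageCaptioning": 4,
    "VisualQuestionAnswering": 6,
    "Text2Image": 8,
    "InstructPix2Pix": 8,
    "ImageEditing": 12,
}

def tools_that_fit(device: str = "cuda:0") -> list[str]:
    """Greedily keep tools while they fit in the device's currently free VRAM."""
    free_bytes, _total_bytes = torch.cuda.mem_get_info(torch.device(device))
    budget_gib = free_bytes / 1024**3
    kept = []
    for name, need_gib in TOOL_VRAM_GIB.items():
        if need_gib <= budget_gib:
            kept.append(name)
            budget_gib -= need_gib
    return kept

if __name__ == "__main__":
    print("Loadable tools:", tools_that_fit("cuda:0"))
```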

It's good for toying around with this proof-of-concept, but you either need to pay for some cloud compute deluxe or be living in a small server room to enjoy the whole thing.

Good thing they said there will be an API soon, "in a few days". Together with GPT-4 rumors turning into "next week", I guess I'll settle for playing with what fits in my VRAM and then try to get my hands on the API. :-)

b2zer — Mar 11, 2023

I did get this running in the end, on 8x NVIDIA A100 40 GB, but various bugs prevented it from fully working (for one, the masking for inpainting wasn't working; not sure if the fork uses/fixes this model?).

> Anything goes, so long as you have 70 GB of VRAM...

Not quite: I tried running it on 8x NVIDIA Tesla V100 (16 GB each) but got an OOM on one of the cards when trying to generate an image. I.e. each card needs to be big enough for the models allocated to it.

...Looking forward to the multimodal APIs coming soon, as you say.
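To make the "each card needs to be big enough" point concrete, here's a small pre-flight check one could run before launching; it just reports free memory per GPU against a planned per-device budget. The mapping below is a placeholder assumption, not visual-chatgpt's actual device assignment.

```python
# Hedged pre-flight check: report free VRAM per GPU so you can see whether each
# card has headroom for the model(s) you plan to put on it. The mapping below
# is a placeholder, not visual-chatgpt's actual device assignment.
import torch

PLANNED_GIB = {0: 14, 1: 14, 2: 10}  # assumed per-device requirements

for idx in range(torch.cuda.device_count()):
    free, total = torch.cuda.mem_get_info(idx)
    free_gib, total_gib = free / 1024**3, total / 1024**3
    need_gib = PLANNED_GIB.get(idx, 0)
    verdict = "ok" if free_gib >= need_gib else "likely OOM"
    print(f"cuda:{idx}: {free_gib:.1f}/{total_gib:.1f} GiB free, "
          f"need ~{need_gib} GiB -> {verdict}")
```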

sradc — Mar 11, 2023

Looks like this info is now in the readme.

sradc — Mar 18, 2023