
An idea to allow more users to use this locally

Open lelapin123 opened this issue 2 years ago • 3 comments

What about running Vicuna in CPU mode (via llama.cpp, which doesn't use VRAM) and using the GPU for everything that isn't Vicuna-related?
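The idea above could be sketched as a simple device-routing rule: the language model is served by llama.cpp out of system RAM, while the rest of the pipeline (e.g. the vision encoder) stays on the GPU when one is available. This is a minimal sketch of that routing logic only; the component names ("llm", "vision") are hypothetical placeholders, and actually wiring it to llama.cpp and a vision model is left out.

```python
def pick_device(component: str, cuda_available: bool) -> str:
    """Route the LLM to CPU (llama.cpp keeps it in RAM),
    everything else to the GPU if one is available.

    Hypothetical helper illustrating the hybrid split proposed
    in this thread; not part of MiniGPT-4 itself.
    """
    if component == "llm":
        return "cpu"  # quantized LLM runs in system RAM via llama.cpp
    return "cuda" if cuda_available else "cpu"


# Example routing for a MiniGPT-4-style pipeline:
assignments = {name: pick_device(name, cuda_available=True)
               for name in ("llm", "vision")}
```

With a GPU present, this assigns `{"llm": "cpu", "vision": "cuda"}`; without one, everything falls back to CPU.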

lelapin123 avatar Apr 21 '23 05:04 lelapin123

It would be super slow. I tested Stable Diffusion on my 48-thread CPU and it was about 100x slower than my A4000 GPU: a simple prompt that takes a couple of minutes on the GPU would take hours on the CPU. There is a smaller version which can run on 12 GB of VRAM.

Korner83 avatar Apr 21 '23 06:04 Korner83

Alpaca Electron is very decent with 32 GB of RAM (not VRAM).

Congrats on your A4000 at $10000/unit

lelapin123 avatar Apr 21 '23 06:04 lelapin123

The A4000 is cheap and costs around $900 on Amazon.

I already sold it and bought a 4090, which is 4 times faster.

Korner83 avatar Apr 21 '23 06:04 Korner83