
Mac Pro 2019 (memory usage?)

Open Yzord opened this issue 1 year ago • 1 comment

I have a Mac Pro 2019 with a 28-core Xeon and 256 GB of RAM, and I would like to install LlamaGPT on it. However, I only see a docker compose option for x86 CPUs. Does this mean it is not going to use my memory?

I am sure the 28c/56t Xeon will do the job, but is there a way to make use of my memory or even my 6900XT?

Yzord avatar Aug 22 '23 13:08 Yzord

Hello

Your CPU is compatible with the docker image, so it should work fine, and yes, the container will use your RAM.

However, it won't use the GPU.
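
If you want to verify that the container is actually using your system memory, a quick sketch using standard Docker commands (the exact container name depends on how the compose project names it, so check with `docker compose ps` first):

```sh
# List the containers started by the compose file to find their exact names
docker compose ps

# Show live CPU and memory usage per container; the llama-gpt API container's
# memory column should grow as the model is loaded into RAM
docker stats
```

You should see the memory usage climb well into the gigabytes once the model finishes loading, which confirms the container is using host RAM rather than being constrained by the image type.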

redoules avatar Aug 24 '23 04:08 redoules