llama-gpt
Mac Pro 2019 (memory usage?)
I have a Mac Pro 2019 with a 28-core Xeon and 256GB of RAM, and I would like to install LlamaGPT on it. However, I only see the docker compose option for x86 CPUs. Does this mean it is not going to use my memory?
I'm sure the 28c/56t Xeon will do the job, but is there a way to make use of my memory, or even my 6900 XT?
Hello
Your CPU is compatible with the docker image. It should work fine, and yes, the container will use your RAM.
However, it won't use the GPU.
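If you want to verify or cap how much RAM the container can use, standard Docker Compose resource limits apply. The snippet below is only a sketch: the service name `llama-gpt-api` is a placeholder, so check the project's actual docker-compose.yml for the real service names before adding an override like this.

```yaml
# docker-compose.override.yml (illustrative only)
# Assumes a service named "llama-gpt-api" -- adjust to match the
# service names in llama-gpt's own docker-compose.yml.
services:
  llama-gpt-api:
    deploy:
      resources:
        limits:
          memory: 200G   # cap the container at 200 GB of the host's 256 GB
```

You can also run `docker stats` while a model is loaded to watch the container's actual memory consumption.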