private-chatbot-mpt30b-langchain
Testing on a MacBook Pro
Thanks for sharing the model. I have been able to test it on my MacBook Pro (i9, 32 GB of RAM). I notice that the CPU goes to about 400% while inferring the answer, while the GPU stays at 0%. Is it possible to make the model use the GPU? It is a Radeon Pro Vega 20 with 4 GB of VRAM.
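
A minimal sketch of what I tried, assuming the chatbot loads the GGML model through ctransformers (here via langchain's CTransformers wrapper) and that the ctransformers build supports GPU offload. Whether any layers actually offload depends on the backend: ctransformers' GPU support targets CUDA and Metal, and MPT-type models on an AMD Radeon under macOS may simply fall back to CPU regardless of this setting. The model path and layer count below are placeholders, not values from the repo.

```python
# Hypothetical configuration sketch: ask ctransformers to offload layers to the GPU.
# gpu_layers is only honoured when the underlying build/backend supports it;
# on unsupported hardware the model keeps running entirely on the CPU.
from langchain.llms import CTransformers

llm = CTransformers(
    model="models/mpt-30b-chat.ggml.bin",  # placeholder path to the GGML file
    model_type="mpt",
    config={
        "gpu_layers": 24,        # number of layers to try to offload to the GPU
        "max_new_tokens": 256,
        "temperature": 0.1,
    },
)

print(llm("Hello, which device are you running on?"))
```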