AGiXT
LlamaCPP Models Nonfunctional
Hi there,
I tried three different models (vicuna, gpt4all, and gpt4all-j), and all of them fail to load. The vicuna one fails with a strange tensor error, and for the gpt4all ones the loader wants to convert them. Do you have a working config for a llamacpp model?
Thanks
Hello! Our llamacpp integration is the gpt4all Python module. Any model that works directly in gpt4all should also work directly in Agent-LLM.
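If it helps, here is a minimal sketch for checking that a model file loads with the gpt4all Python package on its own, outside Agent-LLM. It assumes a recent gpt4all release; the exact constructor arguments and the model filename may differ in your setup.

```python
# Minimal sketch: verify a model loads with the gpt4all Python package,
# independent of Agent-LLM. Assumes a recent gpt4all release; adjust the
# filename/path to match the model file you actually have.
from gpt4all import GPT4All

model = GPT4All("ggml-gpt4all-l13b-snoozy.bin")  # example model filename
print(model.generate("Hello, how are you?", max_tokens=32))
```

If that snippet fails with the same tensor or conversion error, the issue is with the model file or gpt4all version rather than Agent-LLM itself.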
You can try ggml-gpt4all-l13b-snoozy. I can't promise it will perform well, but the model at least loads.
Could you please paste the exact "strange tensor error", the conversion prompt you're seeing, and your Llama-related env settings? (See the sketch below for one way to dump them.)
Are you running in a local, remote or containerized environment?
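To make that last request concrete, here is a hypothetical sketch for printing the relevant settings. The key names below are placeholders and not a definitive list of Agent-LLM variables; substitute whichever Llama-related entries your .env actually defines.

```python
# Hypothetical helper for sharing Llama-related environment settings.
# The key names are assumptions, not Agent-LLM's canonical variable list;
# replace them with whatever your .env file defines.
import os

for key in ("AI_PROVIDER", "AI_MODEL", "MODEL_PATH"):
    print(f"{key}={os.environ.get(key, '<unset>')}")
```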