extended_openai_conversation
Running LocalAI, but entities don't respond
I am running LocalAI on my Mac M2. I built the project locally and ran the following model: https://huggingface.co/TheBloke/Luna-AI-Llama2-Uncensored-GGUF (model version: luna-ai-llama2-uncensored.Q4_K_M.gguf).
I used luna-ai-llama2-uncensored.Q4_K_M.gguf.tmpl as the template file for the model, i.e.
USER: {{.input}}
ASSISTANT:
placed inside the model folder so that LocalAI can serve this model.
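Before wiring in Home Assistant, it can help to confirm the model actually answers through LocalAI's OpenAI-compatible endpoint. A minimal sanity-check sketch using the pre-1.0 openai Python client, assuming LocalAI is listening on its default port 8080 (adjust the base URL if yours differs):

```python
# Minimal sanity check of the model through LocalAI's OpenAI-compatible
# endpoint, using the pre-1.0 openai Python client.
import openai

openai.api_base = "http://localhost:8080/v1"  # assumes LocalAI's default port
openai.api_key = "sk-unused"  # LocalAI does not check the key by default

response = openai.ChatCompletion.create(
    model="luna-ai-llama2-uncensored.Q4_K_M.gguf",
    messages=[{"role": "user", "content": "What lights are in the house?"}],
)
print(response.choices[0].message.content)
```

If this already fails or hangs, the problem is on the LocalAI side rather than in the integration.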
Connection with Home Assistant
I connected the API using your integration and did not change anything in the prompt template.
I just used the default prompt template and only specified the model name luna-ai-llama2-uncensored.Q4_K_M.gguf.
I created the pipeline and used Assist to ask what lights are in the house. This is what it says:
What am I doing wrong? Is the model that I used wrong?
The same prompt works fine with the OpenAI API.
This is the output from the OpenAI API:
It is working properly with the OpenAI API.
Can you tell me if the model or some parameter that I am using is wrong? Any direction is really appreciated.
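For what it's worth, the integration drives entities through OpenAI-style function calling, so one way to narrow this down is to check whether the local model ever emits a function_call at all. A sketch using the pre-1.0 openai client, assuming your LocalAI build supports the `functions` field; the `execute_services` schema below is a simplified stand-in for illustration, not the integration's exact definition:

```python
# Probe whether the locally served model returns OpenAI-style function
# calls. The function schema is a simplified, hypothetical example.
import openai

openai.api_base = "http://localhost:8080/v1"  # assumes LocalAI's default port
openai.api_key = "sk-unused"

response = openai.ChatCompletion.create(
    model="luna-ai-llama2-uncensored.Q4_K_M.gguf",
    messages=[{"role": "user", "content": "Turn on the kitchen light."}],
    functions=[{
        "name": "execute_services",
        "description": "Call a Home Assistant service on an entity",
        "parameters": {
            "type": "object",
            "properties": {
                "domain": {"type": "string"},
                "service": {"type": "string"},
                "entity_id": {"type": "string"},
            },
            "required": ["domain", "service"],
        },
    }],
)
message = response.choices[0].message
# gpt-3.5/gpt-4 reliably fill message.function_call here; many plain
# llama2 chat models answer in prose instead, which would explain why
# the same prompt works against the OpenAI API but not locally.
print(message.get("function_call") or message.get("content"))
```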
I don't think the model you are using knows how to use Home Assistant. I have been trying acon96/Home-3B-v2-GGUF; he also has his own Home Assistant integration: https://github.com/acon96/home-llm
@markmghali I tried his plugin and model as well, but didn't get much luck; it was not very reliable. How is your experience with it?
I had it working pretty well! But then I broke it in the process of messing around with it, and I need to figure out what I did wrong. I also don't have a GPU, which slows me down.
I mentioned this: https://github.com/jekalmin/extended_openai_conversation/issues/103#issuecomment-1928104088