[bug] Failed to generate AI response.
Describe the bug
I successfully ran llama3.2:3b-direct-q4_K_M with Ollama on a Windows system, but I encounter this error whenever I use the AI feature.
Expected behavior
Receive an AI response.

System info
- OS: Windows 10
- screenpipe version: 0.2.4
@blackcat-meow can you right-click → Inspect in the background of the app and share the logs?
@louis030195 The model I am using is llama3.2:3b-direct-q4_K_M, and this is the error I encountered:
Normal conversations with the model work fine in cmd:
This is my test result in Postman:
Did you set the environment variable?
This is because Ollama blocks requests from Tauri: by default, the Ollama server rejects cross-origin requests from origins it doesn't recognize, and the Tauri webview's origin isn't on its allowlist. That's why the same model responds normally in cmd and Postman but fails inside the app.
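A common workaround (a sketch based on Ollama's documented `OLLAMA_ORIGINS` variable, not confirmed as the fix in this thread) is to add the app's origin to Ollama's allowlist and then restart the Ollama server:

```shell
# Allow all origins. "*" is the bluntest setting; a narrower value
# listing only the app's origin is safer if you know it.
# Linux/macOS shells:
export OLLAMA_ORIGINS="*"

# Windows cmd (persists for newly started processes; restart Ollama afterwards):
#   setx OLLAMA_ORIGINS "*"

# Windows PowerShell (current session only):
#   $env:OLLAMA_ORIGINS="*"
#   ollama serve
```

After setting the variable, fully quit and relaunch Ollama so the server picks up the new allowlist.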