Alok Saboo
Looks like this is a global variable and not model-specific. While this is a welcome change, I'm not sure it addresses the issue of being able to adjust the context...
It would be great if we could save the prompt. I am using a custom prompt to merge Beam results, and I would like to reuse it.
@momokrono I'm using Ollama
@miurla is this on the horizon? This would be a great addition.
@zaidmukaddam I am still not sure how to integrate Ollama using the ollama-ai-provider. There is no corresponding env variable.
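For what it's worth, here is a minimal sketch of how the wiring might look with the `ollama-ai-provider` package and the Vercel AI SDK. The model name (`llama3`) and the `OLLAMA_BASE_URL` variable are my own placeholders (the project doesn't define such a variable, hence passing the base URL explicitly):

```ts
// Sketch only: model name and OLLAMA_BASE_URL are placeholders, not
// something the project defines. Adjust to your local Ollama setup.
import { createOllama } from 'ollama-ai-provider';
import { generateText } from 'ai';

const ollama = createOllama({
  // Default Ollama endpoint; override via a variable you define yourself.
  baseURL: process.env.OLLAMA_BASE_URL ?? 'http://localhost:11434/api',
});

const { text } = await generateText({
  model: ollama('llama3'), // placeholder model name
  prompt: 'Say hello from Ollama',
});

console.log(text);
```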
Please paste the logs in a code block (instead of attaching them).
@IO101010 Note, though, that I am assuming this won't preserve your history.
Here's what worked for me (this retained the history):
- In your docker-compose.yml, temporarily modify the volume mount to point to a different directory (e.g., `./temp_scan:/music:ro`); see the sketch after this list.
- Run the scan/container...
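For concreteness, a sketch of that temporary volume change; the service name (`app`) and the original host path (`./music`) are assumptions, only the `/music:ro` container mount comes from the steps above:

```yaml
# docker-compose.yml (sketch) -- service name "app" and original host
# path "./music" are assumptions; adjust to your setup.
services:
  app:
    volumes:
      # - ./music:/music:ro      # original mount, disabled temporarily
      - ./temp_scan:/music:ro    # temporary mount for the targeted scan
```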
I am also seeing those errors. @OiAnthony I don't have those two environment variables added.
@joebnb Refly looks good...with Ollama and configurable API support. I hope Affine adds those soon as well.