[Bug]: ollama config bug
What happened?
I was doing THIS,
fabric --listmodels --remoteOllamaServer http://127.0.0.1:11434
when THAT happened.
Error: Client error '404 Not Found' for url 'http://127.0.0.1:11434/v1/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404
I was expecting http://127.0.0.1:11434/api/tags to be called instead?
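For what it's worth, hitting both endpoints directly shows the difference (assuming the default local port; the second one 404s on my ollama version):
curl http://127.0.0.1:11434/api/tags
curl http://127.0.0.1:11434/v1/models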
fabric: 1.2.0, ollama: 0.1.41
Version check
- [X] Yes I was.
Relevant log output
No response
Relevant screenshots (optional)
No response
Same error with ollama 0.1.43.
You probably need the full path. This works for me:
export OPENAI_BASE_URL=http://localhost:11434/api/
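If you don't want to export it in every shell, the same line can go in fabric's .env file (on my setup that's ~/.config/fabric/.env; adjust the path if yours differs):
echo 'OPENAI_BASE_URL=http://localhost:11434/api/' >> ~/.config/fabric/.env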
Same issue for me.
export OPENAI_BASE_URL=http://localhost:11434/api/
This works. Thank you.
Nice, I will be able to run Fabric with Ollama :) I will integrate it with https://gist.github.com/marcellodesales/7be67c13d6799628dcb6954155fbd765
I'm running the ollama podman container version 0.1.48 and getting the same error: Error: Client error '404 Not Found' for url 'http://127.0.0.1:11434/v1/models'
I'm a little confused about how others have this working with ollama 0.1.48 and below, as the /v1/models endpoint only appears to have been committed 6 days ago (https://github.com/ollama/ollama/commit/996bb1b85e0c1b3ae64246a50ea412dc2a2e30d8) and maybe shipped in the 0.1.49 RCs. How did people have this working before now?
I'm trying the suggested export OPENAI_BASE_URL=http://localhost:11434/api/ but fabric for some reason keeps requesting:
$ fabric
Error: Client error '404 Not Found' for url 'http://localhost:11434/v1/models'
I must be missing something.
@Im0 try this https://github.com/danielmiessler/fabric/issues/373#issuecomment-2178805248
I posted a fix in a different comment, but then found this one and assumed everyone was getting their problems sorted by adding /api, so I deleted it.
Anyway, here's what I discovered with netcat -l 11434:
Whenever OPENAI_API_KEY has a value assigned, fabric requests:
GET /models HTTP/1.1
But when it's commented out:
GET /api/tags
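If you want to reproduce the capture, this is roughly what I ran (assuming nothing else is listening on 11434; your netcat may need -p before the port):
nc -l 11434
fabric --listmodels    # in a second terminal, once with OPENAI_API_KEY assigned in .env and once with it commented out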
The easiest fix for me was to delete everything in the .env file except my YOUTUBE_API_KEY and to set my DEFAULT_MODEL. Basically this fixes everything. From there I just use an alias to set my remote server address (which may not be necessary if your ollama server is local):
alias fabric="fabric --remoteOllamaServer 192.0.2.1:11434"
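For reference, the whole .env afterwards is just two lines here (the values below are placeholders, use your own):
YOUTUBE_API_KEY=<your-youtube-api-key>
DEFAULT_MODEL=llama3:latest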
Thanks @0x12d3, removing everything in the .env file apart from my YT API key seems to have helped me move on to the next problem: Error: 'NoneType' object has no attribute 'chat', which I'm now looking into. Thank you for the pointer... so many other posts were suggesting "NULL". :)
(Second issue: it seems I had the wrong model; the llama3:latest model made it work, as suggested here https://github.com/danielmiessler/fabric/issues/448 )
I have ollama installed locally but always want to use the remote. Even with OPENAI_BASE_URL=http://x.x.x.x:11434/v1, fabric only lists the local models, even though it calls /models on the remote... I removed "v1" to double-check what it's requesting:
fabric --listmodels
Error: Client error '404 Not Found' for url 'http://x.x.x.x:11434/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404
When I have OLLAMA_HOST=x.x.x.x in my bashrc, it seems to work as expected: it lists the models available on the remote and ignores the local ones.
It would help, I think, if fabric could read OLLAMA_HOST from its own .env so we don't have to set it globally or per session just for fabric, but it's also worth checking why it lists local models even though it's calling the remote as per OPENAI_BASE_URL=http://x.x.x.x:11434 ... note it's http and not https.
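In the meantime, the per-session form of that workaround looks like this (x.x.x.x is a placeholder for the remote server's address):
export OLLAMA_HOST=x.x.x.x
fabric --listmodels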