[Bug]: Ollama not working: object has no attribute 'chat'
What happened?
I ran cat t.txt | fabric --model ollama:3-8B --pattern summarize and got this error:
'NoneType' object has no attribute 'chat'
I was expecting Ollama to work with fabric on my MacBook so that I don't have to pay for the OpenAI API.
Version check
- [X] Yes I was.
Relevant log output
No response
Relevant screenshots (optional)
No response
This can be resolved by: export OPENAI_API_KEY="NULL"
@krishneshg this didn't resolve the issue. I also tried adding permanent params to the fabric config file, but I'm still getting either the error above or a 401 message.
The issue here is with your ollama setup. I would reinstall it and/or make sure it's up and running.
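A quick way to check that a local Ollama server is actually up (assuming the default port from the Ollama docs; adjust if yours differs):
# Ollama answers on its root path when the server is running
curl http://127.0.0.1:11434
# expected output: Ollama is running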
I actually got the first error mentioned above while trying to use my self-hosted Ollama, which works perfectly with other tools, so a configuration/installation error can basically be ruled out.
After adding the suggested env-vars, the None error went away, but now fabric fails differently saying:
Error: Client error '404 Not Found' for url 'https://ollama.xxx.com/v1/models'
There is no /v1/models endpoint on my server, and looking at the docs only confirms my assumption.
Did I miss something crucial?
Same for me. Trying to connect to a remote Ollama server, I get: Error: Client error '404 Not Found' for url https://myserver/v1/models
Could it be related to this issue? It seems the Ollama API is not fully compatible with the OpenAI API spec: https://github.com/ollama/ollama/issues/2430
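For anyone debugging the 404, it may help to see what the server actually exposes. This is only a sketch against the placeholder hostname from the comment above: /api/tags is Ollama's native model-listing endpoint, while /v1/... is the OpenAI-compatible layer, which may not implement every route on older builds.
# native Ollama API: lists locally pulled models, returns JSON if the server is reachable
curl https://ollama.xxx.com/api/tags
# OpenAI-compatible layer: this is the route fabric hits; a 404 here matches the error above
curl https://ollama.xxx.com/v1/models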
Same here. Ollama is not the issue.
Got it working.
- vim ~/.config/fabric/.env
- add the following (replacing the previous content, which had just # No API key set):
OPENAI_API_KEY="NULL"
OPENAI_BASE_URL=https://127.0.0.1:11434/v1/
CLAUDE_API_KEY="NULL"
GOOGLE_API_KEY="NULL"
It should work now. The last change that made it work for me was making sure OPENAI_BASE_URL used https, not http, even though it is localhost...
Make sure to unset your local variables in case you exported them in one of the earlier steps, so that fabric uses the .env file.
e.g. unset OPENAI_BASE_URL
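To tie these steps together, a minimal verification sequence (the model name is taken from the --listmodels output later in this thread; substitute your own):
# clear anything exported earlier so fabric falls back to ~/.config/fabric/.env
unset OPENAI_API_KEY OPENAI_BASE_URL
# both should now print empty lines
echo "$OPENAI_API_KEY"
echo "$OPENAI_BASE_URL"
# re-run the original command with a model name exactly as shown by fabric --listmodels
cat t.txt | fabric --model llama3:latest --pattern summarize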
After reading the code: https://github.com/danielmiessler/fabric/blob/1dd3bbfdf372ae63c4f4a33f7428595b8428f85a/installer/client/cli/utils.py#L43-L54
So:
- Make sure OPENAI_API_KEY and OPENAI_BASE_URL are not exported. echo $OPENAI_API_KEY and echo $OPENAI_BASE_URL should both return an empty string.
  - If either returns a value, check the /etc/environment file and the /etc/profile.d folder.
- Make sure your ~/.config/fabric/.env doesn't contain any key that starts with OPENAI_, so fabric uses the default Ollama URL.
- Make sure your ~/.config/fabric/.env has a DEFAULT_MODEL key (e.g. DEFAULT_MODEL=llama3:latest).
These checks are sketched as a small script below.
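Just a convenience sketch of the checks above; nothing here is part of fabric itself:
#!/usr/bin/env bash
# 1. nothing should be exported in the current environment
echo "OPENAI_API_KEY='${OPENAI_API_KEY}'"
echo "OPENAI_BASE_URL='${OPENAI_BASE_URL}'"
# 2. if either printed a value, find where it is being set system-wide
grep -R "OPENAI_" /etc/environment /etc/profile.d 2>/dev/null
# 3. the fabric .env should contain no OPENAI_ keys and should set DEFAULT_MODEL
grep "^OPENAI_" ~/.config/fabric/.env && echo "remove these OPENAI_ lines"
grep "^DEFAULT_MODEL" ~/.config/fabric/.env || echo "add e.g. DEFAULT_MODEL=llama3:latest"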
You may need to restart your shell after unsetting the env variables.
I'm using ollama just fine.
Maybe you're not using the model names as listed in the output of fabric --listmodels.
For example, I put llama3:latest, not ollama:...
fabric --listmodels
GPT Models:
Local Models:
CognitiveComputations/dolphin-2.9.2-qwen2-7b:Q2_K
deepseek-coder-v2:latest
llama3:latest
Claude Models:
Google Models:
How can I switch between the Groq API and a local Ollama LLM, so that if I run out of tokens I fall back to the local model in a user-friendly way, without changing the env every time?
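Nothing in this thread shows a built-in switch for that, so here is only a hedged shell sketch: it captures stdin once, tries a Groq-served model first, and falls back to a local Ollama model if the first fabric call exits non-zero. Both model names are placeholders (use whatever your own fabric --listmodels shows), and it assumes fabric really does exit non-zero when the Groq quota is exhausted.
# hypothetical wrapper, not a fabric feature
fabric_fallback() {
  local groq_model="mixtral-8x7b-32768"   # placeholder: a model served via your Groq key
  local local_model="llama3:latest"       # placeholder: a Local Model from fabric --listmodels
  local input
  input="$(cat)"                          # capture stdin once so both attempts see the same text
  printf '%s' "$input" | fabric --model "$groq_model" "$@" \
    || printf '%s' "$input" | fabric --model "$local_model" "$@"
}
# usage: same arguments you would normally pass to fabric
cat t.txt | fabric_fallback --pattern summarize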