
[Bug]: Ollama not working: object has no attribute 'chat'

Morphus01 opened this issue 1 year ago

What happened?

I ran cat t.txt | fabric --model ollama:3-8B --pattern summarize, and got this error:

'NoneType' object has no attribute 'chat'

I was expecting Ollama to work with fabric on my MacBook, so that I wouldn't have to pay for the OpenAI API.

Version check

  • [X] Yes I was.

Relevant log output

No response

Relevant screenshots (optional)

No response

Morphus01 · May 21 '24 18:05

This can be resolved by: export OPENAI_API_KEY="NULL"

krishneshg · May 22 '24 10:05

@krishneshg this didn't resolve the issue. I also tried adding permanent params to the fabric config file, but I'm still getting either the error above or a 401 response.

AriaShishegaran · May 23 '24 10:05

The issue here is with your ollama setup. I would reinstall it and/or make sure it's up and running.

danielmiessler · May 23 '24 17:05

I actually got the first error mentioned above while trying to use my self-hosted Ollama, which, by the way, works perfectly with other tools, so a configuration/installation error can basically be ruled out.

After adding the suggested env vars the NoneType error went away, but now fabric fails differently with:

Error: Client error '404 Not Found' for url 'https://ollama.xxx.com/v1/models'

There is no /v1/models endpoint on my server. Looking at the docs only confirms my assumption.

Did I miss something crucial?
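
A quick way to see which endpoints the server actually exposes (a sketch; the host is the placeholder from my error above, and /api/tags is Ollama's native model-listing endpoint):

curl -i https://ollama.xxx.com/v1/models   # the URL fabric requests
curl -i https://ollama.xxx.com/api/tags    # Ollama's own listing; should return JSON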

bennyzen · May 28 '24 17:05

Same for me. Trying to connect to a remote Ollama server: Error: Client error '404 Not Found' for url 'https://myserver/v1/models'

tim-leaf-cloud · Jun 01 '24 08:06

Could it be related to this issue? It seems the Ollama API is not fully compatible with the OpenAI API spec: https://github.com/ollama/ollama/issues/2430

tim-leaf-cloud · Jun 03 '24 07:06

Same issue here. Ollama is not the problem.

DmacMcgreg · Jun 18 '24 23:06

Got it working.

  1. vim ~/.config/fabric/.env
  2. Add the following (replacing the previous content, which had just # No API key set.):
OPENAI_API_KEY="NULL"
OPENAI_BASE_URL=https://127.0.0.1:11434/v1/
CLAUDE_API_KEY="NULL"
GOOGLE_API_KEY="NULL"

It should work now. The last fix that made it work for me was making sure OPENAI_BASE_URL used https, not http, even though it is localhost...

Make sure to unset any local variables you 'exported' in one of the earlier steps so that fabric uses the .env file instead, e.g. unset OPENAI_BASE_URL.
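
Before retrying, it's worth confirming nothing is still exported in the current shell (a small sketch; these are the same variables as in the .env above):

unset OPENAI_API_KEY OPENAI_BASE_URL CLAUDE_API_KEY GOOGLE_API_KEY
echo "$OPENAI_BASE_URL"   # should print an empty line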

DmacMcgreg · Jun 19 '24 14:06

After reading the code: https://github.com/danielmiessler/fabric/blob/1dd3bbfdf372ae63c4f4a33f7428595b8428f85a/installer/client/cli/utils.py#L43-L54

So:

  • Make sure OPENAI_API_KEY and OPENAI_BASE_URL are not exported.
    • echo $OPENAI_API_KEY and echo $OPENAI_BASE_URL should both return an empty string.
    • If either returns a value, check the /etc/environment file and the /etc/profile.d folder.
  • Make sure your ~/.config/fabric/.env doesn't contain any key that starts with OPENAI_; fabric should then use the default Ollama URL.
  • Make sure your ~/.config/fabric/.env has a DEFAULT_MODEL key (e.g. DEFAULT_MODEL=llama3:latest).

You may need to restart your shell after unsetting the env variables.
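
Putting those checks together (a sketch, assuming the default config path; adjust if yours differs):

echo "${OPENAI_API_KEY:-not set}" "${OPENAI_BASE_URL:-not set}"   # both should print "not set"
grep '^OPENAI_' ~/.config/fabric/.env && echo "remove these lines"
grep '^DEFAULT_MODEL' ~/.config/fabric/.env   # expect e.g. DEFAULT_MODEL=llama3:latest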

CorrM · Jun 20 '24 09:06

I'm using ollama just fine.

Maybe you're not using the model names as listed in the output of fabric --listmodels.

For example, I put llama3:latest, not ollama:...


fabric --listmodels                                       
GPT Models:

Local Models:
CognitiveComputations/dolphin-2.9.2-qwen2-7b:Q2_K
deepseek-coder-v2:latest
llama3:latest

Claude Models:

Google Models:
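
So the failing command from the top of the thread would look something like this on my setup (a sketch; your model names will differ):

cat t.txt | fabric --model llama3:latest --pattern summarize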

funkytaco · Jul 01 '24 08:07

How can I switch between the Groq API and a local Ollama LLM, so that if I run out of tokens I fall back to the local model in a user-friendly way, without changing the env file every time?
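
Something like this rough sketch is what I have in mind (purely hypothetical: it assumes fabric exits non-zero when the hosted call fails, and both model names are placeholders taken from fabric --listmodels):

#!/usr/bin/env bash
# Hypothetical fallback wrapper: try the hosted model first,
# rerun against the local Ollama model if that call fails.
# Usage: ./fabric-fallback.sh input.txt
if ! fabric --model some-groq-model --pattern summarize < "$1"; then
  fabric --model llama3:latest --pattern summarize < "$1"
fi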

owen98fox · Jul 13 '24 11:07