
[Bug]: Unable to use Ollama if OpenAI/Claude API key is not set

Open sdkks opened this issue 11 months ago • 7 comments

What happened?

I was trying to use Fabric with only a local Ollama server, and entered no (blank) API keys for any of the online models (OpenAI, Claude).

I was expecting to at least be able to list the local Ollama models; however, it still requires setup:

fabric --remoteOllamaServer http://127.0.0.1:11434 --listmodels
Please run --setup to set up your API key and download patterns.

I also entered dummy API keys for them after re-running the setup, then tried again:

fabric --remoteOllamaServer http://127.0.0.1:11434 --listmodels
Error: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/models'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401

It quits due to the incorrect API key, without even trying Claude or local Ollama.

I was expecting:

  • It could run without any setup, just with local Ollama server
  • It wouldn't die on wrong auth credentials, but would try the next available servers/providers.

Is there a way to run Fabric only locally without any API key configs?
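The fallback behaviour asked for above could look roughly like this — a hypothetical shell sketch, not Fabric's actual code, with stand-in commands instead of real API calls:

```shell
# Try each provider's "list models" command in order; skip any that fail
# (e.g. a 401) instead of aborting on the first error.
list_models() {
  for provider_cmd in "$@"; do
    # intentionally unquoted so the command string is word-split and run
    if output=$($provider_cmd 2>/dev/null); then
      printf '%s\n' "$output"
      return 0
    fi
  done
  echo "no provider reachable" >&2
  return 1
}

# Stand-ins: the first "provider" always fails (like a bad API key),
# the second succeeds and prints a model name.
list_models "false" "echo llama2"
```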

Version check

  • [X] Yes I was.

Relevant log output

See my description above; the relevant output is inline.

Relevant screenshots (optional)

No response

sdkks avatar Mar 21 '24 10:03 sdkks

I'm adding a comment as I have the same issue, described in another thread, but it's better if we discuss it in one place.

I'm on Windows with Ollama and fabric running in WSL, I also have no OpenAI or Claude API.

Thank you

Papoulos avatar Mar 21 '24 13:03 Papoulos


Hey Papoulos,

If it's an earlier issue or something with more detail, we can close this one and continue in your issue.

Thanks!

sdkks avatar Mar 21 '24 15:03 sdkks

FWIW, I was able to work around this problem by creating an OpenAI account and then creating a read-only token that can't be used for generation. After that, Fabric was able to list all the models, as shown below:

terminal output
$ fabric --remoteOllamaServer http://127.0.0.1:11434 --listmodels
GPT Models:
gpt-3.5-turbo
gpt-3.5-turbo-0125
gpt-3.5-turbo-0301
gpt-3.5-turbo-0613
gpt-3.5-turbo-1106
gpt-3.5-turbo-16k
gpt-3.5-turbo-16k-0613
gpt-3.5-turbo-instruct
gpt-3.5-turbo-instruct-0914

Local Models:
deepseek-coder:33b
dolphin-mixtral:8x7b-v2.7-q2_K
mixtral:8x7b-instruct-v0.1-q4_0
nous-hermes2-mixtral:8x7b-dpo-q3_K_M
wizardcoder:33b-v1.1

Claude Models:
claude-3-opus-20240229
claude-3-sonnet-20240229
claude-3-haiku-20240307
claude-2.1

sdkks avatar Mar 21 '24 15:03 sdkks

I'm copying what I wrote in the other post:

Unfortunately I can't make it work with Ollama. I have a local Ollama service with Gemma that I can curl with no issue:

curl http://localhost:11434/api/generate -d '{
  "model": "gemma",
  "prompt": "Why is the sky blue?"
}'

{"model":"gemma","created_at":"2024-03-20T09:29:56.609134562Z","response":"The","done":false}
....

But when I try to configure fabric:

fabric --setup  --remoteOllamaServer http://localhost:11434

It asks me for the OpenAI, Claude, and YouTube API keys, but does nothing more:

Welcome to Fabric. Let's get started.
Please enter your OpenAI API key. If you do not have one or if you have already entered it, press enter.

Please enter your claude API key. If you do not have one, or if you have already entered it, press enter.

Please enter your YouTube API key. If you do not have one, or if you have already entered it, press enter.

Updating patterns...

Downloaded zip file successfully.
Extracted zip file successfully.
Patterns updated successfully.

If I then ask for the model list:

fabric --listmodels
Please run --setup to set up your API key and download patterns.

Even with this command:

fabric --listmodels --remoteOllamaServer http://localhost:11434
Please run --setup to set up your API key and download patterns.

In fact, every request sends me back to the setup:

fabric --list
Please run --setup to set up your API key and download patterns.

I may well have done something wrong, but I have no clue at the moment.

fabric$ fabric --setup
Welcome to Fabric. Let's get started.
Please enter your OpenAI API key. If you do not have one or if you have already entered it, press enter.

Please enter your claude API key. If you do not have one, or if you have already entered it, press enter.


Please enter your YouTube API key. If you do not have one, or if you have already entered it, press enter.


Updating patterns...
Downloaded zip file successfully.
Extracted zip file successfully.
Patterns updated successfully.



fabric$ fabric --listmodels
Please run --setup to set up your API key and download patterns.


fabric$ curl http://localhost:11434/api/generate -d '{
  "model": "gemma",
  "prompt": "Why is the sky blue?"
}'
{"model":"gemma","created_at":"2024-03-20T10:14:58.21685401Z","response":"The","done":false}
{"model":"gemma","created_at":"2024-03-20T10:14:58.459595578Z","response":" sky","done":false}

I can't create a read-only token as I don't have a paid account.

Papoulos avatar Mar 21 '24 16:03 Papoulos
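As an aside, the streaming JSON lines that Ollama's /api/generate returns (as in the curl output above) can be collapsed into plain text with POSIX tools alone. A minimal sketch — naive on purpose, and it will break on responses containing escaped quotes:

```shell
# Pull each "response" field out of Ollama's streaming JSON lines and
# concatenate them into a single string (no jq required).
extract_responses() {
  grep -o '"response":"[^"]*"' | sed 's/^"response":"//; s/"$//' | tr -d '\n'
}

# Example with two captured stream lines:
printf '%s\n' \
  '{"model":"gemma","response":"The","done":false}' \
  '{"model":"gemma","response":" sky","done":false}' | extract_responses
# → The sky
```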

@Papoulos I don't have a paid account either, and I can create a read-only API key. You can register with ChatGPT without paying anything, but you will be using GPT-3.5 instead of GPT-4, which is what Fabric requires. Once you have this free account (and btw, you can just use your Google account if you have one when you authenticate), you can generate keys from this page.

quiet-ranger avatar Mar 21 '24 16:03 quiet-ranger

Indeed, it works better.

Papoulos avatar Mar 21 '24 17:03 Papoulos

Having said that, my environment still does not work. But let me describe my setup:

My bare machine runs Linux Mint 21.03, but I decided to install everything in Docker containers. I have a separate container for Ollama and one for Fabric. I created a "virtual" Docker network so that the containers can talk to each other (and I have confirmed that this works, as shown further below).

$ docker pull ollama/ollama
$ docker network create ainet
$ docker run --network=ainet -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama 

I then created a bare-bones container based on Alpine, logged into it, installed the dependencies, cloned fabric, and went through the motions of building and installing. Incidentally, today I did a git pull and rebuilt it. I can run this container and drop to its console like this:

$ docker run --network=ainet --name fabric -it fabric

So, from inside the fabric container's console I can run:

# curl http://ollama:11434/api/generate -d '{ "model": "llama2", "prompt": "What is the name of planet Earths moon?"}'
{"model":"llama2","created_at":"2024-03-21T17:03:52.825558645Z","response":"\n","done":false}
{"model":"llama2","created_at":"2024-03-21T17:03:53.358737623Z","response":"The","done":false}
{"model":"llama2","created_at":"2024-03-21T17:03:53.746255993Z","response":" name","done":false}

So, container-to-container communication is working fine. However, after today's git pull, the --remoteOllamaServer option simply appears to hang. I left it running for a couple of hours and it never returned. This seems to be a new problem, as before it would return, although with no apparent effect.

# fabric --remoteOllamaServer http://ollama:11434 --model llama2

Does anyone have further ideas?

At this point, I have two suggestions for the maintainers:

  1. Provide a means to query the version number (or commit id) of what we are running, so we can report problems that are more easily reproducible
  2. Perhaps a --verbose option might help in these situations

quiet-ranger avatar Mar 21 '24 17:03 quiet-ranger
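For the hang described above, it may help to first check whether the endpoint answers at all, with a hard timeout so a stuck server is reported instead of blocking indefinitely. A small sketch (/api/tags is Ollama's model-list endpoint; the URLs are examples):

```shell
# Probe an HTTP endpoint with a hard timeout, printing "up" or "down"
# instead of hanging the shell when the server never responds.
check_endpoint() {
  if curl -sS --max-time 5 "$1" >/dev/null 2>&1; then
    echo "up"
  else
    echo "down"
  fi
}

# E.g. run check_endpoint "http://ollama:11434/api/tags" from inside the
# fabric container; a closed port comes back "down" almost immediately:
check_endpoint "http://127.0.0.1:1"
# → down
```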

Edit: Not sure if this will help anyone, but in order to use any model on Windows with PowerShell, I had to use this format for every command. I am able to use patterns with Claude and Ollama now.

C:\Users\slyre>  Get-Clipboard | fabric --model mistral:latest --pattern analyze_claims --stream

I'm here because I'm in a similar boat and haven't been able to get any answers. I've entered my 3 API keys. The only model I've been able to get to work is Claude; the others say insufficient funds.
I've tried Claude with the examples in the readme, swapping out pbpaste for Get-Clipboard because I'm on PowerShell, but none of those commands work.

Right after switching to Claude, I'll either get an error that it can't extract text with Get-Clipboard, or:

PS C:\Program Files\Code\fabric> fabric --model claude-3-opus-20240229 Enter Question: Get-Clipboard | fabric --pattern extract_wisdom I apologize, but I cannot execute the command you provided as it appears to be a PowerShell

I was able to get it to work once with this: Get-Clipboard | fabric --model claude-3-opus-20240229 --pattern extract_wisdom but it failed the next time.

Also, if it answers my question or gives me an error, it kicks me out of the chatbot and back into a folder, so I have to pass --model every time.

When those weren't working, I tried using Ollama. I have not run the curl command because I don't understand any of it. I also did not use --remoteOllamaServer as people were showing above, because I am running a local Ollama and left it in the default place. The instructions say "ONLY USE THIS if you are using a local ollama server in a non-default location or port". I can get Mistral up and running, but I don't know how to get it to follow any patterns. I'm not sure if I need to move my patterns folder, move the .ollama folder, or create a modelfile and paste the patterns in there.

YorkyPoo avatar Mar 22 '24 15:03 YorkyPoo

Thank you for the fix!

danielmiessler avatar Mar 28 '24 19:03 danielmiessler