
[Bug]:

Open y3rawat opened this issue 1 year ago • 7 comments

What happened?

fabric --text "Essay about the dog"
Error: 'NoneType' object has no attribute 'chat'
'NoneType' object has no attribute 'chat'

I don't know what happened.

Version check

  • [X] Yes I was.

Relevant log output

No response

Relevant screenshots (optional)

No response

y3rawat avatar May 20 '24 03:05 y3rawat

I get the same error when trying to pipe yt to fabric.

yt --transcript "<some Youtube link>" | fabric -sp extract_wisdom
Error: 'NoneType' object has no attribute 'chat'
'NoneType' object has no attribute 'chat'

Is there some way to enable verbose debugging to dig further into this? I don't see a fabric --version support either, so the only version info I have is the commit 1629cad I used to install fabric.

erichschroeter avatar May 20 '24 14:05 erichschroeter

I had to use the export command again within WSL to set the API base URL/key etc. before it started working again.

bamit99 avatar May 21 '24 07:05 bamit99

I had to use the export command again within WSL to set the API base URL/key etc. before it started working again.

Is this the result of me trying to use a self-hosted ollama LLM?

$ export OPENAI_BASE_URL=https://127.0.0.1:11444/v1/
$ export DEFAULT_MODEL="llama3"
$ export YOUTUBE_API_KEY="my-key"
$ yt --transcript "<some Youtube link>" | fabric -sp extract_wisdom
Error: 'NoneType' object has no attribute 'chat'
'NoneType' object has no attribute 'chat'

erichschroeter avatar May 21 '24 12:05 erichschroeter
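For anyone setting these in ~/.config/fabric/.env instead of exporting them each session (which is what bit bamit99 in WSL, where exports don't persist), the equivalent file would look something like the sketch below. The port and key values are illustrative placeholders carried over from the exports above, and note that a local Ollama normally serves plain HTTP rather than HTTPS, so the https:// base URL in the command above is worth double-checking:

```shell
# ~/.config/fabric/.env — persists across shells, unlike `export` in WSL.
# Values are illustrative placeholders, not working credentials.
OPENAI_BASE_URL=http://127.0.0.1:11444/v1/
DEFAULT_MODEL=llama3:latest
YOUTUBE_API_KEY=my-key
```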

I'm using a Google API key during fabric --setup command. I only provide Deepseek API information while setting environment. I haven't tried with Ollama yet even though I'm running it. Check Ollama log to see if Fabric is making any connection. Also try and run YT command only to see if it is pulling the transcript properly before piping it into Fabric.

bamit99 avatar May 21 '24 15:05 bamit99

I'm using a Google API key during fabric --setup command. I only provide Deepseek API information while setting environment. I haven't tried with Ollama yet even though I'm running it. Check Ollama log to see if Fabric is making any connection. Also try and run YT command only to see if it is pulling the transcript properly before piping it into Fabric.

If I put OPENAI_API_KEY=ollama in my ~/.config/fabric/.env I'm able to get some feedback that something is hitting my local ollama instance, but the error is:

Error: Connection error.
Connection error.

I even tried changing the command to yt --transcript "<some Youtube link>" | fabric -sp extract_wisdom --remoteOllamaServer='http://localhost:11444' but that still gets the same error. The ollama logs show that the /api/tags API is being hit.

erichschroeter avatar May 21 '24 16:05 erichschroeter
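Since /api/tags is being reached, one way to see exactly which model names the Ollama server will accept is to pull the name fields out of that response. A sketch using a canned response for illustration; a live check would pipe `curl -s http://localhost:11444/api/tags` instead:

```shell
# Canned /api/tags-style JSON, standing in for the live response.
tags='{"models":[{"name":"llama3:latest"},{"name":"mistral:latest"}]}'

# Extract the model names (tags included) without needing jq installed.
echo "$tags" | grep -o '"name":"[^"]*"' | cut -d'"' -f4
# -> llama3:latest
# -> mistral:latest
```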

Only found this while troubleshooting. I tried with Ollama on my machine and yes, I only got the API message or connection error but never received a response.

https://knasmueller.net/running-fabric-locally-with-ollama

bamit99 avatar May 22 '24 08:05 bamit99

I was able to get it working with LM Studio, though the open-source models are not taking the commands properly. This is my .env file under the /home//.config/fabric/ folder. I have Google and YouTube API keys in it along with Deepseek keys; the Deepseek keys are currently commented out. I am using the below for pointing to LM Studio, with the Smaug 7B model.

[screenshot of the .env file]

bamit99 avatar May 23 '24 12:05 bamit99

Pretty sure this is WSL/Windows related. Should be fixed in upcoming Go version.

danielmiessler avatar May 23 '24 17:05 danielmiessler

I ran into this same issue on my mac and an ollama server, and after debugging further found that the model I was referencing was not available. I referenced the list from fabric --listmodels and found references to models with tags, updated my model argument i.e. went from --model llama to --model llama3:latest, and was able to get this working.

ChrisSwanson avatar May 29 '24 00:05 ChrisSwanson
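ChrisSwanson's fix above can be turned into a quick pre-flight check: compare the name passed to --model against the installed tags and fail loudly on a mismatch, instead of the opaque NoneType error. A sketch with a hypothetical installed list; in practice fabric --listmodels or ollama list would supply it:

```shell
# Hypothetical list of installed models; a real script would populate this
# from `ollama list` or `fabric --listmodels`.
installed="llama3:latest
mistral:latest"

want="llama3"   # the bare name that silently failed in this thread

# Exact whole-line match: "llama3" does NOT match "llama3:latest".
if echo "$installed" | grep -qx "$want"; then
  echo "ok: $want is installed"
else
  echo "not found: $want (did you mean ${want}:latest?)"
fi
# -> not found: llama3 (did you mean llama3:latest?)
```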

Pretty sure this is WSL/Windows related. Should be fixed in upcoming Go version.

Same problem here in Ubuntu 22.04 (no WSL) just now after a fresh install of fabric:

kzi@Jellyfish:~$ fabric --text "Essay about something"
Error: 'NoneType' object has no attribute 'chat'
'NoneType' object has no attribute 'chat'

kzi90 avatar May 29 '24 13:05 kzi90

I ran into this same issue on my mac and an ollama server, and after debugging further found that the model I was referencing was not available. I referenced the list from fabric --listmodels and found references to models with tags, updated my model argument i.e. went from --model llama to --model llama3:latest, and was able to get this working.

This worked for me too. Thanks!

For future reference, I ran ollama with OLLAMA_MODELS="/usr/share/ollama/.ollama/models" OLLAMA_HOST=127.0.0.1:11444 ollama serve in WSL2. My full command was yt --transcript "<some Youtube link>" | fabric -sp extract_wisdom --remoteOllamaServer='http://localhost:11444' --model llama3:latest

erichschroeter avatar May 29 '24 13:05 erichschroeter

I'm using a Google API key during fabric --setup command. I only provide Deepseek API information while setting environment. I haven't tried with Ollama yet even though I'm running it. Check Ollama log to see if Fabric is making any connection. Also try and run YT command only to see if it is pulling the transcript properly before piping it into Fabric.

If I put OPENAI_API_KEY=ollama in my ~/.config/fabric/.env I'm able to get some feedback that something is hitting my local ollama instance, but the error is:

Error: Connection error.
Connection error.

I even tried changing the command to yt --transcript "<some Youtube link>" | fabric -sp extract_wisdom --remoteOllamaServer='http://localhost:11444' but that still gets the same error. The ollama logs show that the /api/tags API is being hit.

Hey, I use WSL without any Ollama model, only the Claude & YouTube APIs, and I had the same error below:

Error: Connection error.
Connection error.

When I checked my .env file I noticed I had another OpenAI API entry set to "", so I deleted it. Since then I get the error:

Error: 'NoneType' object has no attribute 'chat'
'NoneType' object has no attribute 'chat'

CyRamos avatar Jun 04 '24 01:06 CyRamos
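CyRamos's leftover empty key suggests a quick lint for the .env file: flag any variable assigned an empty value, since this thread shows an empty OpenAI entry changes which error fabric reports. A sketch against a throwaway sample file; the path and contents here are illustrative, and a real check would point the grep at ~/.config/fabric/.env:

```shell
# Throwaway sample .env standing in for ~/.config/fabric/.env.
cat > /tmp/fabric-sample.env <<'EOF'
OPENAI_API_KEY=
CLAUDE_API_KEY=sk-ant-placeholder
YOUTUBE_API_KEY=my-key
EOF

# Flag assignments with no value (bare empty or an explicit "").
grep -E '^[A-Za-z_]+=("")?$' /tmp/fabric-sample.env
# -> OPENAI_API_KEY=
```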

I ran into this same issue on my mac and an ollama server, and after debugging further found that the model I was referencing was not available. I referenced the list from fabric --listmodels and found references to models with tags, updated my model argument i.e. went from --model llama to --model llama3:latest, and was able to get this working.

Bingo! Thank you so much @erichschroeter! This fixed it for me. macOS on an M3 Max.

I think there's a lack of error handling when the model is not found on the Ollama server. I just found this project this week, but I am willing to contribute. cc @danielmiessler

melloc01 avatar Aug 04 '24 19:08 melloc01