
Resolving LLM Configuration Error - .env file config - LM Studio, local model (MAC)[Question]:

Open · nic0711 opened this issue 10 months ago · 1 comment

What is your question?

Hello, somehow I can't get any further... especially since it worked once before - but not since a few updates.

I always get the following error message, and I guess it has to do (among other things) with the settings in the .env.

(base) USER@MBP-von-USER fabric % fabric --listmodels 
Traceback (most recent call last):
  File "/opt/homebrew/bin/fabric", line 6, in <module>
    sys.exit(cli())
             ^^^^^
  File "/Users/USER/AI/fabric/installer/client/cli/fabric.py", line 101, in main
    standalone = Standalone(args, args.pattern)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/USER/AI/fabric/installer/client/cli/utils.py", line 56, in __init__
    sorted_gpt_models, ollamaList, claudeList = self.fetch_available_models()
                                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/USER/AI/fabric/installer/client/cli/utils.py", line 297, in fetch_available_models
    if "/" in models[0] or "\\" in models[0]:
              ~~~~~~^^^
IndexError: list index out of range
(base) USER@MBP-von-USER fabric % fabric --listmodels
Error: Connection error. trying to access /models: ("Request URL is missing an 'http://' or 'https://' protocol.",)

My actual .env looks like this:

OPENAI_API_KEY=lmstudio
OPENAI_BASE_URL=http://localhost:1234/v1

DEFAULT_MODEL=lmstudio

YOUTUBE_API_KEY=AI

Can someone please tell me what the .env should look like so that it primarily accesses the local LLM (via LM Studio) - and, in the future, also Claude.

Thank you!

nic0711 · Mar 28 '24 10:03

OK, first off, I am a complete novice at any of this. But I think I finally got it by removing the DEFAULT_MODEL line.

This is what my .env looks like now:

OPENAI_API_KEY=lmstudio

OPENAI_BASE_URL=http://localhost:1234/v1/
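Since the second error above complains that the request URL is missing an 'http://' or 'https://' protocol, it can help to sanity-check the OPENAI_BASE_URL value offline before running fabric again. A minimal sketch (the helper `check_base_url` is made up for illustration, not part of fabric):

```python
from urllib.parse import urlparse

def check_base_url(url: str) -> bool:
    """Return True if the URL carries an explicit http/https scheme
    and a host, mirroring the kind of check behind the
    "Request URL is missing an 'http://' or 'https://' protocol" error."""
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

print(check_base_url("http://localhost:1234/v1"))  # True
print(check_base_url("localhost:1234/v1"))         # False - no scheme, rejected
```

If the check fails, the fix is usually just prefixing the value in .env with http://.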

randomBullets · Mar 31 '24 07:03