
I can't use LM Studio in the new Fabric installed via Go, please help

JohnnyDisco7 opened this issue 1 year ago • 21 comments

Discussed in https://github.com/danielmiessler/fabric/discussions/857

Originally posted by poezie August 21, 2024: I have moved to the new Fabric install using Go, which has introduced a few more things. One query I have: I used to use LM Studio instead of Ollama, but at the moment it appears only Ollama is supported. If that is not the case, can someone please advise how I can set up the new Fabric install to use my local LM Studio install instead of Ollama?

JohnnyDisco7 avatar Sep 05 '24 16:09 JohnnyDisco7

Get the model name you're using, then run:

ollama pull model-name

If it's not available, download the model from Hugging Face. Create a Modelfile, and run:

ollama create model-name -f Modelfile

You can find more detailed instructions in the Ollama docs, but in most cases, Ollama will have the model you need. If you're still facing hurdles, a ChatGPT Q&A session can help you work through them.
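
For reference, a minimal Modelfile can be a single FROM line pointing at the weights you downloaded (the GGUF filename below is just a placeholder):

# Modelfile: build an Ollama model from a local GGUF file
FROM ./meta-llama-3-8b-instruct.Q4_K_M.gguf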

sosacrazy126 avatar Sep 06 '24 10:09 sosacrazy126

Thanks @sosacrazy126. I'd really like to use LM Studio, but will use Ollama for now. Thanks for the suggestion.

JohnnyDisco7 avatar Sep 06 '24 16:09 JohnnyDisco7

@JohnnyDisco7

You don't need to modify any code to use LM Studio with Fabric. Just follow these steps:

Set the following environment variables:

FABRIC_OPENAI_BASEURL=<your LM Studio API endpoint>
FABRIC_OPENAI_APIKEY=
FABRIC_DEFAULT_VENDOR=OpenAI
FABRIC_DEFAULT_MODEL=<your LM Studio model name>

Example:

FABRIC_OPENAI_BASEURL=http://localhost:1234/v1
FABRIC_OPENAI_APIKEY=not-needed
FABRIC_DEFAULT_VENDOR=OpenAI
FABRIC_DEFAULT_MODEL=llama2-7b

You can set these environment variables before running Fabric, or add them to your .env file.

Fabric will now connect to your LM Studio instance as if it were an OpenAI-compatible service. No need to modify any code or files like models.go. (Not trying to overstep.)
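
If Fabric still can't connect, a quick sanity check is to query LM Studio's OpenAI-compatible endpoint directly (assuming the default port 1234):

# Should return a JSON list of the models LM Studio has loaded
curl http://localhost:1234/v1/models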

sosacrazy126 avatar Sep 06 '24 21:09 sosacrazy126

Thanks for the advice. I tried everything you mentioned, with no luck. I think I have a corrupt install, or files left over from the previous Python install are causing issues. I've checked the .env files and everything's fine, and I've tried reinstalling the new Go version, but when I run setup it shows the same previous settings. Any idea how to completely uninstall Fabric and start over?

JohnnyDisco7 avatar Sep 11 '24 22:09 JohnnyDisco7

@JohnnyDisco7 my bad bro, it was way more complicated than changing the base URL... but I got the job done. [Screenshot from 2024-09-11 23-19-12]

sosacrazy126 avatar Sep 12 '24 03:09 sosacrazy126

@sosacrazy126 Sweet! Do you mind showing what you used for your setup?

JohnnyDisco7 avatar Sep 12 '24 04:09 JohnnyDisco7

@JohnnyDisco7

https://github.com/sosacrazy126/fabric.git

sosacrazy126 avatar Sep 12 '24 05:09 sosacrazy126

@JohnnyDisco7

@sosacrazy126 Sweet! Do you mind showing what you used for your setup?

Check here and let me know if you still need help >> https://github.com/sosacrazy126/fabric/tree/main/

sosacrazy126 avatar Sep 12 '24 16:09 sosacrazy126

Hey, I have started the local server with LM Studio, loaded the lmstudio-community/Meta-Llama-3.1-8B-Instruct-GGUF model, and exported the variables below:

export DEFAULT_MODEL="lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF"
export DEFAULT_VENDOR=OpenAI
export OPENAI_APIKEY=not-needed
export OPENAI_BASE_URL=http://localhost:1234/v1

I'm on Ubuntu 22.04.5 LTS, LM Studio 0.3.2, and the latest current Fabric version, v1.4.16, installed with Go.

I am getting:

could not find vendor.
 Model = 
 DefaultModel = lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF
 DefaultVendor = OpenAI

What am I missing? Cheers ;)

EDIT: Yes, I have updated Go (currently using 1.23.1), and I removed the previous version with pipx uninstall fabric.
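
A quick, generic way to confirm the variables are actually set in the shell that runs fabric:

printenv | grep -E 'DEFAULT_MODEL|DEFAULT_VENDOR|OPENAI_APIKEY|OPENAI_BASE_URL'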

irvis007 avatar Sep 16 '24 13:09 irvis007

Additional info:

which fabric
$HOME/.gvm/pkgsets/go1.23.1/global/bin/fabric

I also removed references in the .bashrc and .zshrc files.

irvis007 avatar Sep 16 '24 13:09 irvis007

Update: After restarting LM Studio, I get a different error:

echo "tell me why" | fabric -p ai
error, status code: 401, message: Incorrect API key provided: not-needed. You can find your API key at https://platform.openai.com/account/api-keys.

The LM Studio log says:

2024-09-16 15:53:34 [ERROR] Unexpected endpoint or method. (GET /v1/api/tags). Returning 200 anyway

My local server setup: [screenshot]

Did LM Studio introduce an API key for the local server?
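
One way to test this directly is a minimal OpenAI-style request against the local server; adding or omitting the Authorization header should show whether a key is enforced (a sketch, using the model loaded above):

curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer not-needed" \
  -d '{"model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF", "messages": [{"role": "user", "content": "hello"}]}'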

irvis007 avatar Sep 16 '24 13:09 irvis007

Another thing I have found: should Fabric have only one model?

fabric --listmodels  

Available vendor models:

DryRun

	[1]	dry-run-model

Should I install more models somehow?

Where can I find the currently supported environment variables I can use?

irvis007 avatar Sep 17 '24 10:09 irvis007

@sosacrazy126 was your merge to support LM Studio added to the Fabric main project, or only to your branch?

poezie avatar Sep 22 '24 12:09 poezie

It was merged, but the workflow kept failing, so it got removed. The PR team reached out and gave me refactoring tips, and I ended up chasing my tail. I don't even use LM Studio, but since I saw everyone asking about it, I'll finalize it and figure out the workflow issue. @poezie @irvis007

sosacrazy126 avatar Sep 22 '24 12:09 sosacrazy126

In the meantime, if anyone wants to use my branch, I've pushed the working code to my own repo (just the workflow issue remains): https://github.com/sosacrazy126/fabricc.git. I'll have the PR team review my code today or tomorrow and merge again.

sosacrazy126 avatar Sep 22 '24 12:09 sosacrazy126

@sosacrazy126 thanks for the info from your side!

irvis007 avatar Sep 25 '24 18:09 irvis007

In the meantime, if anyone wants to use my branch, I've pushed the working code to my own repo (just the workflow issue remains): https://github.com/sosacrazy126/fabricc.git. I'll have the PR team review my code today or tomorrow and merge again.

Thanks for the fork! I made it work with LM Studio, but is it intended that I can't get it to work with a remote LM Studio? I set all the variables to the IP hosting the API, and it still tries to use localhost.

I'm using it through a socat tunnel at the moment, but was wondering if I was doing something wrong.
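
For context, the tunnel is roughly this (remote-host stands in for the machine actually running LM Studio):

# Listen locally on 1234 and forward each connection to the remote LM Studio
socat TCP-LISTEN:1234,fork,reuseaddr TCP:remote-host:1234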

h3xitsec avatar Sep 26 '24 20:09 h3xitsec

@h3xitsec If you're trying to remove a model you're currently using, you'll need to run 'fabric --setup'. If that's not what you're attempting, could you provide more details about the issue you're facing? Also, since this was my first time contributing, please let me know if I've overlooked any important features or steps.

sosacrazy126 avatar Sep 27 '24 15:09 sosacrazy126

Hey @sosacrazy126, may I ask about the status? Is this issue abandoned, on hold, awaiting review, or something else? BR ;)

irvis007 avatar Nov 15 '24 07:11 irvis007

I just opened a PR that should allow the LM Studio integration. Credit goes to @sosacrazy126; I based it on his work. https://github.com/danielmiessler/fabric/pull/1302

verebes1 avatar Feb 17 '25 23:02 verebes1

This has now been merged so the issue can be closed 🙂

verebes1 avatar Feb 25 '25 07:02 verebes1

This is working, yes.

lms ls

You have 2 models, taking up 9.09 GB of disk space.

LLMs (Large Language Models)      PARAMS      ARCHITECTURE         SIZE      
qwen3-14b                            14B         qwen3          9.00 GB      

Embedding Models                          PARAMS      ARCHITECTURE          SIZE      
text-embedding-nomic-embed-text-v1.5                   Nomic BERT       84.11 MB      

And in Fabric:

fabric -L | grep 'LM'
       	[97]	LM Studio|qwen3-14b
       	[98]	LM Studio|text-embedding-nomic-embed-text-v1.5
       	[619]	Together|zai-org/GLM-4.5-Air-FP8
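
And a quick smoke test, assuming fabric's -m flag is used to pick one of the listed models:

echo "tell me why" | fabric -m qwen3-14b -p ai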

ksylvan avatar Aug 19 '25 03:08 ksylvan