
LM Studio support

Open deltaag opened this issue 4 months ago • 8 comments

Great project.

I’m setting up a full local environment on Windows 11 using WSL2 for development. I’ve discovered that installing LM Studio directly on Windows allows me to take full advantage of the GPU.

So far, I’ve managed to run a local Docker instance of Supabase and would love to see LM Studio supported by Archon.

Additionally, since Ollama lets you specify a base URL, could setting it to http://localhost:1234/v1 connect to the LM Studio server?
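As a quick sanity check that the LM Studio server is reachable at that URL, something like this works (a minimal sketch, assuming LM Studio's local server is on its default port 1234):

```python
# Probe LM Studio's OpenAI-compatible server and list the available models.
# Assumes LM Studio is serving on the default port 1234.
import requests

resp = requests.get("http://localhost:1234/v1/models", timeout=5)
resp.raise_for_status()
for model in resp.json().get("data", []):
    print(model["id"])
```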

Also, please consider adding an Update section to the main document. Does it require another docker-compose up --build -d after pulling the updates?
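For reference, the generic Compose update flow I have in mind is just (a sketch, not confirmed Archon guidance):

```bash
# Hypothetical update flow, assuming the standard docker-compose workflow
git pull                        # fetch the latest changes
docker-compose up --build -d    # rebuild images and restart containers detached
```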

deltaag avatar Aug 14 '25 19:08 deltaag

Shouldn't be hard to integrate; I added OpenRouter to my version of the repo along with a separate embeddings provider selector.

[Screenshot: provider selector UI]
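For context, OpenRouter exposes an OpenAI-compatible API, so the integration mostly boils down to another base URL swap, roughly like this (a sketch; you need your own OPENROUTER_API_KEY, and the model id is just an example):

```python
# Minimal OpenRouter call via the standard OpenAI client.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # OpenRouter's OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # your OpenRouter key
)
resp = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # example model id; any model OpenRouter offers
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```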

Chillbruhhh avatar Aug 15 '25 02:08 Chillbruhhh

> Shouldn't be hard to integrate; I added OpenRouter to my version of the repo along with a separate embeddings provider selector.

That is exactly what I want. @Chillbruhhh, could you please share your solution? I'm not seeing any options other than OpenAI, Google Gemini, and Ollama (Coming Soon)!

deltaag avatar Aug 15 '25 08:08 deltaag

> Shouldn't be hard to integrate; I added OpenRouter to my version of the repo along with a separate embeddings provider selector.

That's awesome, @Chillbruhhh!

coleam00 avatar Aug 15 '25 18:08 coleam00

> Shouldn't be hard to integrate; I added OpenRouter to my version of the repo along with a separate embeddings provider selector.

I would love to try this before it is released on Archon's official repo. Would you like to share the updates you have made?

ubjayasinghe avatar Aug 15 '25 18:08 ubjayasinghe

Sure @ubjayasinghe, let me button it up and I'll post my branch here; I'll also submit it as a PR.

Chillbruhhh avatar Aug 15 '25 19:08 Chillbruhhh

A couple of points got my local setup working; they might help someone else out there, or be worth adding to the docs:

  • Local Supabase in Docker Desktop: try SUPABASE_URL=http://host.docker.internal:8000 if SUPABASE_URL=http://localhost:8000 isn't working (I was getting a credentials error on Archon-Server start-up); you might instead need the local host's IP address (I haven't tested this, but it was a suggested solution). See the config sketch after this list.
  • Choosing Ollama as the LLM provider and changing the base URL to `http://localhost:1234/v1` connected to LM Studio running on Windows. Maybe the option should be renamed to Local LLM (Ollama, LM Studio, etc.), or an LM Studio option added with that URL.
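To make that concrete, here is roughly what those settings look like together; the key names are illustrative assumptions, not Archon's confirmed config, so check the actual .env/compose files:

```
# .env sketch — key names are assumptions, values are what worked for me.
# Inside a container, "localhost" is the container itself, so services on the
# host machine must be reached via host.docker.internal (Docker Desktop).
SUPABASE_URL=http://host.docker.internal:8000   # instead of http://localhost:8000
LLM_BASE_URL=http://localhost:1234/v1           # hypothetical key; I set this in the UI as the Ollama provider's base URL
```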

deltaag avatar Aug 16 '25 08:08 deltaag

> Shouldn't be hard to integrate; I added OpenRouter to my version of the repo along with a separate embeddings provider selector.

> I would love to try this before it is released on Archon's official repo. Would you like to share the updates you have made?

https://github.com/Chillbruhhh/Archon/tree/feature/openrouter-support

I added OpenRouter because of OpenAI's tier rate limiting; I ran into it when continuously crawling documentation with Cole's crawl4ai-rag MCP. Feel free to try out the unofficial version.

Chillbruhhh avatar Aug 16 '25 18:08 Chillbruhhh

LM Studio's API is OpenAI-compatible, so it should be trivial to use the OpenAI connector and just change the base URL.
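Something like this with the official openai Python client (a sketch: LM Studio ignores the API key, but the client requires a non-empty string, and the model id must match whatever you have loaded):

```python
# Point the standard OpenAI client at LM Studio's local server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's OpenAI-compatible endpoint
    api_key="lm-studio",                  # ignored by LM Studio, required by the client
)

resp = client.chat.completions.create(
    model="local-model",  # placeholder; use an id from GET /v1/models
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```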

Anaxagoras-bc avatar Aug 16 '25 19:08 Anaxagoras-bc