
[Features] Local LM Studio Support

Open leiwu0227 opened this issue 2 months ago • 3 comments


Hello, thanks for the awesome package. When I try to use local models from LM Studio, I get the error shown in the screenshot below, which says the model has to be public to be set as the default model, but there is no way for me to change the scope. Am I doing anything wrong? Thanks

[Screenshot: dialog stating the model must be public to be set as a default model]

leiwu0227 avatar Sep 15 '25 14:09 leiwu0227

Hopefully you can work out how to implement this feature by looking at https://github.com/apecloud/ApeRAG/issues/1349.

Since LiteLLM supports LM Studio (https://docs.litellm.ai/docs/providers/lm_studio), what you need to do is just modify the code from #1349: replace 'ollama' with 'lm_studio', replace port '11434' with '1234', and use base_url=http://localhost:1234.

This is just my wild guesstimate without testing :)
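The substitution described above can be sketched as a tiny helper. This is a hypothetical illustration, not ApeRAG code; the function and argument names are my own:

```python
def ollama_to_lm_studio(model: str, base_url: str) -> tuple[str, str]:
    """Rewrite an Ollama-style LiteLLM model string and base URL for LM Studio.

    Hypothetical helper illustrating the suggested substitution, not part of
    ApeRAG. LiteLLM routes LM Studio requests via the 'lm_studio/' model
    prefix, and LM Studio's local server listens on port 1234 by default
    (Ollama's default is 11434).
    """
    return (
        model.replace("ollama", "lm_studio"),
        base_url.replace("11434", "1234"),
    )


print(ollama_to_lm_studio("ollama/llama3", "http://localhost:11434"))
# → ('lm_studio/llama3', 'http://localhost:1234')
```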

Phil2025Code avatar Sep 16 '25 01:09 Phil2025Code

A simpler workaround I found is to change the default OpenAI public model URL and model names to the local URL and the models running on LM Studio, then set the "Default models configuration" to OpenAI. This worked successfully with gpt-oss-20b and qwen3-embedding-4b. I did not try any rerank models.
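This workaround relies on LM Studio exposing an OpenAI-compatible API on port 1234 by default, so anything that can talk to the OpenAI chat endpoint can be pointed at the local server. A quick sanity check from the command line (requires LM Studio running locally with a model loaded; the model name here is illustrative):

```shell
# Query LM Studio's local OpenAI-compatible endpoint (default port 1234).
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-oss-20b",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

If this returns a completion, pointing ApeRAG's OpenAI provider at http://localhost:1234/v1 with the same model name should work the same way.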

alex0com avatar Sep 25 '25 18:09 alex0com

This issue has been marked as stale because it has been open for 30 days with no activity.

github-actions[bot] avatar Nov 10 '25 00:11 github-actions[bot]