ApeRAG
[Features] Local LM Studio Support
Hello, thanks for the awesome package. When I try to use local models from LM Studio, I get the message shown in the attached screenshot saying the model has to be public to be set as the default model, but there is no way for me to change the scope. Am I doing anything wrong? Thanks
Hopefully you can work out how to implement this feature by looking at https://github.com/apecloud/ApeRAG/issues/1349
Since LiteLLM supports LM Studio (https://docs.litellm.ai/docs/providers/lm_studio), what you need to do is modify the code from #1349 by replacing 'ollama' with 'lm_studio', replacing port '11434' with '1234', and using base_url=http://localhost:1234.
This is just my wild guesstimate without testing :)
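For illustration only, a minimal (untested) LiteLLM call routed to LM Studio might look roughly like this; the model name is just a placeholder for whatever you have loaded in LM Studio:

```python
# Sketch, not tested: calling an LM Studio model through LiteLLM.
# Assumes LM Studio's local server is running on its default port 1234
# and that a chat model (placeholder name "qwen3-8b" here) is loaded.
import litellm

response = litellm.completion(
    model="lm_studio/qwen3-8b",           # "lm_studio/" prefix selects LiteLLM's LM Studio provider
    api_base="http://localhost:1234/v1",  # LM Studio's OpenAI-compatible endpoint
    api_key="lm-studio",                  # placeholder; LM Studio does not validate the key
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```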
A simpler workaround I found is to change the default OpenAI public model URL and model names to the local URL and the models running on LM Studio, then set the "Default models configuration" to OpenAI. This worked successfully with gpt-oss-20b and qwen3-embedding-4b. I did not try any rerank models.
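Before pointing ApeRAG's OpenAI provider settings at LM Studio, you can sanity-check that the local endpoint answers OpenAI-style requests. A rough sketch with the standard OpenAI Python client (model names match the ones mentioned above; adjust to whatever is actually loaded in LM Studio):

```python
# Sketch: verify LM Studio's OpenAI-compatible server before wiring it into ApeRAG.
# Assumes LM Studio is serving on its default URL http://localhost:1234/v1.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # key is not checked

# Chat completion against the locally loaded chat model
chat = client.chat.completions.create(
    model="gpt-oss-20b",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(chat.choices[0].message.content)

# Embedding request against the locally loaded embedding model
emb = client.embeddings.create(model="qwen3-embedding-4b", input="hello world")
print(len(emb.data[0].embedding))
```

If both calls return results, the same base URL and model names should work when entered into the OpenAI provider fields in ApeRAG.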
This issue has been marked as stale because it has been open for 30 days with no activity