
[Question]: Do you support local LLMs (Ollama and LM Studio)?

Open velteyn opened this issue 3 months ago • 5 comments

Do you need to ask a question?

  • [x] I have searched the existing questions and discussions and this question is not already answered.
  • [x] I believe this is a legitimate question, not just a bug or feature request.

Your Question

Hello, my question is simple: do you support local LLM managers such as Ollama or LM Studio? I tried to put http://localhost:1234 in my configuration, but I get loads of errors, so I think this is not supported. Thank you.

Additional Context

No response

velteyn avatar Aug 31 '25 07:08 velteyn

The documentation mentions that the underlying system uses the OpenAI SDK. For local deployment, you can use tools like vLLM or other compatible methods to deploy in an OpenAI-compatible manner. Simply configure the api_key as 'EMPTY' and set the base_url to the service address where your model is deployed, and it should work.
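A minimal sketch of that setup, assuming the OpenAI Python SDK and a local OpenAI-compatible server (e.g. vLLM) already running; the port and model name below are placeholders for whatever your deployment uses:

```python
from openai import OpenAI

# Point the OpenAI SDK at a locally deployed OpenAI-compatible server
# (e.g. vLLM's default port 8000; adjust host/port to your setup).
client = OpenAI(
    api_key="EMPTY",                      # local servers typically ignore the key
    base_url="http://localhost:8000/v1",  # note the /v1 path expected by the SDK
)

response = client.chat.completions.create(
    model="your-local-model-name",        # whatever model the server is serving
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```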

qinantong avatar Sep 01 '25 13:09 qinantong

> The documentation mentions that the underlying system uses the OpenAI SDK. For local deployment, you can use tools like vLLM or other compatible methods to deploy in an OpenAI-compatible manner. Simply configure the api_key as 'EMPTY' and set the base_url to the service address where your model is deployed, and it should work.

Thank you very much for your prompt response and clear explanation!

Zongwei9888 avatar Sep 02 '25 07:09 Zongwei9888

> The documentation mentions that the underlying system uses the OpenAI SDK. For local deployment, you can use tools like vLLM or other compatible methods to deploy in an OpenAI-compatible manner. Simply configure the api_key as 'EMPTY' and set the base_url to the service address where your model is deployed, and it should work.

Have you already tried Ollama? Which model specifically did you use? Does it work well for you?

Johell1NS avatar Sep 04 '25 10:09 Johell1NS

Ollama won't work; it must be a tool that deploys the model in an OpenAI-compatible way, such as vLLM. As you mentioned, LM Studio seems to support this.
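If LM Studio is the target, one likely cause of the errors reported above is the missing /v1 path on the base URL. A sketch, assuming LM Studio's local server is running on its default port 1234 and exposes the usual OpenAI-compatible endpoints:

```python
from openai import OpenAI

# LM Studio's local server is OpenAI-compatible; the /v1 suffix is needed,
# so pointing the SDK at http://localhost:1234 alone tends to fail.
client = OpenAI(
    api_key="EMPTY",                       # the key is not checked locally
    base_url="http://localhost:1234/v1",
)

models = client.models.list()              # quick check that the server responds
print([m.id for m in models.data])
```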

qinantong avatar Sep 04 '25 12:09 qinantong

ok, thank you.

Johell1NS avatar Sep 04 '25 13:09 Johell1NS