
Add support for custom or local LLM endpoints

Open jagad89 opened this issue 1 year ago • 4 comments

jagad89 avatar Jan 22 '24 06:01 jagad89

@jagad89 We don't support this out of the box.

The plan is to enable local endpoints whose APIs are compatible with OpenAI's. You could then create a provider with the base URL set to your endpoint, similar to the example below, and change your models' providers in the Settings as well.

[image]
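To make the idea concrete, here is a minimal sketch of what an "OpenAI-compatible" endpoint looks like from the client side. The base URL, API key, and model name are placeholders for whatever your local server exposes; this is not DevChat's own code, only an illustration of the compatibility requirement.

```python
# Minimal sketch: calling an OpenAI-compatible endpoint by overriding the base URL.
# The URL, key, and model name below are placeholders for your local server's values.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # your custom or intranet endpoint
    api_key="not-needed-for-local",       # many local servers ignore the key
)

response = client.chat.completions.create(
    model="local-model-name",             # whatever model your endpoint serves
    messages=[{"role": "user", "content": "Hello from DevChat"}],
)
print(response.choices[0].message.content)
```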

Would this meet your requirement? Could you tell us more about which local LLMs you are using and the scenarios you plan to use DevChat for?

In any event, we will let you know when the above feature is ready.

TODOs:

  • @yangbobo2021 make sure the Settings are synced with ~/.chat/config.yml and that new providers in the config file are shown in Settings (a rough sketch of reading the config file follows this list).
  • @basicthinker write docs for this feature.
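As a rough sketch of what that sync check could look like, the snippet below reads ~/.chat/config.yml and prints the provider names it finds. The top-level providers key is an assumption about the file's layout, not a confirmed schema.

```python
# Hedged sketch: list provider names found in ~/.chat/config.yml.
# Assumes providers live under a top-level "providers" key, which may
# differ from DevChat's actual schema.
from pathlib import Path
import yaml  # PyYAML

config_path = Path.home() / ".chat" / "config.yml"
config = yaml.safe_load(config_path.read_text()) or {}

for name in config.get("providers", {}):
    print(name)
```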

basicthinker avatar Jan 22 '24 11:01 basicthinker

I will try the settings with a few custom and local endpoints and keep this thread posted.

Basically, we will be able to use a local LLM over the intranet without an internet connection.

jagad89 avatar Jan 23 '24 16:01 jagad89

[image]

The configuration entry point has changed. Additionally, for a local model, the model configuration file currently has to be edited manually. Editing it through the interface is not supported yet, but that change is planned, along with other customer requests, as a next step.

runjinz avatar Apr 07 '24 03:04 runjinz

I would like to use this extension with a local LLM so I don't have to sign up for a paid service. Ollama is a great way to set up a local endpoint on a server or developer workstation.

https://ollama.com/
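For anyone trying this with Ollama, a quick way to confirm the local server is reachable before wiring it into a provider is to query its model list. The host and port below are Ollama's defaults; adjust them if your server listens elsewhere.

```python
# Quick reachability check for a local Ollama server (default port 11434).
# GET /api/tags returns the models available locally.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()

for model in resp.json().get("models", []):
    print(model["name"])
```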

trevorstr avatar May 10 '24 15:05 trevorstr