
Add support for other API endpoints

Open fbarbe00 opened this issue 1 year ago • 4 comments

It would be nice if HuggingChat could be used locally while calling remote LLM endpoints other than OpenAI. For instance, this could be mistral.ai's API endpoints (same as OpenAI, the only difference being the model name), or a custom server configured for it.

Perhaps just adding a variable in the .env file defining the server? This seems like an easy feature; I could try implementing it myself if I get the time to look a bit more into the code (for instance, figuring out where the model name can be changed). https://github.com/huggingface/chat-ui/blob/ee47ff37fddb70f78d1ef8a293d8ed3fbcd24ff9/src/lib/server/endpoints/openai/endpointOai.ts#L13C1-L13C65
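A rough sketch of what this could look like (not the actual endpointOai.ts code, and the environment variable names below are hypothetical, not chat-ui's real configuration keys): the OpenAI Node client already accepts a configurable `baseURL`, so only the server URL and model name would need to change.

```ts
// Sketch only: CUSTOM_OPENAI_BASE_URL and CUSTOM_OPENAI_MODEL are made-up names for illustration.
import OpenAI from "openai";

// e.g. CUSTOM_OPENAI_BASE_URL=https://api.mistral.ai/v1
const baseURL = process.env.CUSTOM_OPENAI_BASE_URL ?? "https://api.openai.com/v1";
const model = process.env.CUSTOM_OPENAI_MODEL ?? "gpt-3.5-turbo";

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL, // any OpenAI-compatible server: mistral.ai, a local vLLM instance, etc.
});

const completion = await client.chat.completions.create({
  model,
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(completion.choices[0].message.content);
```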

fbarbe00 avatar Jan 18 '24 18:01 fbarbe00

> It would be nice if HuggingChat could be used locally while calling remote LLM endpoints other than OpenAI. For instance, this could be mistral.ai's API endpoints (same as OpenAI, the only difference being the model name), or a custom server configured for it.
>
> Perhaps just adding a variable in the .env file defining the server? This seems like an easy feature; I could try implementing it myself if I get the time to look a bit more into the code (for instance, figuring out where the model name can be changed). https://github.com/huggingface/chat-ui/blob/ee47ff37fddb70f78d1ef8a293d8ed3fbcd24ff9/src/lib/server/endpoints/openai/endpointOai.ts#L13C1-L13C65

Most providers offer an OpenAI-compatible endpoint. For instance, I have no problems using Mixtral served from the Together.ai endpoint.

gururise avatar Jan 18 '24 18:01 gururise

Hi :wave: We support a few different types of API endpoints. If your inference server has an OpenAI-compatible endpoint, have a look at this section of the README.

If that doesn't suit your needs, or you would like to see a custom endpoint, please let me know!
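For anyone landing here, that configuration lives in the `MODELS` variable of `.env.local`. The snippet below is a rough sketch modelled on the README's OpenAI-compatible example, pointing a model at mistral.ai's API; double-check the exact field names and base URL against the linked README section.

```env
MODELS=`[
  {
    "name": "mistral-medium",
    "displayName": "Mistral Medium",
    "endpoints": [
      {
        "type": "openai",
        "baseURL": "https://api.mistral.ai/v1"
      }
    ]
  }
]`
```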

nsarrazin avatar Jan 19 '24 09:01 nsarrazin

Hello @nsarrazin, do you plan to add support for the openai endpoint type for embedding and multi-modal models? :) Thanks!

Extremys avatar Jan 25 '24 14:01 Extremys

@nsarrazin It would be very cool if we could have an override endpoint env variable and also a way to include arbitrary headers in the request. That way an API that is slightly different from the standard OpenAI endpoint could still be used.
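As a sketch of what that request could look like (this is not an existing chat-ui feature, and the variable names are hypothetical), the underlying OpenAI Node client already supports a `defaultHeaders` option, so an override base URL plus extra headers could in principle be wired through like this:

```ts
// Sketch of the requested behaviour; OPENAI_BASE_URL_OVERRIDE and OPENAI_EXTRA_HEADERS are hypothetical names.
import OpenAI from "openai";

// e.g. OPENAI_EXTRA_HEADERS={"x-api-version":"2024-01-01","x-tenant":"my-team"}
const extraHeaders = JSON.parse(process.env.OPENAI_EXTRA_HEADERS ?? "{}");

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: process.env.OPENAI_BASE_URL_OVERRIDE, // falls back to the default OpenAI URL when unset
  defaultHeaders: extraHeaders,                  // merged into every request
});
```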

edmcquinn avatar Jan 25 '24 17:01 edmcquinn