bloop
Support for custom OpenAI-Compatible backends
What's the problem?
Local large language models served through LocalAI, LM Studio, and similar tools all expose an OpenAI-compatible API, but the application needs to expose a setting to change the base URL.
What's the outcome?
- Ability to use Azure OpenAI services, for people who want an alternative host to OpenAI.
- Ability to use custom large language models (e.g. local ones) that are compatible with the OpenAI API.
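To illustrate the request: OpenAI-compatible backends share the same request shape, so supporting them mostly comes down to making the base URL configurable. A minimal sketch (the helper name, the LocalAI port, and the model name are assumptions, not part of bloop's code):

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Hypothetical helper: join a configurable base URL with the standard
    /chat/completions path and build the JSON body expected by any
    OpenAI-compatible API."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

# Switching between the official API and a local backend is then just a
# matter of swapping the base URL; the rest of the client stays the same:
official_url, _ = build_chat_request("https://api.openai.com/v1", "gpt-4", "hi")
local_url, _ = build_chat_request("http://localhost:8080/v1", "local-model", "hi")
```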
Related Issues: #415 #1094
The ability to use custom backends is really needed, and Azure OpenAI support as well, even as a paid feature.
Yes, AzureOpenAI support, please.
We've open-sourced the OpenAI API logic (https://github.com/BloopAI/bloop/tree/oss/server/bleep/src/llm). You can now build and run bloop with a custom OpenAI API key (see: https://github.com/BloopAI/bloop?tab=readme-ov-file#building-from-source).
Feel free to open a PR adding support for Azure OpenAI 😀