Use OpenAI-compatible APIs when creating models (/conversations/models)
I think it would be very useful to be able to create conversation models against any OpenAI-compatible API; right now only OpenAI, Cloudflare and vLLM are supported. I know the easy answer is "just use vLLM", but the change seems small if it were added as an option alongside the existing OpenAI case, the same way an OpenAI-compatible API option was added for embedding fields in a collection schema (https://typesense.org/docs/26.0/api/vector-search.html#using-openai-compatible-apis). This would not only help reduce costs, but above all help projects where complete privacy is required, and it would simplify and broaden the options for running models locally. A rough sketch of the request body I have in mind is below.
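For illustration only, here is a minimal sketch of the body I would expect to send to `POST /conversations/models`, assuming a hypothetical `api_url` field; the other field names are taken from the existing conversation model parameters as I understand them, and `api_url` does not exist today:

```cpp
// Sketch only: the JSON body I have in mind for POST /conversations/models.
// "api_url" is hypothetical and is not a real Typesense parameter today.
#include <iostream>
#include <nlohmann/json.hpp>

int main() {
    nlohmann::json model = {
        {"id", "local-llama"},
        {"model_name", "openai/llama3"},          // keep the existing "openai/" prefix
        {"api_key", "unused-by-local-servers"},   // many local servers ignore the key
        {"api_url", "http://localhost:11434"},    // hypothetical: OpenAI-compatible base URL
        {"system_prompt", "You are a helpful assistant."},
        {"max_bytes", 16384}
    };

    // Print the body that would be sent to POST /conversations/models.
    std::cout << model.dump(2) << std::endl;
    return 0;
}
```

A model created like this could then route its chat-completion calls to the custom base URL instead of api.openai.com.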
This is something I would look forward to as well, since Ollama has an OpenAI-compatible API that I had hoped to use, but then I discovered that the OpenAI URLs are hardcoded:
https://github.com/typesense/typesense/blob/f2d1e23341be09240138124119fc66dfc177da45/include/conversation_model.h#L49-L50
I'm no C++ dev, but if someone can point me in the right direction, I can make the changes; a rough sketch of what I have in mind follows the links below.
https://ollama.com/blog/openai-compatibility
https://github.com/ollama/ollama/blob/main/docs/openai.md
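Picking up on the hardcoded URLs linked above, here is a rough sketch (not the actual implementation) of the kind of change I mean: resolve the base URL from the model definition and only fall back to the OpenAI default when no custom URL is set. The `api_url` field and the constant names below are assumptions.

```cpp
// Sketch only, not the real conversation_model.h: resolve the chat-completions
// base URL from the model definition instead of always using the hardcoded
// OpenAI constant. The "api_url" field and the constant names are assumptions.
#include <string>
#include <nlohmann/json.hpp>

static const std::string OPENAI_URL = "https://api.openai.com";
static const std::string OPENAI_CHAT_COMPLETIONS_PATH = "/v1/chat/completions";

// Returns the full chat-completions endpoint for a given conversation model.
std::string resolve_chat_completion_url(const nlohmann::json& model_config) {
    std::string base_url = OPENAI_URL;  // current behaviour: always OpenAI

    // If the model was created with a non-empty "api_url", use it instead,
    // e.g. "http://localhost:11434" for Ollama's OpenAI-compatible server.
    if (model_config.contains("api_url") && model_config["api_url"].is_string()) {
        const auto custom_url = model_config["api_url"].get<std::string>();
        if (!custom_url.empty()) {
            base_url = custom_url;
        }
    }

    return base_url + OPENAI_CHAT_COMPLETIONS_PATH;
}
```

Existing OpenAI models would keep working unchanged, while anything that speaks the OpenAI chat-completions protocol (such as Ollama) could be targeted via the custom URL.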
Part of v29.0 GA release.