Librum-Server
Self-Hosted AI Server support
Added support for using self-hosted AI models. The self-hosted AI server must expose an OpenAI-compatible API.
Most self-hosted API servers require a `model` parameter in their requests, and the model name differs from OpenAI's model names. With the current architecture I therefore cannot use any self-hosted server. One possible solution is to pass the `model` parameter as an environment variable.
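A minimal Python sketch of that idea: read the model name (and base URL) from environment variables and inject them into an OpenAI-compatible `/v1/chat/completions` request. The variable names `SELFHOSTED_AI_MODEL` and `SELFHOSTED_AI_URL` are hypothetical placeholders, not existing Librum-Server settings.

```python
import json
import os
import urllib.request


def build_chat_request(prompt: str):
    """Build an OpenAI-compatible chat-completions request whose `model`
    field comes from an environment variable instead of a hard-coded
    OpenAI model name. Env var names here are illustrative only."""
    base_url = os.environ.get("SELFHOSTED_AI_URL", "http://localhost:8080")
    model = os.environ.get("SELFHOSTED_AI_MODEL", "gpt-3.5-turbo")
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return f"{base_url}/v1/chat/completions", payload


def send_chat_request(prompt: str) -> str:
    """Send the request to the self-hosted server and return the reply text."""
    url, payload = build_chat_request(prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the payload shape is the same as OpenAI's, only the `model` value and base URL need to change, which is why environment variables are enough to support servers like llama.cpp, Ollama, or LocalAI in OpenAI-compatibility mode.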