gpt4all
[Feature] Connect to OpenAI compatible server other than ChatGPT
Feature request
It should be possible to add a custom deployment endpoint pointing to any OpenAI-compatible instance (in this case Azure). Since the API is the same, it should not be too difficult to implement.
Motivation
To use your own OpenAI instance. To reduce usage cost. To control your data. To control the system prompt.
Your contribution
I could help in testing.
Unclear what you are asking for here.
I have the same question. Azure OpenAI is the same as OpenAI but with different endpoints. Because the application only lets you set an API key for GPT models, not the endpoint, the Azure OpenAI service can't be used. Some libraries do implement Azure OpenAI support, as described at https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/switching-endpoints For the REST API, the documentation is here: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
The OpenAI API URL is defined here: https://github.com/nomic-ai/gpt4all/blob/4d855afe973a08ad81966d22adaadc5916b9126d/gpt4all-chat/chatgpt.cpp#L157
If there were an option to configure the OpenAI API endpoint, e.g. QUrl openaiUrl("https://example-endpoint.openai.azure.com/v1/chat/completions"); it would allow users to interact with their Azure OpenAI instance, since both APIs are the same.
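One way the change could look (a minimal sketch, not the actual gpt4all implementation; `completionsUrl` is a hypothetical helper, and the fallback to the official OpenAI host is an assumption about the desired default behavior):

```cpp
#include <cassert>
#include <string>

// Hypothetical helper: build the chat-completions URL from a user-configurable
// base URL instead of hard-coding https://api.openai.com in chatgpt.cpp.
// An empty setting falls back to the official OpenAI endpoint.
std::string completionsUrl(const std::string &baseUrlSetting) {
    std::string base = baseUrlSetting.empty() ? "https://api.openai.com"
                                              : baseUrlSetting;
    // Strip a trailing slash so the join stays clean.
    if (base.back() == '/')
        base.pop_back();
    return base + "/v1/chat/completions";
}
```

The result could then be passed straight to `QUrl` at the call site, e.g. `QUrl openaiUrl(QString::fromStdString(completionsUrl(setting)));`, leaving the default behavior unchanged when no custom endpoint is configured.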
any update on this?
Is this something we can expect to see?