Feature Request: Support Custom Base URL for OpenAI (Azure) Integration
Description:
I'd like to request a feature that allows users to specify a custom base URL when interacting with OpenAI through this library/framework. This would enable integration with custom OpenAI deployments hosted on Azure, or help satisfy specific security requirements.
Motivation:
Supporting custom base URLs would provide several benefits:
- Increased flexibility for users with private or on-premises OpenAI deployments.
- Enhanced compatibility with internal security policies that might restrict access to the default OpenAI API endpoint.
- Streamlined integration with existing workflows that involve custom OpenAI instances.
Proposed Solution:
One possible implementation could involve adding a configuration option like baseUrl when initializing the OpenAI client. This option would allow users to specify the desired base URL for API requests.
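A minimal sketch of what such an option could look like. Note that `Client` and its constructor signature here are hypothetical, for illustration only; they are not the library's actual API:

```python
from typing import Optional

DEFAULT_BASE_URL = "https://api.openai.com/v1"

class Client:
    """Hypothetical client illustrating a configurable base URL option."""

    def __init__(self, api_key: str, base_url: Optional[str] = None):
        self.api_key = api_key
        # Fall back to the default public endpoint when no custom URL is given.
        self.base_url = base_url or DEFAULT_BASE_URL

# A caller could then point the client at a custom deployment:
client = Client(api_key="sk-...", base_url="https://xyz.openai.azure.com")
```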
Additional Considerations:
- It would be important to validate the provided base URL to ensure it is a well-formed URL.
- Error handling mechanisms should be in place to gracefully handle cases where the custom base URL is unreachable or returns errors.
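The validation step could be sketched with only the standard library. `validate_base_url` is an illustrative helper, not part of any existing library:

```python
from urllib.parse import urlparse

def validate_base_url(url: str) -> str:
    """Raise ValueError unless url looks like a usable http(s) base URL."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError(f"invalid base URL: {url!r}")
    # Strip a trailing slash so path joining behaves predictably.
    return url.rstrip("/")

validate_base_url("https://xyz.openai.azure.com")  # accepted
```

Handling an unreachable URL is a runtime concern (connection errors, timeouts) and would belong in the request layer rather than in this constructor-time check.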
Labels: feature-request, openai, azure
The issue you've raised can be resolved by leveraging environment variables, which provide a straightforward and elegant solution without introducing unnecessary complexity. The OpenAI libraries have built-in support for configuring the API base URL through an environment variable, eliminating the need for custom implementations or workarounds.
By setting the OPENAI_BASE_URL environment variable to the desired reverse proxy URL, such as export OPENAI_BASE_URL=https://your.reverse.proxy.url/, the OpenAI libraries automatically detect and utilize this configuration. This behavior is clearly evident in the OpenAI Python library's source code, specifically in the _client.py file at line 108 (https://github.com/openai/openai-python/blob/5cfb125acce0e8304d12bdd39b405071021db658/src/openai/_client.py#L108):
```python
if base_url is None:
    base_url = os.environ.get("OPENAI_BASE_URL")
if base_url is None:
    base_url = f"https://api.openai.com/v1"
```
As demonstrated in the code snippet above, the library first checks if a base_url parameter is explicitly provided. If not, it proceeds to retrieve the value of the OPENAI_BASE_URL environment variable. Only if both the parameter and environment variable are absent does it fall back to the default OpenAI API base URL.
By leveraging this built-in functionality, you can seamlessly configure the OpenAI libraries to use your reverse proxy URL without the need for additional code modifications or complex setup procedures. This approach ensures maintainability, portability, and adherence to best practices, as it relies on a well-established and widely-used mechanism for configuration management.
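The resolution order described above (explicit argument first, then the OPENAI_BASE_URL environment variable, then the public default) can be reproduced as a standalone sketch, without importing the library itself:

```python
import os
from typing import Optional

def resolve_base_url(base_url: Optional[str] = None) -> str:
    """Mirror the OpenAI client's lookup order: explicit argument,
    then the OPENAI_BASE_URL environment variable, then the default."""
    if base_url is None:
        base_url = os.environ.get("OPENAI_BASE_URL")
    if base_url is None:
        base_url = "https://api.openai.com/v1"
    return base_url

# With the variable set, the proxy URL wins over the default:
os.environ["OPENAI_BASE_URL"] = "https://your.reverse.proxy.url/"
print(resolve_base_url())
```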
Hi, this isn't working for me. I tried updating the base URL as described, but it still didn't work. What did work was setting openai.api_base = "proxy_url".
@segmentationf4u1t I have set the following env vars, however I get openai.NotFoundError error when I send any message.
export OPENAI_BASE_URL=https://xyz.openai.azure.com
export AZURE_OPENAI_ENDPOINT=https://xyz.openai.azure.com
export AZURE_OPENAI_API_KEY=xyz
export OPENAI_API_VERSION=2023-05-15
That's correct; Azure may cause trouble here (I'm not certain how Azure handles the OpenAI API), specifically in terms of payload and response standards. Note that the OPENAI_BASE_URL environment variable is intended primarily for reverse proxies and/or inference-stack clones of the base API, as mentioned above.
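One likely reason an Azure endpoint in OPENAI_BASE_URL yields NotFoundError is that Azure OpenAI routes requests through a per-deployment path plus an api-version query parameter, unlike the flat OpenAI path. A sketch of the difference (the resource and deployment names below are placeholders):

```python
# OpenAI-style request path:
openai_url = "https://api.openai.com/v1/chat/completions"

# Azure-style request path: the deployment name and api-version
# are part of the route, so a bare base-URL swap produces a 404.
resource = "xyz"        # placeholder Azure resource name
deployment = "gpt-35"   # placeholder deployment name
api_version = "2023-05-15"
azure_url = (
    f"https://{resource}.openai.azure.com"
    f"/openai/deployments/{deployment}/chat/completions"
    f"?api-version={api_version}"
)
print(azure_url)
```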
https://github.com/stitionai/devika/pull/69; duplicate of https://github.com/stitionai/devika/issues/55