Support OPENAI_BASE_URL in addition to OPENAI_API_BASE
## Title
`OPENAI_BASE_URL` is the new canonical ENV variable. This change makes it the primary variable while retaining support for `OPENAI_API_BASE` as a fallback during the transition.
## Relevant issues
Fixes #7829
## Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- [x] I have added testing in the `tests/litellm/` directory (adding at least 1 test is a hard requirement)
- [x] I have added a screenshot of my new test passing locally
- [x] My PR passes all unit tests on [make test-unit](https://docs.litellm.ai/docs/extras/contributing_code)
- [x] My PR's scope is as isolated as possible; it only solves 1 specific problem
## Type
🆕 New Feature
## Changes
- Add primary env support for `OPENAI_BASE_URL` in all places `OPENAI_API_BASE` was used
- Correct docs so that both variables include the `/v1` suffix
- Add an example unit test proving both ENV variables work
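The fallback described in the changes above can be sketched roughly as follows. This is a minimal illustration, not LiteLLM's actual internal code; `resolve_openai_base_url` is a hypothetical helper name.

```python
import os

def resolve_openai_base_url(env=None):
    """Prefer the canonical OPENAI_BASE_URL, falling back to the
    legacy OPENAI_API_BASE during the transition period."""
    env = os.environ if env is None else env
    return env.get("OPENAI_BASE_URL") or env.get("OPENAI_API_BASE")

# The canonical variable wins when both are set; either alone resolves.
print(resolve_openai_base_url({"OPENAI_BASE_URL": "https://proxy.example/v1",
                               "OPENAI_API_BASE": "https://legacy.example/v1"}))
```

A unit test for this behavior only needs to exercise the three cases: canonical set, legacy set, and both set.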
tested manually with a demo app and all good. cc @xrmx @anuraaga
diff --git a/main.py b/main.py
index 358fafb..f4518c8 100644
--- a/main.py
+++ b/main.py
@@ -4,8 +4,6 @@ import litellm
from litellm import completion
from litellm.integrations.opentelemetry import OpenTelemetry, OpenTelemetryConfig
-# LiteLLM uses old ENV until https://github.com/BerriAI/litellm/issues/7829
-os.environ["OPENAI_API_BASE"] = os.getenv("OPENAI_BASE_URL")
# LiteLLM uses custom ENV until https://github.com/BerriAI/litellm/issues/9901
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT") + "/v1/traces"
otel_config = OpenTelemetryConfig(exporter="otlp_http", endpoint=otlp_endpoint)
diff --git a/requirements.txt b/requirements.txt
index 5d3b9da..34c14c4 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,5 +1,6 @@
# current versions are missing packages if you only use 'litellm'
-litellm[proxy]~=1.65.6
+# litellm[proxy]~=1.65.6
+litellm[proxy] @ git+https://github.com/codefromthecrypt/litellm.git@OPENAI_BASE_URL
opentelemetry-sdk~=1.32.0
opentelemetry-exporter-otlp-proto-http~=1.32.0
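For reference, the OTLP endpoint construction in the demo's `main.py` can be isolated like this. The `rstrip` guard against a trailing slash and the localhost default are additions here, not part of the demo:

```python
import os

def otlp_traces_endpoint(env=None):
    """Build the OTLP/HTTP traces endpoint the way the demo's main.py
    does, by appending /v1/traces to the collector base URL.
    (Trailing-slash handling and the default are assumptions.)"""
    env = os.environ if env is None else env
    base = env.get("OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4318")
    return base.rstrip("/") + "/v1/traces"
```

This workaround exists because LiteLLM reads a custom ENV for the exporter endpoint until https://github.com/BerriAI/litellm/issues/9901 is resolved.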
Long awaited update @codefromthecrypt thanks for your work on this!
When can we expect this to be merged?
reviewed, lmk once addressed @codefromthecrypt
@ishaan-jaff over to you!
@ishaan-jaff PTAL I think we're good!
@ishaan-jaff nudge mainly to get this behind us. I have a lot of little code I would like to tidy up once released. Thanks for your review despite so many things on the plate!