Allow passing `gr.load_chat(..., token=None)`, `gr.load_chat(..., token="")`, or `gr.load_chat(...)` without any token at all (for local `vllm serve`-d models, an empty token is fine)
Describe the bug
This is useful when using Gradio to connect to a local `vllm serve` instance, where no token is needed, so passing no token should be supported.
Currently, passing `token=None` (or omitting `token=` entirely) raises this error:
Traceback (most recent call last):
  File "/mnt/fs/ml/demo.py", line 6, in <module>
    gr.load_chat(f"http://{host}:{port}/v1/", model="demo", token=None).launch(share=True)
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/gradio/external.py", line 809, in load_chat
    client = OpenAI(api_key=token, base_url=base_url)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/openai/_client.py", line 124, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
gr.load_chat(f"http://{host}:{port}/v1/", model="demo", token="").launch(share=True)
gives the error:
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/openai/_base_client.py", line 969, in request
    response = self._client.send(
               ^^^^^^^^^^^^^^^^^^
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/httpx/_client.py", line 1014, in _send_single_request
    response = transport.handle_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/httpx/_transports/default.py", line 249, in handle_request
    with map_httpcore_exceptions():
         ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/mnt/fs/venv_cu126_py312/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.LocalProtocolError: Illegal header value b'Bearer '
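A possible workaround in the meantime (assuming the local `vllm serve` was started without `--api-key`, so it ignores the Authorization header) is to pass any non-empty placeholder token, e.g.:

```python
import gradio as gr

host, port = "localhost", 8000  # placeholders for a local `vllm serve` endpoint

# Any non-empty string avoids both the OpenAIError (token=None) and the
# illegal empty "Bearer " header (token=""); vLLM ignores the value when
# started without --api-key.
gr.load_chat(
    f"http://{host}:{port}/v1/",
    model="demo",
    token="EMPTY",  # arbitrary placeholder, not a real credential
).launch(share=True)
```

A fix on the Gradio side could simply map a falsy `token` to such a placeholder before constructing the OpenAI client.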
Have you searched existing issues? 🔎
- [x] I have searched and found no existing issues
Reproduction
Please see above
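For completeness, a minimal script with the two failing calls (host/port are placeholders for a local `vllm serve` endpoint, model name taken from the traceback above; run one call at a time):

```python
import gradio as gr

host, port = "localhost", 8000  # placeholders for a local `vllm serve` endpoint

# Raises openai.OpenAIError: "The api_key client option must be set ..."
gr.load_chat(f"http://{host}:{port}/v1/", model="demo", token=None).launch(share=True)

# Raises httpx.LocalProtocolError: Illegal header value b'Bearer '
gr.load_chat(f"http://{host}:{port}/v1/", model="demo", token="").launch(share=True)
```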
Screenshot
No response
Logs
System Info
Gradio Environment Information:
------------------------------
Operating System: Linux
gradio version: 5.32.1
gradio_client version: 1.10.2
------------------------------------------------
gradio dependencies in your environment:
aiofiles: 24.1.0
anyio: 4.9.0
audioop-lts is not installed.
fastapi: 0.115.12
ffmpy: 0.6.0
gradio-client: 1.10.2
groovy: 0.1.2
httpx: 0.28.1
huggingface-hub: 0.31.2
jinja2: 3.1.6
markupsafe: 2.1.5
numpy: 2.1.2
orjson: 3.10.18
packaging: 25.0
pandas: 2.2.3
pillow: 11.0.0
pydantic: 2.11.4
pydub: 0.25.1
python-multipart: 0.0.20
pyyaml: 6.0.2
ruff: 0.11.12
safehttpx: 0.1.6
semantic-version: 2.10.0
starlette: 0.46.2
tomlkit: 0.13.2
typer: 0.15.4
typing-extensions: 4.12.2
urllib3: 2.4.0
uvicorn: 0.34.2
mcp is not installed.
pydantic: 2.11.4
authlib is not installed.
itsdangerous is not installed.
gradio_client dependencies in your environment:
fsspec: 2024.6.1
httpx: 0.28.1
huggingface-hub: 0.31.2
packaging: 25.0
typing-extensions: 4.12.2
websockets: 15.0.1
Severity
I can work around it