litellm.AuthenticationError: AuthenticationError: OpenAIException - UNAUTHENTICATED
```
litellm.AuthenticationError: AuthenticationError: OpenAIException - UNAUTHENTICATED

Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 969, in async_streaming
    headers, response = await self.make_openai_chat_completion_request(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 135, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 436, in make_openai_chat_completion_request
    raise e
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 418, in make_openai_chat_completion_request
    await openai_aclient.chat.completions.with_raw_response.create(
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_legacy_response.py", line 381, in wrapped
    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2589, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_base_client.py", line 1794, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_base_client.py", line 1594, in request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'message': 'UNAUTHENTICATED', 'success': False}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  8 stack lines skipped <<<
  File "/a0/models.py", line 499, in unified_call
    _completion = await acompletion(
                  ^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/utils.py", line 1586, in wrapper_async
    raise e
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/utils.py", line 1437, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/main.py", line 563, in acompletion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2301, in exception_type
    raise e
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 456, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - UNAUTHENTICATED
```
This happened after installing by following the docs. I used working API keys and tested them before use.
This looks like a misconfiguration of the model provider or API key. Can you check those?
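To help narrow it down, here is a small local sanity check for the key configuration. It makes no network calls and only inspects environment variables; the names `OPENAI_API_KEY` and `OPENAI_BASE_URL` are the OpenAI-style defaults that the OpenAI SDK (and litellm, when routing to OpenAI) falls back to — this is a sketch under that assumption, so adjust the variable names if your setup stores the key elsewhere (e.g. in a settings file):

```python
import os

def check_openai_config():
    """Return a list of problems found in the local OpenAI-style env config.

    Purely local checks -- catches the common causes of a 401/UNAUTHENTICATED:
    an unset key, stray whitespace from copy-paste, or a malformed base URL.
    """
    problems = []

    key = os.environ.get("OPENAI_API_KEY", "")
    if not key:
        problems.append("OPENAI_API_KEY is not set")
    elif key != key.strip():
        problems.append("OPENAI_API_KEY has leading/trailing whitespace")
    elif not key.startswith("sk-"):
        # OpenAI keys start with 'sk-'; other providers may differ.
        problems.append("OPENAI_API_KEY does not look like an OpenAI key (no 'sk-' prefix)")

    base = os.environ.get("OPENAI_BASE_URL")
    if base and not base.startswith(("http://", "https://")):
        problems.append("OPENAI_BASE_URL is not an http(s) URL")

    return problems

if __name__ == "__main__":
    for line in check_openai_config() or ["config looks plausible"]:
        print(line)
```

If this reports no problems but the 401 persists, the key that the running process sees may differ from the one you tested in your shell (for example, a Docker container started without the variable), which would be worth checking next.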