Anthropic Claude 3 Opus model does not work with LiteLLM, API not supported
I need to use LiteLLM with TruLens to access the Anthropic API. I took an example from the official LiteLLM documentation and tried to access the Anthropic Claude 3 Opus model (claude-3-opus-20240229), and it raises an error.
How to reproduce (assumes ANTHROPIC_API_KEY is set in the environment):

from litellm import completion

messages = [{"role": "user", "content": "Hey! how's it going?"}]
response = completion(model="claude-3-opus-20240229", messages=messages)
print(response)
Error message
07:45:17 - LiteLLM:INFO:
POST Request Sent from LiteLLM: curl -X POST https://api.anthropic.com/v1/complete -H 'accept: application/json' -H 'anthropic-version: 2023-06-01' -H 'content-type: application/json' -H -d '{'model': 'claude-3-opus-20240229', 'prompt': "\n\nHuman: Hey! how's it going?\n\nAssistant: ", 'max_tokens_to_sample': 256}'
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.
Traceback (most recent call last):
  File "/Users/aabor/projects/rag/venv/lib/python3.11/site-packages/litellm/main.py", line 1020, in completion
    response = anthropic.completion(
               ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aabor/projects/rag/venv/lib/python3.11/site-packages/litellm/llms/anthropic.py", line 170, in completion
    raise AnthropicError(
litellm.llms.anthropic.AnthropicError: {"type":"error","error":{"type":"invalid_request_error","message":"\"claude-3-opus-20240229\" is not supported on this API. Please use the Messages API instead."}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/Users/aabor/projects/rag/venv/lib/python3.11/site-packages/litellm/utils.py", line 2481, in wrapper
    raise e
  File "/Users/aabor/projects/rag/venv/lib/python3.11/site-packages/litellm/utils.py", line 2384, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/aabor/projects/rag/venv/lib/python3.11/site-packages/litellm/main.py", line 1897, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/Users/aabor/projects/rag/venv/lib/python3.11/site-packages/litellm/utils.py", line 7520, in exception_type
    raise e
  File "/Users/aabor/projects/rag/venv/lib/python3.11/site-packages/litellm/utils.py", line 6478, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"\"claude-3-opus-20240229\" is not supported on this API. Please use the Messages API instead."}}
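For context: the request log above shows LiteLLM calling the legacy /v1/complete (Text Completions) endpoint, while Claude 3 models are only served through the Messages API (/v1/messages). As a sanity check, calling the Messages API directly works for this model (a minimal sketch, assuming the official anthropic Python SDK is installed and ANTHROPIC_API_KEY is set):

import anthropic

# The client reads ANTHROPIC_API_KEY from the environment by default.
client = anthropic.Anthropic()

# Claude 3 models must be called via the Messages API, not /v1/complete.
response = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hey! how's it going?"}],
)
print(response.content[0].text)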
This model works when I access it through other packages, for example llama_index:
from llama_index.llms.anthropic import Anthropic as LlamaIndexAnthropic
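Continuing from that import, a minimal usage sketch of how I call it (the model name and prompt here are illustrative):

# Build a llama_index Anthropic LLM and run a single completion.
llm = LlamaIndexAnthropic(model="claude-3-opus-20240229")
response = llm.complete("Hey! how's it going?")
print(response)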