
Litellm Anthropic method

OSH212 opened this issue Mar 29 '24

Describe the bug

Launching OpenDevin with an Anthropic model (via a LiteLLM proxy) returns an error.

Steps to Reproduce

  1. Set up the LiteLLM proxy
  2. Launch OpenDevin
  3. Give it a prompt
  4. An error is returned

Expected behavior

Expected it to work.

Actual behavior

Doesn't work.

Additional context

➜ OpenDevin git:(main) python3 -m uvicorn opendevin.server.listen:app --port 3000
INFO: Started server process [17111]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)
INFO: ('127.0.0.1', 58349) - "WebSocket /ws" [accepted]
INFO: connection open

============== STEP 0

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

AGENT ERROR:
AnthropicException - {"detail":"Method Not Allowed"}. Handle with `litellm.APIError`.

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 1163, in completion
    response = anthropic.completion(
               ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/anthropic.py", line 213, in completion
    raise AnthropicError(
litellm.llms.anthropic.AnthropicError: {"detail":"Method Not Allowed"}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/sho/OpenDevin/opendevin/controller/__init__.py", line 85, in step
    action = self.agent.step(state)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sho/OpenDevin/agenthub/langchains_agent/__init__.py", line 172, in step
    resp = self.llm.completion(messages=messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 2796, in wrapper
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 2693, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 2093, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 8283, in exception_type
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 7238, in exception_type
    raise APIError(
litellm.exceptions.APIError: AnthropicException - {"detail":"Method Not Allowed"}. Handle with litellm.APIError.

OBSERVATION:
AnthropicException - {"detail":"Method Not Allowed"}. Handle with `litellm.APIError`.

OSH212 commented Mar 29 '24

Best guess is that it's an API key issue

Can you share your exact environment variables and commands for setting up the proxy?

rbren commented Mar 29 '24

@OSH212 Can you start litellm with --detailed_debug? You'll be able to see debug logs for the error raised.

  • What endpoint are you calling? It would help if I could see the curl sent to the proxy (a sketch of what that might look like is below).
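
For illustration, a minimal curl against a locally running LiteLLM proxy would typically target its OpenAI-compatible /chat/completions route; the URL, key, and model name below are placeholders, not the reporter's actual values:

```bash
# Hypothetical request to a LiteLLM proxy listening on port 4000.
# The proxy exposes OpenAI-compatible routes such as /chat/completions;
# POSTing to the bare root path "/" is not one of them.
curl -sS http://127.0.0.1:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-anything" \
  -d '{
        "model": "claude-3-opus-20240229",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```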

ishaan-jaff commented Mar 29 '24

> Best guess is that it's an API key issue
>
> Can you share your exact environment variables and commands for setting up the proxy?

> @OSH212 Can you start litellm with --detailed_debug? You'll be able to see debug logs on the error raised
>
> • What endpoint are you calling? It would help if I can see the curl sent to the proxy

export LLM_API_KEY=""
export LLM_MODEL="claude-3-opus-20240229"
export LLM_BASE_URL="http://0.0.0.0:4000"

export ANTHROPIC_API_KEY=""

~ litellm --model anthropic/claude-3-opus --detailed_debug
INFO: Started server process [21543]
INFO: Waiting for application startup.

03:25:28 - LiteLLM Proxy:DEBUG: utils.py:33 - INITIALIZING LITELLM CALLBACKS!
03:25:28 - LiteLLM:DEBUG: utils.py:857 - callback: <litellm.proxy.hooks.max_budget_limiter._PROXY_MaxBudgetLimiter object at 0x10dab6490>
03:25:28 - LiteLLM:DEBUG: utils.py:857 - callback: <litellm.proxy.hooks.parallel_request_limiter._PROXY_MaxParallelRequestsHandler object at 0x10dab6410>
03:25:28 - LiteLLM:DEBUG: utils.py:857 - callback: <bound method ProxyLogging.response_taking_too_long_callback of <litellm.proxy.utils.ProxyLogging object at 0x10dab62d0>>
03:25:28 - LiteLLM:DEBUG: utils.py:857 - callback: <litellm.proxy.hooks.cache_control_check._PROXY_CacheControlCheck object at 0x10ca57190>
03:25:28 - LiteLLM Proxy:DEBUG: proxy_server.py:2750 - prisma_client: None
03:25:28 - LiteLLM Proxy:DEBUG: proxy_server.py:2754 - custom_db_client client - None
03:25:28 - LiteLLM Proxy:DEBUG: proxy_server.py:2805 - custom_db_client client None. Master_key: None
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:4000 (Press CTRL+C to quit)
INFO: 127.0.0.1:49450 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49451 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49452 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49453 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49454 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49455 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49456 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49457 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49458 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49459 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49460 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49461 - "POST / HTTP/1.1" 405 Method Not Allowed
INFO: 127.0.0.1:49462 - "POST / HTTP/1.1" 405 Method Not Allowed


➜ OpenDevin git:(main) python3 -m uvicorn opendevin.server.listen:app --port 3000
INFO: Started server process [22328]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)
INFO: ('127.0.0.1', 49446) - "WebSocket /ws" [accepted]
INFO: connection open

STEP 0

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

AGENT ERROR:
AnthropicException - {"detail":"Method Not Allowed"}. Handle with `litellm.APIError`.

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 1163, in completion
    response = anthropic.completion(
               ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/anthropic.py", line 213, in completion
    raise AnthropicError(
litellm.llms.anthropic.AnthropicError: {"detail":"Method Not Allowed"}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/sho/OpenDevin/opendevin/controller/__init__.py", line 85, in step
    action = self.agent.step(state)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sho/OpenDevin/agenthub/langchains_agent/__init__.py", line 172, in step
    resp = self.llm.completion(messages=messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 2796, in wrapper
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 2693, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 2093, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 8283, in exception_type
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 7238, in exception_type
    raise APIError(
litellm.exceptions.APIError: AnthropicException - {"detail":"Method Not Allowed"}. Handle with litellm.APIError.

OBSERVATION:
AnthropicException - {"detail":"Method Not Allowed"}. Handle with `litellm.APIError`.

(STEP 1 through STEP 4 repeat the same AGENT ERROR, traceback, and OBSERVATION as STEP 0.)

OSH212 commented Mar 30 '24
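
Editorial note: in the proxy log above, every request from OpenDevin arrives as POST / and is rejected with 405 Method Not Allowed, i.e. the client is posting to the proxy's root path rather than to one of its routes. The same 405 can be reproduced by hand with a hypothetical one-liner like:

```bash
# Reproduces the 405 seen in the proxy log: the root path "/" does not accept POST.
curl -i -X POST http://127.0.0.1:4000/ \
  -H "Content-Type: application/json" \
  -d '{}'
```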

> export LLM_BASE_URL="http://0.0.0.0:4000/"

This looks wrong. It should probably be export LLM_BASE_URL="http://127.0.0.1:4000/"

rbren commented Mar 30 '24

Also, how are you serving Claude locally? Shouldn't BASE_URL be pointing to Anthropic's API?

rbren commented Mar 30 '24

> export LLM_BASE_URL="http://127.0.0.1:4000/"

You're right. My error was to use the local litellm URL as the base URL. I updated the URL to "https://api.anthropic.com/v1/messages" and it doesn't return the error anymore.

Thank you

OSH212 commented Mar 30 '24
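
Editorial note for readers who hit the same 405: a LiteLLM proxy serves OpenAI-compatible routes such as /chat/completions rather than a provider-native endpoint at its root, so the two setups discussed in this thread might look roughly like the sketch below. The variable names follow OpenDevin's configuration at the time; the key values and the "openai/" model prefix for talking to a LiteLLM proxy are assumptions worth checking against current docs.

```bash
# Option A: point the base URL at Anthropic's API, as the reporter ended up doing.
export ANTHROPIC_API_KEY="sk-ant-..."               # placeholder
export LLM_MODEL="claude-3-opus-20240229"
export LLM_BASE_URL="https://api.anthropic.com/v1/messages"

# Option B (assumption, not tested in this thread): keep the local proxy, but
# address it through its OpenAI-compatible route instead of the bare root path.
export LLM_BASE_URL="http://127.0.0.1:4000"
export LLM_MODEL="openai/claude-3-opus-20240229"    # "openai/" prefix is an assumed way to route via the proxy
export LLM_API_KEY="sk-anything"                    # placeholder; the proxy above runs without a master key
```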