
/model autocomplete confusion

Open • Yona544 opened this issue 1 year ago • 1 comment

The model was set using the /model command.

Unexpected error: litellm.NotFoundError: AnthropicException - b'{"type":"error","error":{"type":"not_found_error","message":"model: anthropic.claude-3-5-sonnet-20240620-v1:0"}}'

Traceback (most recent call last):
  File "C:\Python311\Lib\site-packages\litellm\llms\anthropic\chat.py", line 725, in make_sync_call
    response = client.post(
               ^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\llms\custom_httpx\http_handler.py", line 371, in post
    raise e
  File "C:\Python311\Lib\site-packages\litellm\llms\custom_httpx\http_handler.py", line 357, in post
    response.raise_for_status()
  File "C:\Python311\Lib\site-packages\httpx\_models.py", line 763, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '404 Not Found' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Python311\Lib\site-packages\litellm\main.py", line 1586, in completion
    response = anthropic_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\llms\anthropic\chat.py", line 1117, in completion
    completion_stream = make_sync_call(
                        ^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\llms\anthropic\chat.py", line 729, in make_sync_call
    raise AnthropicError(
litellm.llms.anthropic.chat.AnthropicError: b'{"type":"error","error":{"type":"not_found_error","message":"model: anthropic.claude-3-5-sonnet-20240620-v1:0"}}'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Python311\Lib\site-packages\aider\coders\base_coder.py", line 1115, in send_message
    yield from self.send(messages, functions=self.functions)
  File "C:\Python311\Lib\site-packages\aider\coders\base_coder.py", line 1392, in send
    hash_object, completion = send_completion(
                              ^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\aider\sendchat.py", line 87, in send_completion
    res = litellm.completion(**kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\utils.py", line 1086, in wrapper
    raise e
  File "C:\Python311\Lib\site-packages\litellm\utils.py", line 974, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\main.py", line 2847, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\litellm\utils.py", line 8194, in exception_type
    raise e
  File "C:\Python311\Lib\site-packages\litellm\utils.py", line 6656, in exception_type
    raise NotFoundError(
litellm.exceptions.NotFoundError: litellm.NotFoundError: AnthropicException - b'{"type":"error","error":{"type":"not_found_error","message":"model: anthropic.claude-3-5-sonnet-20240620-v1:0"}}'

Aider version: 0.57.1
Python version: 3.11.4
Platform: Windows-10-10.0.19045-SP0
Python implementation: CPython
Virtual environment: No
OS: Windows 10 (64bit)
Git version: git version 2.45.2.windows.1

Yona544 • Sep 22 '24 20:09

Thank you for filing this issue.

The autocomplete for /model is currently confusing: the whole LiteLLM model database gets pulled in, and some of those entries do not work from aider. The correct command to switch to Sonnet 3.5 inside aider is

 /model claude-3-5-sonnet-20240620 

Please try this command and report whether it works for you (and avoid the anthropic.* model names for now).
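
As an aside, you can see why autocomplete is so noisy by inspecting LiteLLM's model database directly. A minimal sketch, assuming only that litellm is installed (model_cost is LiteLLM's public model/pricing map):

import litellm

# litellm.model_cost maps every model name LiteLLM knows about to its
# metadata. Filtering it shows that the same model appears under both the
# plain Anthropic API name and Bedrock-style "anthropic.*" IDs.
for name, info in litellm.model_cost.items():
    if "claude-3-5-sonnet" in name:
        print(f"{name:60} provider={info.get('litellm_provider')}")

Only the entries whose provider is "anthropic" are valid against the Anthropic API directly; the anthropic.* names are Bedrock model IDs.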

For completeness' sake: I assume you typed /model and then used autocomplete to complete the model name to

/model anthropic.claude-3-5-sonnet-20240620-v1:0

Is that correct? (That would explain the strange model name.)

fry69 • Sep 22 '24 21:09

I'm going to close this issue for now, but feel free to add a comment here and I will re-open. Or feel free to file a new issue any time.

paul-gauthier • Oct 07 '24 20:10

Edit: I have managed to solve the problem. Even though the crewAI documentation says these models are supported, the Anthropic API apparently no longer serves them :D

I've checked Anthropic's documentation:

[Image: table of current Claude models from Anthropic's model overview]

Reference: https://docs.claude.com/en/docs/about-claude/models/overview
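
The live list of model IDs can also be pulled from the API itself rather than from a docs screenshot. A small sketch using the official anthropic SDK's Models API (assumes a recent SDK version and the same CLAUDE_API_KEY variable used below):

import os

import anthropic

# The Models API returns the model IDs this key can actually use, which
# avoids guessing from possibly stale documentation.
client = anthropic.Anthropic(api_key=os.getenv("CLAUDE_API_KEY"))
for model in client.models.list():
    print(model.id, "-", model.display_name)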

and I've updated my code with the following:


import os

from crewai import LLM

# Set up the Claude model
llm_claude = LLM(
    model="anthropic/claude-haiku-4-5-20251001",
    api_key=os.getenv("CLAUDE_API_KEY"),
    temperature=0.7,
    max_tokens=4096,  # Required for Anthropic
)
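
As a quick smoke test before a full crew run, one direct call confirms the model ID resolves (hypothetical snippet; I'm assuming LLM.call accepts an OpenAI-style messages list, which matches the llm.call(messages, ...) frames in the traceback below):

# A single direct call fails fast if the model ID is wrong, instead of
# failing mid-kickoff inside the crew.
reply = llm_claude.call([{"role": "user", "content": "Reply with OK."}])
print(reply)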


Problem

I have the same problem when I try to call Claude models from crewAI:

import os

from crewai import LLM

# Set up the Claude model
llm_claude = LLM(
    model="anthropic/claude-3-5-sonnet-20241022",
    api_key=os.getenv("CLAUDE_API_KEY"),
    temperature=0.7,
    max_tokens=4096,  # Required for Anthropic
)

This model is said to be supported by crewAI:

[Image: crewAI documentation listing supported Anthropic models]

Reference: https://docs.crewai.com/en/concepts/llms#anthropic


and the error I receive is the following:

Traceback (most recent call last):
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/llms/anthropic/chat/handler.py", line 444, in completion
    response = client.post(
        api_base,
    ...<2 lines>...
        timeout=timeout,
    )
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/llms/custom_httpx/http_handler.py", line 780, in post
    raise e
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/llms/custom_httpx/http_handler.py", line 762, in post
    response.raise_for_status()
    ~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/httpx/_models.py", line 829, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '404 Not Found' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/main.py", line 2147, in completion
    response = anthropic_chat_completions.completion(
        model=model,
    ...<15 lines>...
        custom_llm_provider=custom_llm_provider,
    )
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/llms/anthropic/chat/handler.py", line 459, in completion
    raise AnthropicError(
    ...<3 lines>...
    )
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"not_found_error","message":"model: claude-3-5-sonnet-20241022"},"request_id":"req_011CUUUeR9Y5fPX4XgtXesaw"}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/tuna/git-repos/avlo/research_crew/test.py", line 182, in <module>
    result = crew.kickoff()
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/crew.py", line 661, in kickoff
    result = self._run_sequential_process()
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/crew.py", line 772, in _run_sequential_process
    return self._execute_tasks(self.tasks)
           ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/crew.py", line 875, in _execute_tasks
    task_output = task.execute_sync(
        agent=agent_to_use,
        context=context,
        tools=cast(List[BaseTool], tools_for_task),
    )
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/task.py", line 382, in execute_sync
    return self._execute_core(agent, context, tools)
           ~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/task.py", line 530, in _execute_core
    raise e  # Re-raise the exception after emitting the event
    ^^^^^^^
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/task.py", line 446, in _execute_core
    result = agent.execute_task(
        task=self,
        context=context,
        tools=tools,
    )
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/agent.py", line 475, in execute_task
    raise e
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/agent.py", line 451, in execute_task
    result = self._execute_without_timeout(task_prompt, task)
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/agent.py", line 547, in _execute_without_timeout
    return self.agent_executor.invoke(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~^
        {
        ^
    ...<4 lines>...
        }
        ^
    )["output"]
    ^
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/agents/crew_agent_executor.py", line 114, in invoke
    formatted_answer = self._invoke_loop()
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/agents/crew_agent_executor.py", line 208, in _invoke_loop
    raise e
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/agents/crew_agent_executor.py", line 154, in _invoke_loop
    answer = get_llm_response(
        llm=self.llm,
    ...<3 lines>...
        from_task=self.task
    )
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/utilities/agent_utils.py", line 160, in get_llm_response
    raise e
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/utilities/agent_utils.py", line 153, in get_llm_response
    answer = llm.call(
        messages,
    ...<2 lines>...
        from_agent=from_agent,
    )
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/llm.py", line 1030, in call
    return self._handle_non_streaming_response(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        params, callbacks, available_functions, from_task, from_agent
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/crewai/llm.py", line 813, in _handle_non_streaming_response
    response = litellm.completion(**params)
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/utils.py", line 1330, in wrapper
    raise e
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/utils.py", line 1205, in wrapper
    result = original_function(*args, **kwargs)
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/main.py", line 3427, in completion
    raise exception_type(
          ~~~~~~~~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
    ...<3 lines>...
        extra_kwargs=kwargs,
        ^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2301, in exception_type
    raise e
  File "/home/tuna/git-repos/avlo/research_crew/.venv/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 610, in exception_type
    raise NotFoundError(
    ...<3 lines>...
    )
litellm.exceptions.NotFoundError: litellm.NotFoundError: AnthropicException - {"type":"error","error":{"type":"not_found_error","message":"model: claude-3-5-sonnet-20241022"},"request_id":"req_011CUUUeR9Y5fPX4XgtXesaw"}
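
For what it's worth, the same 404 reproduces without crewAI by calling LiteLLM directly, which shows the model ID itself is being rejected rather than anything in the crewAI wiring. A minimal sketch, assuming the same CLAUDE_API_KEY environment variable:

import os

import litellm

# Calling LiteLLM directly takes crewAI out of the picture; getting the
# same NotFoundError here means the Anthropic API rejects the model ID.
response = litellm.completion(
    model="anthropic/claude-3-5-sonnet-20241022",
    api_key=os.getenv("CLAUDE_API_KEY"),
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=16,
)
print(response.choices[0].message.content)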

tunacinsoy • Oct 25 '25 19:10