[Bug]: TypeError: can only concatenate str (not "dict") to str
What happened?
Environment
- autogen 0.4
- litellm 1.53.1
- ollama 0.3.14
- ollama model: qwen2.5:14b-instruct-q4_K_M
Information
I use autogen + litellm + ollama for local testing.
When a tool call is made, litellm raises an error in the token_counter method: TypeError: can only concatenate str (not "dict") to str.
Debugging in VS Code, I found that function_arguments is a dict, not a str.
Can someone check whether this is a bug? The fix could be as simple as changing text += function_arguments (at line 1638 of utils.py) to text += str(function_arguments).
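The failure mode can be sketched without litellm itself. The function below is a simplified, hypothetical stand-in for the concatenation loop in token_counter (the function name and message shape are illustrative, inferred from the traceback); it shows why a dict in function arguments breaks string concatenation, and why the str() coercion proposed above avoids it:

```python
def collect_text(messages):
    """Simplified stand-in for the text-accumulation loop in litellm's
    token_counter. This is NOT litellm's actual code; it only reproduces
    the type issue reported in this bug."""
    text = ""
    for message in messages:
        for tool_call in message.get("tool_calls", []):
            function_arguments = tool_call["function"]["arguments"]
            # Ollama can return arguments as a dict rather than a JSON
            # string. Without str(), `text += function_arguments` raises
            # TypeError: can only concatenate str (not "dict") to str.
            text += str(function_arguments)
    return text

# A tool-call message where "arguments" is a dict, as Ollama returns it:
messages = [
    {
        "role": "assistant",
        "tool_calls": [
            {"function": {"name": "get_weather", "arguments": {"city": "Paris"}}}
        ],
    }
]

print(collect_text(messages))  # → {'city': 'Paris'}
```

An alternative to str() would be json.dumps(function_arguments), which matches the JSON-string form that OpenAI-style APIs use for arguments; either coercion avoids the TypeError.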
Relevant log output
Error processing publish message
Traceback (most recent call last):
File ".venv/lib/python3.11/site-packages/litellm/main.py", line 481, in acompletion
response = await init_response
^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/litellm/llms/ollama_chat.py", line 612, in ollama_acompletion
raise e # don't use verbose_logger.exception, if exception is raised
^^^^^^^
File ".venv/lib/python3.11/site-packages/litellm/llms/ollama_chat.py", line 594, in ollama_acompletion
prompt_tokens = response_json.get("prompt_eval_count", litellm.token_counter(messages=data["messages"])) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/litellm/utils.py", line 1638, in token_counter
text += function_arguments
TypeError: can only concatenate str (not "dict") to str
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File ".venv/lib/python3.11/site-packages/autogen_core/application/_single_threaded_agent_runtime.py", line 402, in _process_publish
await asyncio.gather(*responses)
File ".venv/lib/python3.11/site-packages/autogen_core/application/_single_threaded_agent_runtime.py", line 394, in _on_message
return await agent.on_message(
^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/autogen_core/components/_routed_agent.py", line 484, in on_message
return await h(self, message, ctx)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/autogen_core/components/_routed_agent.py", line 148, in wrapper
return_value = await func(self, message, ctx)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "src/autogen_service/ag_core/hand_offs.py", line 57, in handle_task
llm_result = await self._model_client.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "src/autogen_service/ag_exts/models/litellm/_litellm_client.py", line 432, in create
result: Union[ParsedChatCompletion[BaseModel], ChatCompletion] = await future
^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/litellm/utils.py", line 1175, in wrapper_async
raise e
File ".venv/lib/python3.11/site-packages/litellm/utils.py", line 1031, in wrapper_async
result = await original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/litellm/main.py", line 503, in acompletion
raise exception_type(
^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2136, in exception_type
raise e
File ".venv/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2112, in exception_type
raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: can only concatenate str (not "dict") to str
Can you share the request being made to litellm for repro?
I also hit the same issue using ollama_chat/llama3.1 running locally (Meta Llama 3.1 8B Instruct). Interestingly, ollama/llama3.1 did not have the same issue; I expect that's because it does not track consumed tokens.
Adding str(function_arguments) solved the issue.
> can you share the request being made to litellm for repro?
@krrishdholakia Yes, I made a simple demo to reproduce this error. Please check it, thank you.
I think we're hitting this too as downstream users of litellm - see https://github.com/robusta-dev/holmesgpt/issues/246. Any ETA on getting it fixed?
Fix merged into today's dev branch - should be in main by EOD