KeyError: 'tool_call_id' when there is no ToolMessage in the given chat history (only a SystemMessage)
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangChain rather than my code.
- [X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
### Example Code
```python
# TestTool, currentModel, and InputChat are defined elsewhere in the project.
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    PromptTemplate,
    SystemMessagePromptTemplate,
)

testTool = TestTool()
tools = [testTool]


def createRunnableWithModelName(modelname: str):
    model = currentModel
    toolCallingModel = model.bind(tools=tools)
    toolCallingPromptMessages = [
        SystemMessagePromptTemplate(
            prompt=PromptTemplate(
                input_variables=[], template="instructions", template_format="jinja2"
            )
        ),
        MessagesPlaceholder(variable_name="chat_history", optional=True),
        HumanMessagePromptTemplate(
            prompt=PromptTemplate(input_variables=["input"], template="{input}")
        ),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
    prompt = ChatPromptTemplate(messages=toolCallingPromptMessages)
    agent = create_tool_calling_agent(toolCallingModel, tools, prompt)
    agentExecutor = AgentExecutor(
        agent=agent,
        tools=tools,
        return_intermediate_steps=True,
    )
    executor = agentExecutor.with_types(input_type=InputChat)
    return executor
```
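The executor is then presumably exposed through LangServe, which is where the validation runs; a hedged sketch of that wiring, since the server setup is not included in the report (the route path and model name below are made up):

```python
# Hypothetical LangServe wiring -- not part of the original report.
from fastapi import FastAPI
from langserve import add_routes

app = FastAPI()
# "some-model" and "/chat" are illustrative placeholders.
add_routes(app, createRunnableWithModelName("some-model"), path="/chat")
```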
### Error Message and Stack Trace (if applicable)
File "projectPath/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "projectPath/lib/python3.11/site-packages/starlette/applications.py", line 113, in __call__
await self.middleware_stack(scope, receive, send)
File "projectPath/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
raise exc
File "projectPath/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
await self.app(scope, receive, _send)
File "projectPath/lib/python3.11/site-packages/starlette/middleware/cors.py", line 93, in __call__
await self.simple_response(scope, receive, send, request_headers=headers)
File "projectPath/lib/python3.11/site-packages/starlette/middleware/cors.py", line 144, in simple_response
await self.app(scope, receive, send)
File "projectPath/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "projectPath/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
raise exc
File "projectPath/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
await app(scope, receive, sender)
File "projectPath/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
await self.middleware_stack(scope, receive, send)
File "projectPath/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
await route.handle(scope, receive, send)
File "projectPath/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
await self.app(scope, receive, send)
File "projectPath/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "projectPath/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
raise exc
File "projectPath/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
await app(scope, receive, sender)
File "projectPath/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
response = await f(request)
^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
return await dependant.call(**values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/langserve/server.py", line 583, in stream_log
return await api_handler.stream_log(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/langserve/api_handler.py", line 1237, in stream_log
config, input_ = await self._get_config_and_input(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/langserve/api_handler.py", line 841, in _get_config_and_input
input_ = schema.model_validate(body.input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/pydantic/main.py", line 596, in model_validate
return cls.__pydantic_validator__.validate_python(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/langchain_core/messages/tool.py", line 130, in __init__
super().__init__(content=content, **kwargs)
File "projectPath/lib/python3.11/site-packages/langchain_core/messages/base.py", line 76, in __init__
super().__init__(content=content, **kwargs)
File "projectPath/lib/python3.11/site-packages/langchain_core/load/serializable.py", line 111, in __init__
super().__init__(*args, **kwargs)
File "projectPath/lib/python3.11/site-packages/pydantic/main.py", line 212, in __init__
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "projectPath/lib/python3.11/site-packages/langchain_core/messages/tool.py", line 122, in coerce_args
tool_call_id = values["tool_call_id"]
~~~~~~^^^^^^^^^^^^^^^^
KeyError: 'tool_call_id'
```
### Description
After upgrading all LangChain packages to the latest versions, I started getting this error from my LangServe server. The error is thrown when the chat history is non-empty and contains a system message: the parser tries to coerce the system message into a tool message and raises a KeyError. (I confirmed this by inspecting the dict from which it tries to read 'tool_call_id'; it is the system message I passed in.)
The error occurs both from the playground and from my own client.
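A minimal sketch of what appears to happen under the hood, assuming the affected langchain-core release (0.3.10 here); the system-message dict stands in for an entry of my chat history:

```python
# Hedged repro sketch -- assumes the affected langchain-core release.
from langchain_core.messages import ToolMessage

# Validating a plain system-message dict against ToolMessage runs the
# 'before' validator, which does values["tool_call_id"] and raises
# KeyError instead of a clean ValidationError.
ToolMessage.model_validate({"type": "system", "content": "instructions"})
```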
### System Info
```
System Information
------------------
OS: Darwin
OS Version: Darwin Kernel Version 24.0.0: Mon Aug 12 20:52:18 PDT 2024; root:xnu-11215.1.10~2/RELEASE_ARM64_T8122
Python Version: 3.11.10 (main, Sep 7 2024, 01:03:31) [Clang 15.0.0 (clang-1500.3.9.4)]

Package Information
-------------------
langchain_core: 0.3.10
langchain: 0.3.3
langchain_community: 0.3.2
langsmith: 0.1.133
langchain_astradb: 0.5.0
langchain_cli: 0.0.31
langchain_openai: 0.2.2
langchain_text_splitters: 0.3.0
langserve: 0.3.0

Optional packages not installed
-------------------------------
langgraph

Other Dependencies
------------------
aiohttp: 3.10.9
astrapy: 1.5.2
async-timeout: Installed. No version info available.
dataclasses-json: 0.6.7
fastapi: 0.112.4
gitpython: 3.1.43
gritql: 0.1.5
httpx: 0.27.2
jsonpatch: 1.33
langserve[all]: Installed. No version info available.
numpy: 1.26.4
openai: 1.51.2
orjson: 3.10.7
packaging: 24.1
pydantic: 2.9.2
pydantic-settings: 2.5.2
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
SQLAlchemy: 2.0.35
sse-starlette: 1.8.2
tenacity: 8.5.0
tiktoken: 0.8.0
tomlkit: 0.12.5
typer[all]: Installed. No version info available.
typing-extensions: 4.12.2
uvicorn: 0.23.2
```
I am not sure which model you are using, but I tested the following code and everything seems to work fine:
```python
from langchain_core.tools import tool
from langchain_openai.chat_models import ChatOpenAI
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    PromptTemplate,
    SystemMessagePromptTemplate,
)
from langchain.agents.tool_calling_agent.base import create_tool_calling_agent
from langchain.agents import AgentExecutor


@tool
def TestTool(tool_input: str):
    """Print Hello"""
    return "hello"


tools = [TestTool]


def createRunnableWithModelName():
    model = ChatOpenAI()
    toolCallingModel = model.bind(tools=tools)
    toolCallingPromptMessages = [
        SystemMessagePromptTemplate(
            prompt=PromptTemplate(
                input_variables=[], template="You are an agent", template_format="jinja2"
            )
        ),
        MessagesPlaceholder(variable_name="chat_history", optional=True),
        HumanMessagePromptTemplate(
            prompt=PromptTemplate(input_variables=["input"], template="{input}")
        ),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
    prompt = ChatPromptTemplate(messages=toolCallingPromptMessages)
    agent = create_tool_calling_agent(toolCallingModel, tools, prompt)
    agentExecutor = AgentExecutor(
        agent=agent,
        tools=tools,
        return_intermediate_steps=True,
    )
    executor = agentExecutor.with_types(input_type=str)
    return executor


model = createRunnableWithModelName()
print(model.invoke({'input': 'Execute Test Tool'}))
```
```python
from typing import List, Union

from pydantic import BaseModel, Field
from langchain_core.messages import AIMessage, FunctionMessage, HumanMessage, SystemMessage, ToolMessage


class InputChat(BaseModel):
    """Input for the chat endpoint."""

    chat_history: List[Union[HumanMessage, AIMessage, SystemMessage, FunctionMessage, ToolMessage]] = Field(
        ...,
        description="The chat messages representing the current conversation.",
    )
    input: str = Field(
        ...,
        description="The user's input message.",
    )
```
This is my InputChat class; removing the Union fixes the issue. So the new type is:
```python
class InputChat(BaseModel):
    """Input for the chat endpoint."""

    chat_history: List = Field(
        ...,
        description="The chat messages representing the current conversation.",
    )
    input: str = Field(
        ...,
        description="The user's input message.",
    )
```
I could use this as a temporary workaround, but this only started happening after the update.
The error happens before even reaching the LLM call (in the Pydantic validation). Invoking the LLM directly works; it is the API call through LangServe that fails.
I have the same error; it started happening after updating the packages and moving to Pydantic v2.
Same here: it looks like the validator that coerces types, and runs 'before' everything else, expects the raw values to have a "tool_call_id" key:

```
File "projectPath/lib/python3.11/site-packages/langchain_core/messages/tool.py", line 122, in coerce_args
    tool_call_id = values["tool_call_id"]
```

but this is not the case for human or AI messages. So if you try to parse:

```python
[{"type": "human", "content": "Hello"}]
```

it fails even though it is a perfectly valid instance of HumanMessage.
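To make that concrete, a minimal sketch of a schema that trips over this; it assumes the affected langchain-core release, and the class and field names are illustrative rather than taken from the report:

```python
# Hedged sketch of the Union validation failure described above.
from typing import List, Union

from pydantic import BaseModel
from langchain_core.messages import AIMessage, HumanMessage, ToolMessage


class History(BaseModel):
    chat_history: List[Union[HumanMessage, AIMessage, ToolMessage]]


# When pydantic attempts the ToolMessage branch of the Union, the
# 'before' validator runs ahead of the type discriminator, and its
# KeyError escapes instead of surfacing as a ValidationError.
History.model_validate({"chat_history": [{"type": "human", "content": "Hello"}]})
```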
Would a simple change to `values.get("tool_call_id")` in this line https://github.com/langchain-ai/langchain/blob/242e9fc8659d34933f30d8ed87f9d00f9909657e/libs/core/langchain_core/messages/tool.py#L122 solve it?
Has there been any update on this issue? I just ran into it myself.
+1
+1
+1
@meliascosta
> Would a simple change to `values.get("tool_call_id")` in [langchain/libs/core/langchain_core/messages/tool.py] solve it?
Here's a simple monkey patch for solving this issue: replace `tool_call_id = values["tool_call_id"]` with `tool_call_id = values.get("tool_call_id")`.
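For illustration, the change amounts to something like the following inside the `coerce_args` validator; this is a paraphrased sketch of the idea, not the verbatim upstream source, and the surrounding coercion logic is elided:

```python
# Paraphrased sketch of the proposed one-line fix in
# langchain_core/messages/tool.py -- not the actual upstream code.
def coerce_args(values: dict) -> dict:
    # ... other coercions elided ...
    tool_call_id = values.get("tool_call_id")  # was: values["tool_call_id"]
    if tool_call_id is not None and not isinstance(tool_call_id, str):
        values["tool_call_id"] = str(tool_call_id)
    return values
```

A missing key would then fall through to the normal "field required" validation error instead of an uncaught KeyError.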
If you're using a chat model / agent that involves tool calling, use ToolMessage and not FunctionMessage. FunctionMessage is a legacy primitive -- it was introduced back when OpenAI used function messages instead of tool messages.
```python
class InputChat(BaseModel):
    """Input for the chat endpoint."""

    chat_history: List[Union[HumanMessage, AIMessage, SystemMessage, ToolMessage]] = Field(
        ...,
        description="The chat messages representing the current conversation.",
    )
    input: str = Field(
        ...,
        description="The user's input message.",
    )
```
If you're still encountering an issue, please include a fully reproducible example, so a maintainer can run it and verify the underlying issue.
If you are still facing this issue, the error might be due to a schema validation error caused by the extractor.
I'm getting this error when I try to instantiate my own ToolMessage... I'm not really sure whether it is good practice to generate an id myself and add it to the values.
On the first pass you don't have a tool_call_id yet, but on the subsequent call you add the tool_call_id to the state.
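As a hedged sketch of that flow (the id and tool name below are illustrative, not from this thread): the id originates in the model's tool call, and the ToolMessage echoes it back:

```python
# Illustrative tool_call_id flow; "call_123" and "TestTool" are made up.
from langchain_core.messages import AIMessage, ToolMessage

# The model's response carries tool calls, each with an id.
ai_msg = AIMessage(
    content="",
    tool_calls=[{"name": "TestTool", "args": {"tool_input": "hi"}, "id": "call_123"}],
)

# The tool result must echo that id back as tool_call_id.
tool_msg = ToolMessage(content="hello", tool_call_id=ai_msg.tool_calls[0]["id"])
```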
> Would a simple change to `values.get("tool_call_id")` in [langchain/libs/core/langchain_core/messages/tool.py] solve it?
>
> Here's a simple monkey patch for solving this issue: replace `tool_call_id = values["tool_call_id"]` with `tool_call_id = values.get("tool_call_id")` in langchain/libs/core/langchain_core/messages/tool.py.
This fixes the issue for me, for the exact reasons that were mentioned. Is there a reason the .get method call was not used in the official LangChain package?