langchain
Use OllamaFunctions to build AgentExecutor but return errors with tools
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [ ] I am sure that this is a bug in LangChain rather than my code.
- [X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
### Example Code
```python
from langchain_experimental.llms.ollama_functions import OllamaFunctions
from langchain_core.tools import tool
from langchain_community.tools.tavily_search import TavilySearchResults
import os
from langchain.agents import AgentExecutor
from langchain.agents import create_tool_calling_agent
from langchain import hub

os.environ["TAVILY_API_KEY"] = ''

llm = OllamaFunctions(model="llama3:8b", temperature=0.6, format="json")

@tool
def multiply(first_int: int, second_int: int) -> int:
    """Multiply two integers together."""
    return first_int * second_int

@tool
def search_in_web(query: str) -> str:
    """Use Tavily to search information in Internet."""
    search = TavilySearchResults(max_results=2)
    context = search.invoke(query)
    result = ""
    for i in context:
        result += f"In site:{i['url']}, context shows:{i['content']}.\n"
    return result

tools = [
    {
        "name": "multiply",
        "description": "Multiply two integers together.",
        "parameters": {
            "type": "object",
            "properties": {
                "first_int": {
                    "type": "integer",
                    "description": "The first integer number to be multiplied. " "e.g. 4",
                },
                "second_int": {
                    "type": "integer",
                    "description": "The second integer to be multiplied. " "e.g. 7",
                },
            },
            "required": ["first_int", "second_int"],
        },
    },
    {
        "name": "search_in_web",
        "description": "Use Tavily to search information in Internet.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "str",
                    "description": "The query used to search in Internet. " "e.g. what is the weather in San Francisco?",
                },
            },
            "required": ["query"],
        },
    },
]

prompt = hub.pull("hwchase17/openai-functions-agent")
agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools)
```
### Error Message and Stack Trace (if applicable)
```
---------------------------------------------------------------------------
Traceback (most recent call last):
  File "/media/user/My Book/LLM/Agent/check.py", line 80, in <module>
    agent_executor = AgentExecutor(agent=agent, tools=tools)
  File "/home/user/anaconda3/envs/XIE/lib/python3.9/site-packages/pydantic/v1/main.py", line 339, in __init__
    values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
  File "/home/user/anaconda3/envs/XIE/lib/python3.9/site-packages/pydantic/v1/main.py", line 1100, in validate_model
    values = validator(cls_, values)
  File "/home/user/anaconda3/envs/XIE/lib/python3.9/site-packages/langchain/agents/agent.py", line 981, in validate_tools
    tools = values["tools"]
KeyError: 'tools'
```
### Description
I am trying to initialize an AgentExecutor with OllamaFunctions.
I have checked `OllamaFunctions.bind_tools` and it works well.
So I want to use an AgentExecutor to let the LLM respond.
But `KeyError: 'tools'` confuses me, since `create_tool_calling_agent` accepts the same `tools` value without complaint.
Does anyone know how to fix this problem?
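If I am reading the traceback right, the `KeyError` may be masking the real problem: when a pydantic v1 field (here `tools`, which I believe AgentExecutor declares as a sequence of `BaseTool`) fails its own field validation, it never lands in the `values` dict, so a root validator that does `values["tools"]` raises `KeyError` instead of reporting the underlying type mismatch. A stdlib-only mimic of that failure mode (all names here are illustrative, not LangChain's or pydantic's actual internals):

```python
# Hypothetical mimic of pydantic-v1-style validation: fields that fail
# their own validator are dropped from `values` before root validators run.
def validate_model(data, field_validators, root_validators):
    values = {}
    for name, value in data.items():
        ok = field_validators.get(name, lambda v: True)(value)
        if ok:
            values[name] = value  # only valid fields reach root validators
    for rv in root_validators:
        rv(values)  # stand-in for AgentExecutor's validate_tools
    return values

# Stand-in for "is this a BaseTool?": JSON-schema dicts are not.
is_base_tool = lambda t: not isinstance(t, dict)

data = {"tools": [{"name": "multiply"}]}  # dicts, like in my example code
try:
    validate_model(
        data,
        {"tools": lambda ts: all(is_base_tool(t) for t in ts)},
        [lambda values: values["tools"]],
    )
except KeyError as e:
    print(f"KeyError: {e}")  # prints: KeyError: 'tools'
```

So the error message would be the same even though the actual cause is that the elements of `tools` are plain dicts rather than tool objects.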
### System Info
langchain==0.2.1
langchain-community==0.0.38
langchain-core==0.2.1
langchain-experimental==0.0.58
langchain-openai==0.1.7
langchain-text-splitters==0.2.0
langchainhub==0.1.16
platform: Ubuntu 20.04
python==3.9