langchain
Custom Calculator Tool
Reposting from Discord Thread:
Hey y'all! I'm trying to hack the CustomCalculatorTool so that I can pass in an LLM with a pre-loaded API key (I have a use case where I need to use separate LLM instances, each with its own API key). This is what I've got so far:
```python
llm2 = ChatOpenAI(temperature=0, openai_api_key=openai_api_key2)

class CalculatorInput(BaseModel):
    query: str = Field(description="should be a math expression")
    # api_key: str = Field(description="should be a valid OpenAI key")
    llm: ChatOpenAI = Field(description="should be a valid ChatOpenAI")

class CustomCalculatorTool(BaseTool):
    name = "Calculator"
    description = "useful for when you need to answer questions about math"
    args_schema = CalculatorInput

    def _run(self, query: str, llm: ChatOpenAI) -> str:
        """Use the tool."""
        llm_chain = LLMMathChain(llm=llm, verbose=True)
        return llm_chain.run(query)

    async def _arun(self, query: str) -> str:
        """Use the tool asynchronously."""
        raise NotImplementedError("BingSearchRun does not support async")

tools = [CustomCalculatorTool()]
agent = initialize_agent(tools, llm1, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)
agent.run(query="3+3", llm=llm2)
```
Notice the separate LLMs. I get an error: `ValueError: Missing some input keys: {'input'}`.
I guess my real question is: is my logic for passing API keys to LLMs correct here? I'm not super familiar with pydantic, and the variations I've tried either complain with `ValueError: run supports only one positional argument.` or fail later when I invoke this in a custom class (I took a step back and worked from the docs example).
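As a side note, the `Missing some input keys: {'input'}` error is separate from the API-key question: the ZERO_SHOT agent chain expects its input under the key `input`, so `agent.run(query="3+3", llm=llm2)` never supplies it. Here's a simplified mimic of that input-key validation (a sketch of the behavior, not langchain's actual implementation):

```python
class MiniChain:
    """Simplified mimic of a chain that validates its declared input keys."""
    input_keys = ["input"]

    def run(self, **kwargs):
        missing = set(self.input_keys) - set(kwargs)
        if missing:
            raise ValueError(f"Missing some input keys: {missing}")
        return f"answered: {kwargs['input']}"

chain = MiniChain()
try:
    chain.run(query="3+3", llm="llm2")  # wrong key -> ValueError
except ValueError as exc:
    print(exc)  # Missing some input keys: {'input'}
print(chain.run(input="3+3"))  # -> answered: 3+3
```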
I see a lot of the pre-made tools use a wrapper to contain the llm:
```python
class WikipediaQueryRun(BaseTool):
    """Tool that adds the capability to search using the Wikipedia API."""

    name = "Wikipedia"
    description = (
        "A wrapper around Wikipedia. "
        "Useful for when you need to answer general questions about "
        "people, places, companies, historical events, or other subjects. "
        "Input should be a search query."
    )
    api_wrapper: WikipediaAPIWrapper

    def _run(self, query: str) -> str:
        """Use the Wikipedia tool."""
        return self.api_wrapper.run(query)

    async def _arun(self, query: str) -> str:
        """Use the Wikipedia tool asynchronously."""
        raise NotImplementedError("WikipediaQueryRun does not support async")
```
I tried implementing my own, but it's not working great:
```python
class CustomCalculatorWrapper(BaseModel):
    """Wrapper around CustomCalculator."""

    name: str = "CustomCalculator"
    description = "A wrapper around CustomCalculator."
    api_key: str
    llm_math_chain: Any  #: :meta private:

    class Config:
        """Configuration for this pydantic object."""
        extra = Extra.forbid

    @root_validator()
    def validate_environment(cls, values: Dict) -> Dict:
        """Validate that api key and python package exists in environment."""
        api_key = get_from_dict_or_env(values, "api_key", "api_key")
        print("api_key", api_key)
        values["api_key"] = api_key
        print(values)
        try:
            llm = LLMChatWrapper(values["api_key"])
            llm_math_chain = LLMMathChain(llm=llm.llmchat, verbose=True)
        except:
            print("Your LLM won't load bro")
        values["llm_math_chain"] = llm_math_chain
        return values

    def run(self, query: str) -> str:
        """Use the tool."""
        print("input to _run inside of wrapper class", query)
        return self.llm_math_chain.run(query)
```
I'm able to run it just fine using `CustomCalculatorWrapper(api_key=openai_api_key).run("3+3")`, but when I try to give it to my agent like this:
```python
agent = initialize_agent(
    CustomCalculatorTool(CustomCalculatorWrapper(openai_api_key)),
    llm1,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)
```
I get `TypeError: __init__() takes exactly 1 positional argument (2 given)`
My custom calculator class looks like this:
```python
class CustomCalculatorTool(BaseTool):
    name = "Calculator"
    description = "useful for when you need to answer questions about math"
    args_schema = CalculatorInput
    wrapper = CustomCalculatorWrapper

    def _run(self, query: str) -> str:
        """Use the tool."""
        print("input to _run inside of custom tool", query)
        return self.wrapper.run(query)

    async def _arun(self, query: str) -> str:
        """Use the tool asynchronously."""
        raise NotImplementedError("BingSearchRun does not support async")
```
pydantic's `BaseModel` does not take positional args, only kwargs:

```python
__init__(__pydantic_self__, **data: Any) -> None
```

It creates a new model by parsing and validating input data from keyword arguments, and raises `ValidationError` if the input data cannot be parsed to form a valid model.

In other words, you need to create an `__init__` method for your custom tool, something like:

```python
def __init__(self, wrapper):
    super().__init__()
    self.wrapper = wrapper
```
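To see why the positional call blows up, here's a tiny runnable mimic of a keyword-only `__init__` like pydantic's (a sketch of the behavior, not the real `BaseModel` internals):

```python
class KwargsOnlyBase:
    """Mimics pydantic's BaseModel signature: keyword arguments only."""
    def __init__(self, **data):
        for key, value in data.items():
            setattr(self, key, value)

class ToolWithoutInit(KwargsOnlyBase):
    pass

class ToolWithInit(KwargsOnlyBase):
    def __init__(self, wrapper):
        super().__init__()
        self.wrapper = wrapper

try:
    ToolWithoutInit("wrapper-instance")  # positional arg -> TypeError
except TypeError as exc:
    print("rejected:", exc)

tool = ToolWithInit("wrapper-instance")  # explicit __init__ accepts it
print(tool.wrapper)  # -> wrapper-instance
```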
That worked! For anyone else who comes across this, here's my full custom class:

```python
class CustomCalculatorTool(BaseTool):
    name = "Calculator"
    description = "useful for when you need to answer questions about math"
    openai_api_key: str = ""
    llm: LLMChatWrapper = None
    llm_math_chain: LLMMathChain = None

    def __init__(self, openai_api_key):
        super().__init__()
        self.args_schema = CalculatorInput
        self.llm = LLMChatWrapper(openai_api_key)
        self.llm_math_chain = LLMMathChain(llm=self.llm.llmchat, verbose=True)

    def _run(self, query: str) -> str:
        """Use the tool."""
        return self.llm_math_chain.run(query)

    async def _arun(self, query: str) -> str:
        """Use the tool asynchronously."""
        raise NotImplementedError("CustomCalculatorTool does not support async")
```
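With this pattern, each tool instance carries its own key, so one agent (or several) can bill different queries to different accounts. A minimal runnable sketch of that idea using stand-in classes in place of langchain's `BaseTool` and `LLMMathChain` (all names here are illustrative, not the real API):

```python
class FakeMathChain:
    """Stand-in for LLMMathChain: records which API key built it."""
    def __init__(self, api_key):
        self.api_key = api_key

    def run(self, query):
        # Pretend to evaluate; a real chain would call the LLM here.
        return f"{eval(query)} (via {self.api_key})"

class CalculatorTool:
    """Stand-in for the BaseTool subclass with an explicit __init__."""
    name = "Calculator"

    def __init__(self, openai_api_key):
        self.llm_math_chain = FakeMathChain(openai_api_key)

    def _run(self, query):
        return self.llm_math_chain.run(query)

# Two instances, each bound to its own key:
tool_a = CalculatorTool("key-A")
tool_b = CalculatorTool("key-B")
print(tool_a._run("3+3"))  # -> 6 (via key-A)
print(tool_b._run("3+3"))  # -> 6 (via key-B)
```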