Question: Does APIChain support the POST method?
Does APIChain support the POST method? How can we call an external POST API with an LLM? I'd appreciate any information, thanks.
I also want to know how to support the POST method, thank you.
Commenting for follow-up.
I am also getting an error when doing a POST API call. Any ideas?

    raise InvalidSchema(f"No connection adapters were found for {url!r}")
requests.exceptions.InvalidSchema: No connection adapters were found for 'POST
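That error usually means the whole LLM output (e.g. `POST https://...`) was handed to `requests` as if it were a URL, so `requests` finds no connection adapter for the `POST ` prefix. A minimal sketch (hypothetical helper, not part of LangChain) of separating the method from the URL before dispatching:

```python
def split_method_and_url(llm_output: str) -> tuple[str, str]:
    """Split an LLM line like 'POST https://api.example.com/items'
    into an HTTP method and a bare URL."""
    method, _, url = llm_output.strip().partition(" ")
    if method.upper() not in {"GET", "POST", "PUT", "PATCH", "DELETE"}:
        # No leading method token: assume GET and treat the whole string as the URL
        return "GET", llm_output.strip()
    return method.upper(), url.strip()

method, url = split_method_and_url("POST https://api.example.com/items")
print(method, url)  # POST https://api.example.com/items
```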
Since the class TextRequestsWrapper already implements the post method, we need the LLM to output the request method from the API docs and to generate the request body when necessary. Here is my solution:
- A new subclass of APIChain (you can change the output as you wish):
import json
from typing import Any, Dict, Optional

from templates.api import API_REQUEST_PROMPT, API_RESPONSE_PROMPT  # change this to the path you placed the templates
from langchain.chains import APIChain
from langchain.chains.llm import LLMChain
from langchain.prompts import BasePromptTemplate
from langchain.requests import TextRequestsWrapper
from langchain.schema import BaseLanguageModel


class PowerfulAPIChain(APIChain):
    def _call(self, inputs: Dict[str, str]) -> Dict[str, str]:
        question = inputs[self.question_key]
        request_info = self.api_request_chain.predict(
            question=question, api_docs=self.api_docs
        )
        print(f'request info: {request_info}')
        api_url, request_method, body = request_info.split('|')
        self.callback_manager.on_text(
            api_url, color="green", end="\n", verbose=self.verbose
        )
        # get the wrapper method with the same name (get, post, ...)
        request_func = getattr(self.requests_wrapper, request_method.lower())
        api_response = request_func(api_url, json.loads(body))
        self.callback_manager.on_text(
            api_response, color="yellow", end="\n", verbose=self.verbose
        )
        # answer = self.api_answer_chain.predict(
        #     question=question,
        #     api_docs=self.api_docs,
        #     api_url=api_url,
        #     api_response=api_response,
        # )
        # return {self.output_key: answer}
        return {self.output_key: api_response}

    @classmethod
    def from_llm_and_api_docs(
        cls,
        llm: BaseLanguageModel,
        api_docs: str,
        headers: Optional[dict] = None,
        api_url_prompt: BasePromptTemplate = API_REQUEST_PROMPT,
        api_response_prompt: BasePromptTemplate = API_RESPONSE_PROMPT,
        **kwargs: Any,
    ) -> APIChain:
        """Load chain from just an LLM and the api docs."""
        get_request_chain = LLMChain(llm=llm, prompt=api_url_prompt)
        requests_wrapper = TextRequestsWrapper(headers=headers)
        get_answer_chain = LLMChain(llm=llm, prompt=api_response_prompt)
        return cls(
            api_request_chain=get_request_chain,
            api_answer_chain=get_answer_chain,
            requests_wrapper=requests_wrapper,
            api_docs=api_docs,
            **kwargs,
        )
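The core trick in `_call` above is the `getattr` dispatch on the method name. A self-contained sketch of that parsing and routing, using a stand-in class in place of `TextRequestsWrapper` so it runs without LangChain:

```python
import json

class FakeRequestsWrapper:
    """Stand-in for TextRequestsWrapper: one method per HTTP verb."""
    def get(self, url, data=None):
        return f"GET {url}"
    def post(self, url, data=None):
        return f"POST {url} with {data}"

# The LLM is prompted to emit "url|METHOD|body" joined with pipes
request_info = 'https://api.example.com/orders|POST|{"item": "pizza"}'
api_url, request_method, body = request_info.split('|')

wrapper = FakeRequestsWrapper()
request_func = getattr(wrapper, request_method.lower())  # -> wrapper.post
response = request_func(api_url, json.loads(body))
print(response)  # POST https://api.example.com/orders with {'item': 'pizza'}
```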
- New API templates (used in the subclass):
from langchain.prompts.prompt import PromptTemplate

API_URL_PROMPT_TEMPLATE = """You are given the below API Documentation:
{api_docs}
Using this documentation, generate the full API url to call for answering the user question.
You should build the API url in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.
You should extract the request METHOD from the doc, and generate the BODY data in JSON format according to the user question if necessary. The BODY data could be an empty dict.
Question:{question}
"""

API_REQUEST_PROMPT_TEMPLATE = API_URL_PROMPT_TEMPLATE + """Output the API url, METHOD and BODY, join them with `|`. DO NOT GIVE ANY EXPLANATION."""

API_REQUEST_PROMPT = PromptTemplate(
    input_variables=[
        "api_docs",
        "question",
    ],
    template=API_REQUEST_PROMPT_TEMPLATE,
)

API_RESPONSE_PROMPT_TEMPLATE = (
    API_URL_PROMPT_TEMPLATE
    + """API url: {api_url}
Here is the response from the API:
{api_response}
Summarize this response to answer the original question.
Summary:"""
)

API_RESPONSE_PROMPT = PromptTemplate(
    input_variables=["api_docs", "question", "api_url", "api_response"],
    template=API_RESPONSE_PROMPT_TEMPLATE,
)
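Since `PromptTemplate` ultimately performs string substitution, here is a plain-Python sketch of what the request prompt looks like once `{api_docs}` and `{question}` are filled in (using an abridged, illustrative copy of the template and made-up doc text):

```python
# Abridged copy of API_REQUEST_PROMPT_TEMPLATE, for illustration only
template = """You are given the below API Documentation:
{api_docs}
You should extract the request METHOD from the doc, and generate the BODY data in JSON format according to the user question if necessary.
Question:{question}
Output the API url, METHOD and BODY, join them with `|`. DO NOT GIVE ANY EXPLANATION."""

rendered = template.format(
    api_docs="POST /orders creates an order. Body: {'item': <str>}",
    question="Order a pizza",
)
print(rendered)
```

A well-behaved LLM given this prompt should answer with a single pipe-joined line such as `https://api.example.com/orders|POST|{"item": "pizza"}`, which is exactly what `_call` splits apart.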
If you need to call the API asynchronously, you can modify the _acall method in a similar way.
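The async variant can use the same `getattr` trick; in the LangChain versions I've seen, TextRequestsWrapper exposes async counterparts named with an `a` prefix (`aget`, `apost`, ...). A runnable sketch of that dispatch with a stand-in wrapper:

```python
import asyncio
import json

class FakeAsyncWrapper:
    """Stand-in for the async side of a requests wrapper (aget/apost/...)."""
    async def aget(self, url, data=None):
        return f"GET {url}"
    async def apost(self, url, data=None):
        return f"POST {url} with {data}"

async def acall(wrapper, request_info: str) -> str:
    api_url, method, body = request_info.split('|')
    # prefix "a" to select the async variant of the verb
    request_func = getattr(wrapper, "a" + method.lower())
    return await request_func(api_url, json.loads(body))

result = asyncio.run(acall(FakeAsyncWrapper(), 'https://api.example.com/orders|POST|{}'))
print(result)  # POST https://api.example.com/orders with {}
```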
Hello, any plans to support this? APIChain is really cool. Great job, guys!
@RamanujanFan I'm trying to use your example, but self.callback_manager is None for me. Here's how I run it:
from langchain.llms import OpenAI
llm = OpenAI(temperature=0)
with open("openapi.json") as f:
api_docs = f.read()
chain = PowerfulAPIChain.from_llm_and_api_docs(llm=llm, api_docs=api_docs)
result = chain.run("What can I order?")
Does that make sense? It fails because callback_manager is None when on_text is called.
Thanks for this @RamanujanFan!
@smolendawid I had to comment out the callback manager calls, but after that I got it to work.
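Rather than commenting the calls out, a small guard (a hypothetical helper, not part of LangChain) keeps the chain working whether or not a callback manager is configured:

```python
def emit_text(callback_manager, text, **kwargs):
    """Forward text to the callback manager only if one is configured."""
    if callback_manager is not None:
        callback_manager.on_text(text, **kwargs)

# With no manager configured this is a silent no-op instead of an AttributeError:
emit_text(None, "https://api.example.com")

class Recorder:
    """Tiny stub with the on_text hook, to show the call goes through."""
    def __init__(self):
        self.seen = []
    def on_text(self, text, **kwargs):
        self.seen.append(text)

rec = Recorder()
emit_text(rec, "hello", color="green")
print(rec.seen)  # ['hello']
```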
While this is not supported by APIChain, I'd suggest having a look at get_openapi_chain, a SequentialChain that parses the OpenAPI spec and does work with POST, PUT, etc.:
https://github.com/langchain-ai/langchain/blob/539672a7fd2bf5c39013859205decd46d4427a29/libs/langchain/langchain/chains/openai_functions/openapi.py#L236
Hi, @luisxiaomai! I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, you were asking whether APIChain supports the POST method and how to call an external POST API with an LLM. RamanujanFan suggested a solution by creating a new subclass of APIChain and provided some code examples. It seems that smolendawid and camgreenburg encountered some issues with the provided solution, but they were able to find workarounds. luish also suggested using get_openapi_chain as an alternative solution.
Before we close this issue, we wanted to check if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your contribution to the LangChain repository!
With the latest version (0.0.354), note the following:
from langchain.schema import BaseLanguageModel
should be changed to
from langchain_core.language_models.base import BaseLanguageModel
It would be very helpful to add this PowerfulAPIChain implementation to LangChain. The suggested OpenAPI-spec implementation is very different: it requires an OpenAPI specification YAML file, which may not exist and can be difficult to generate. The suggested PowerfulAPIChain implementation solves this by describing the API in plain human language.
+1. The docs specifically state that APIChain makes GET, POST, etc. requests (see screenshot), but in the code only GET is implemented; see https://github.com/langchain-ai/langchain/blob/080af0ec5386f6b2d392e5587760dbf7344e4dec/libs/langchain/langchain/chains/api/base.py#L169
This leads me to speculate that POST and the other verbs are disabled for liability reasons, but it would be great to hear from LangChain whether this functionality will ever be enabled. In the meanwhile we may have to use a modified fork of LangChain with PowerfulAPIChain-type code like the above (btw, thanks @RamanujanFan).
Thanks for a fantastic OSS library though, LangChain... <3