langchain
Is langchain able to process batch prompt?
Sorry to disturb, but I wonder whether langchain can process a batch of prompts, or does it just process each text by calling llm(text)?
Currently just goes text by text. Should be easy to expose a batch method though.
What's the use case? Is it just to add it onto the LLM class, or do you want it as part of a chain as well?
Every text in a batch is independent and uses the same prompt template, but it is slow to call llm() text by text now. So I wonder if there exists a better method to get all the results faster :)
You can ask the LLM to process your list of entities one by one within a single prompt, so you "equivalently" get the batch result, but you also bear the risk of getting unwanted output when the list has a large number of elements.
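As a rough sketch of this workaround, you can pack the whole list into one prompt so a single LLM call covers the batch. The prompt wording and entity list below are illustrative, not from the thread; no LangChain APIs are involved:

```python
# Workaround sketch: put the entire list into a single prompt
# so one LLM call handles the whole batch.
# Caveat from the thread: with many items, the model may not
# answer in the requested format (unwanted output).

entities = ["apple", "banana", "cherry"]  # illustrative entities

# Number the items so the answer can be matched back to the input.
numbered = "\n".join(f"{i + 1}. {e}" for i, e in enumerate(entities))
prompt = (
    "For each item below, answer on its own line, "
    "keeping the same numbering:\n" + numbered
)

print(prompt)
```

The numbering makes the single response easier to split back into per-item results, but the parsing step is exactly where malformed output becomes a problem on long lists.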
There is now a generate endpoint, which LLMs can use to batch requests: https://langchain.readthedocs.io/en/latest/examples/prompts/llm_functionality.html
The batching is implemented only for OpenAI at the moment.
What LLM provider are you using?
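For the "same template, many texts" case described above, the batch call would look roughly like the sketch below. The template text and documents are placeholders, and the LangChain calls (shown in comments, since they need an API key) follow the generate endpoint from the linked docs:

```python
# Sketch of batching with the generate endpoint, assuming the
# langchain version discussed in this thread. Building the prompts
# is plain Python; the API call itself is shown in comments.

template = "Summarize the following text in one sentence:\n{text}"
texts = ["First document ...", "Second document ..."]  # placeholder inputs

# One prompt per text, all from the shared template.
prompts = [template.format(text=t) for t in texts]

# With langchain installed and OPENAI_API_KEY set, the batched call
# would look roughly like this:
#
#   from langchain.llms import OpenAI
#   llm = OpenAI()
#   result = llm.generate(prompts)          # batched instead of text by text
#   outputs = [gen[0].text for gen in result.generations]

print(prompts[0])
```

Compared with calling llm() in a loop, this hands the whole list to the provider at once, which is where the speedup comes from, though per the reply above it only applies to the OpenAI wrapper at this point.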
Hi, @Bookraint! I'm here to help the LangChain team manage their backlog and I wanted to let you know that we are marking this issue as stale.
From what I understand, you were asking if LangChain can process a batch of prompts or if it can only process one text at a time. hwchase17 responded that currently LangChain only goes text by text, but it should be easy to expose a batch method. You mentioned that you have multiple texts with the same prompt template and calling llm() text by text is slow. tonyabracadabra suggested asking LLM to process the list of entities one by one, but warned about the risk of getting unwanted output.
However, there seems to be a resolution to the issue. hwchase17 mentioned that there is now a generate endpoint for batching requests, although it is currently only implemented for OpenAI.
Now, we would like to know if this issue is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.
Thank you for your understanding and contribution to the LangChain project!