Azure OpenAI sometimes returns a message without text that is also missing the "content" key. LangChain should handle it gracefully.
System Info
langchain: 0.0.188 (but the same would happen on the latest master too), Python 3.10, Linux
Apparently, when the model generates no text (it depends on the specific set of prompts and messages; no stop words are used), the Azure OpenAI API responds with something like this:
{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "role": "assistant"
      }
    }
  ],
As you can see, "message" doesn't have a "content" key at all, while LangChain expects that key to always be there, so at some point a KeyError is raised because "content" is not available. Not the best API design on the AOAI side, but we should handle it and raise a dedicated exception. A good place for that seems to be here: https://github.com/hwchase17/langchain/blob/master/langchain/chat_models/openai.py#L368 (def _create_chat_result).
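To make the failure mode concrete, here is a minimal standalone reproduction of the KeyError using the sample payload above (plain Python, no LangChain involved):

    # Sample payload as returned by Azure OpenAI when the model generates no text.
    response = {
        "choices": [
            {
                "finish_reason": "stop",
                "index": 0,
                "message": {"role": "assistant"},
            }
        ],
    }

    # LangChain effectively performs the equivalent of this lookup, which fails
    # with KeyError: 'content' because the key is absent.
    text = response["choices"][0]["message"]["content"]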
I've "fixed" it by doing this:
def _create_chat_result(self, response: Mapping[str, Any]) -> ChatResult:
    generations = []
    for res in response["choices"]:
        if "content" not in res["message"]:  # <--- check if the key is missing
            raise EmptyResponseFromModel()  # <--- raise a custom exception that I intercept in the main code and react appropriately
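For context, a minimal sketch of how the custom exception could be defined and intercepted by the caller (the exception body and the wrapper below are my own assumptions; the issue only mentions the name EmptyResponseFromModel):

    class EmptyResponseFromModel(Exception):
        """Hypothetical exception raised when a choice's "message" has no "content" key."""

    def ask_model(chat_model, messages):
        """Hypothetical caller-side wrapper: intercept the exception and react
        appropriately, e.g. fall back to an empty answer or retry with adjusted prompts."""
        try:
            return chat_model(messages)
        except EmptyResponseFromModel:
            return ""  # or retry / log / surface a friendlier error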
Who can help?
No response
Information
- [ ] The official example notebooks/scripts
- [ ] My own modified scripts
Related Components
- [X] LLMs/Chat Models
- [ ] Embedding Models
- [ ] Prompts / Prompt Templates / Prompt Selectors
- [ ] Output Parsers
- [ ] Document Loaders
- [ ] Vector Stores / Retrievers
- [ ] Memory
- [ ] Agents / Agent Executors
- [ ] Tools / Toolkits
- [ ] Chains
- [ ] Callbacks/Tracing
- [ ] Async
Reproduction
I can't share the exact prompts and messages.
Expected behavior
We should handle this broken response from AOAI gracefully.
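One way to handle it gracefully, as a sketch only (whether to raise a dedicated exception or fall back to an empty string is a design decision for the maintainers; extract_contents below is a hypothetical helper, not LangChain code):

    from typing import Any, List, Mapping

    def extract_contents(response: Mapping[str, Any]) -> List[str]:
        """Pull the message texts out of a raw chat-completion response, tolerating
        choices whose "message" is missing the "content" key."""
        return [choice["message"].get("content", "") for choice in response["choices"]]

    # With the sample payload from this issue, this returns [""] instead of raising KeyError.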
Answer generated by 🤖
Answer
Thank you for bringing this issue to our attention and proposing a solution. It's great to see your initiative in fixing the problem.
Handling the broken response from the Azure OpenAI API gracefully is indeed important for LangChain's robustness. Your proposed fix in the _create_chat_result method looks like a good approach to handle the missing "content" key and raise a custom exception.
I encourage you to go ahead and create a pull request with your proposed solution. The maintainers will review your changes and determine if they should be merged into the LangChain codebase.
Thank you for your contribution to LangChain!
This response is meant to be helpful and save users time. It is not meant to be a precise solution.
Please leave a 👍 if this is helpful and 👎 if it is irrelevant
Any update regarding this one?
Hi, @berlogb! I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, the issue is that LangChain raises a KeyError when the Azure OpenAI API returns a message without the "content" key. The suggested solution is to handle this gracefully by raising a custom exception in the _create_chat_result function. Devstein has encouraged you to create a pull request with the proposed solution. Guyyakir has also asked for an update on the issue.
Before we close this issue, we wanted to check if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.
Thank you for your contribution to the LangChain repository!