langchain
Token usage calculation is not working for asynchronous requests in ChatOpenAI
Token usage calculation is not working:
The streaming API does not return this (token usage) in its response.
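A minimal sketch of the behavior being reported, under these assumptions: non-streaming OpenAI chat completions include a usage block that LangChain surfaces as `llm_output["token_usage"]`, while streamed chunks historically omitted it, so async/streaming calls see empty usage. The payload shapes below are simulated stand-ins (no network calls, no real LangChain objects); `extract_token_usage` is a hypothetical helper, not a LangChain API.

```python
def extract_token_usage(llm_output):
    """Return the token_usage dict from an llm_output-style mapping,
    or an empty dict when it is missing (as with streamed responses)."""
    return (llm_output or {}).get("token_usage", {})

# Simulated non-streaming result: the API reports usage, so counts are available.
sync_output = {
    "token_usage": {"prompt_tokens": 12, "completion_tokens": 7, "total_tokens": 19},
    "model_name": "gpt-3.5-turbo",
}

# Simulated streamed/async result: chunks carried no usage block,
# so the aggregated token_usage ends up empty.
streamed_output = {"token_usage": {}, "model_name": "gpt-3.5-turbo"}

assert extract_token_usage(sync_output)["total_tokens"] == 19
assert extract_token_usage(streamed_output) == {}
```

In practice, users typically read these counts through `get_openai_callback`, which is why the counter stays at zero when the underlying streamed response never reports usage.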
Hi, @sammichenVV! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, you reported an issue where token usage calculation does not work for asynchronous requests in ChatOpenAI. Imccccc commented on the issue, noting that the streaming API does not return token usage in its response.
Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your contribution to the LangChain repository!