None type not checked before adding UsageMetadata value in AIMessageChunk when using LLM streaming
Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangChain rather than my code.
- [X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

client = ChatOpenAI(
    api_key=API_KEY,
    base_url=PORTKEY_GATEWAY_URL,
    streaming=streaming,
    default_headers=portkey_headers,
    model=api_model_id,
    temperature=options.temperature,
    n=options.n,
    max_tokens=options.maxTokens,
)
messages = [HumanMessage(content='Some question')]

# The TypeError is raised while the stream is consumed.
for chunk in client.stream(messages):
    ...
Error Message and Stack Trace (if applicable)
| File "/Users/user/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 411, in stream
| raise e
| File "/Users/user/app/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 402, in stream
| generation += chunk
| File "/Users/user/app/.venv/lib/python3.12/site-packages/langchain_core/outputs/chat_generation.py", line 100, in __add__
| message=self.message + other.message,
| ~~~~~~~~~~~~~^~~~~~~~~~~~~~~
| File "/Users/user/app/.venv/lib/python3.12/site-packages/langchain_core/messages/ai.py", line 308, in __add__
| return add_ai_message_chunks(self, other)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/Users/user/app/.venv/lib/python3.12/site-packages/langchain_core/messages/ai.py", line 360, in add_ai_message_chunks
| usage_metadata_["total_tokens"] += other.usage_metadata["total_tokens"]
| TypeError: unsupported operand type(s) for +=: 'int' and 'NoneType'
+------------------------------------
Description
I'm using the ChatOpenAI class to stream LLM output from OpenAI-compatible API endpoints. In my case, when calling an Anthropic model (and possibly others), an exception is thrown because other.usage_metadata["total_tokens"] is None. The relevant code in add_ai_message_chunks is:
# Token usage
if left.usage_metadata or any(o.usage_metadata is not None for o in others):
    usage_metadata_: UsageMetadata = left.usage_metadata or UsageMetadata(
        input_tokens=0, output_tokens=0, total_tokens=0
    )
    for other in others:
        if other.usage_metadata is not None:
            usage_metadata_["input_tokens"] += other.usage_metadata["input_tokens"]
            usage_metadata_["output_tokens"] += other.usage_metadata[
                "output_tokens"
            ]
            usage_metadata_["total_tokens"] += other.usage_metadata["total_tokens"]
    usage_metadata: Optional[UsageMetadata] = usage_metadata_
else:
    usage_metadata = None
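To make the failure mode concrete, here is a plain-dict sketch (not LangChain's actual objects) of how the accumulation above breaks when a provider reports total_tokens as None:

# Plain-dict sketch of the accumulation above: a chunk whose usage metadata
# carries total_tokens=None breaks the += on the running total.
left_usage = {"input_tokens": 3, "output_tokens": 1, "total_tokens": 4}
other_usage = {"input_tokens": 0, "output_tokens": 1, "total_tokens": None}

usage_metadata_ = dict(left_usage)
usage_metadata_["input_tokens"] += other_usage["input_tokens"]
usage_metadata_["output_tokens"] += other_usage["output_tokens"]
try:
    usage_metadata_["total_tokens"] += other_usage["total_tokens"]
except TypeError as exc:
    print(exc)  # unsupported operand type(s) for +=: 'int' and 'NoneType'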
I think we should check for None values before attempting to add to the existing UsageMetadata like so:
# Token usage
if left.usage_metadata or any(o.usage_metadata is not None for o in others):
    usage_metadata_: UsageMetadata = left.usage_metadata or UsageMetadata(
        input_tokens=0, output_tokens=0, total_tokens=0
    )
    for other in others:
        if other.usage_metadata is not None:
            if other.usage_metadata.get("input_tokens") is not None:
                usage_metadata_["input_tokens"] += other.usage_metadata["input_tokens"]
            if other.usage_metadata.get("output_tokens") is not None:
                usage_metadata_["output_tokens"] += other.usage_metadata["output_tokens"]
            if other.usage_metadata.get("total_tokens") is not None:
                usage_metadata_["total_tokens"] += other.usage_metadata["total_tokens"]
    usage_metadata: Optional[UsageMetadata] = usage_metadata_
else:
    usage_metadata = None
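For illustration, here is a standalone sketch of the same None-safe idea as a hypothetical helper over plain dicts (not the actual LangChain patch); it treats missing or None counts as 0, which gives the same result as skipping them:

# Hypothetical helper (not LangChain API): merge per-chunk token counts,
# treating missing or None values as 0.
from typing import Optional

def merge_token_counts(
    left: dict[str, Optional[int]], other: dict[str, Optional[int]]
) -> dict[str, int]:
    merged = {}
    for key in ("input_tokens", "output_tokens", "total_tokens"):
        merged[key] = (left.get(key) or 0) + (other.get(key) or 0)
    return merged

# A provider that reports total_tokens as None no longer breaks the merge.
print(merge_token_counts(
    {"input_tokens": 12, "output_tokens": 5, "total_tokens": 17},
    {"input_tokens": 0, "output_tokens": 3, "total_tokens": None},
))  # {'input_tokens': 12, 'output_tokens': 8, 'total_tokens': 17}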
System Info
langchain-openai version: ^0.1.23
Platform: mac
Python version: 3.12.0
Could you include full system information?
python -m langchain_core.sys_info
Yes, here is my system info:
System Information
OS: Darwin
OS Version: Darwin Kernel Version 23.5.0: Wed May 1 20:12:58 PDT 2024; root:xnu-10063.121.3~5/RELEASE_ARM64_T6000
Python Version: 3.12.0 (main, Apr 21 2024, 02:41:04) [Clang 15.0.0 (clang-1500.3.9.4)]
@eyurtsev I double-checked and was able to replicate the issue. I have created a small PR to fix it.
@TheAppCrafter Issue has been fixed
@keenborder786 What was the fix for this issue? This pull request that was just closed above shows no files were changed https://github.com/langchain-ai/langchain/pull/26419
@eyurtsev Would you be able to help with this issue? It seems like a trivial fix. Some OpenAI-compatible APIs do not return token metadata, but LangChain's current implementation assumes the token metadata is always returned, which causes an exception to be thrown. We just need to check whether input_tokens, output_tokens, and total_tokens are None before adding to usage_metadata_. I cannot use LangChain in a production application because of this.
Hi, @TheAppCrafter. I'm Dosu, and I'm helping the LangChain team manage their backlog. I'm marking this issue as stale.
Issue Summary:
- You reported a TypeError in LangChain due to a NoneType not being checked before adding UsageMetadata.
- The issue persists even after updating to the latest version.
- User keenborder786 confirmed the issue and attempted a fix, but the pull request was closed without changes.
- You suggested adding a check for None values to address the issue, noting its impact on production.
Next Steps:
- Please let me know if this issue is still relevant to the latest version of LangChain by commenting here.
- If there is no further activity, I will automatically close this issue in 7 days.
Thank you for your understanding and contribution!
Not stale
@eyurtsev, the user @TheAppCrafter has indicated that the issue regarding the TypeError is still relevant and needs attention. Could you please assist them with this?
I just ran into this issue, not stale
I think this is still a bug; in my opinion, an exception should not be thrown if token counts are not returned. However, I found that you can explicitly ask for metadata such as token usage to be returned by setting stream_usage=True, which seems to have fixed the issue for me.
For example:
return ChatOpenAI(
    api_key=API_KEY,
    base_url=BASE_URL,
    streaming=streaming,
    default_headers=HEADERS,
    model=api_model_id,
    temperature=options.temperature,
    n=options.n,
    max_tokens=options.maxTokens,
    stream_usage=True,  # explicitly request token usage in the streamed response
)
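A quick way to check that the usage metadata actually arrives is to accumulate the streamed chunks and inspect the result. This is a hedged sketch; it assumes the ChatOpenAI instance returned above is bound to a variable called client (hypothetical name) and that the endpoint honours stream_usage:

# Sketch: accumulate streamed chunks and inspect the merged usage metadata.
# Assumes `client` is the ChatOpenAI instance constructed above.
from langchain_core.messages import HumanMessage

final = None
for chunk in client.stream([HumanMessage(content="Some question")]):
    final = chunk if final is None else final + chunk

# With stream_usage=True the provider should send a final usage chunk, so the
# accumulated message should carry input/output/total token counts.
print(final.usage_metadata)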
We encountered a TypeError specifically with the claude-3-7-sonnet-20250219 model.
The error occurred only in the Lambda environment using FastAPI and aws-lambda-web-adapter (local works fine).
We resolved this by removing the old version constraint (langchain_anthropic==0.3.10).
The issue still exists in the latest LangChain
File "***/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 956, in _generate
return self._create_chat_result(response, generation_info)
ā ā ā ā None
ā ā ā ChatCompletion(id=None, choices=[Choice(finish_reason='tool_calls', index=None, logprobs=None, message=ChatCompletionMessage(...
ā ā <function AzureChatOpenAI._create_chat_result at 0x7f7685d40c20>
ā AzureChatOpenAI(callbacks=[<app.services.llm_service.LlmLogger object at 0x7f766451abd0>, <app.services.llm_service.DelayHand...
File "***/lib/python3.12/site-packages/langchain_openai/chat_models/azure.py", line 715, in _create_chat_result
chat_result = super()._create_chat_result(response, generation_info)
ā ā None
ā ChatCompletion(id=None, choices=[Choice(finish_reason='tool_calls', index=None, logprobs=None, message=ChatCompletionMessage(...
File "***/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 1003, in _create_chat_result
message.usage_metadata = _create_usage_metadata(token_usage)
ā ā ā ā {'completion_tokens': None, 'prompt_tokens': 3624, 'total_tokens': 4064, 'completion_tokens_details': None, 'prompt_tokens_de...
ā ā ā <function _create_usage_metadata at 0x7f7685d07d80>
ā ā None
ā AIMessage(content='', additional_kwargs={'tool_calls': [{'id': '', 'function': {'arguments': '{"keyword":"***"}', 'name'...
File "***/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 2866, in _create_usage_metadata
total_tokens = oai_token_usage.get("total_tokens", input_tokens + output_tokens)
ā ā ā ā None
ā ā ā 3624
ā ā <method 'get' of 'dict' objects>
ā {'completion_tokens': None, 'prompt_tokens': 3624, 'total_tokens': 4064, 'completion_tokens_details': None, 'prompt_tokens_de...
TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'
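Note why this fires even though "total_tokens" is present: dict.get evaluates its default argument eagerly, so input_tokens + output_tokens is computed before the lookup, and output_tokens is None here. A minimal sketch of the failing expression using the values shown in the traceback (not the library's exact code):

# Minimal reproduction of the failing expression with the values from the
# traceback above. The default passed to .get() is evaluated eagerly, so the
# addition runs even though the "total_tokens" key exists.
oai_token_usage = {
    "completion_tokens": None,  # provider returned None instead of an int
    "prompt_tokens": 3624,
    "total_tokens": 4064,
}
input_tokens = oai_token_usage.get("prompt_tokens", 0)       # 3624
output_tokens = oai_token_usage.get("completion_tokens", 0)  # None: key exists, value is None

try:
    total_tokens = oai_token_usage.get("total_tokens", input_tokens + output_tokens)
except TypeError as exc:
    print(exc)  # unsupported operand type(s) for +: 'int' and 'NoneType'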
For now, I'm using an ugly workaround by manually patching the library code. Hoping for an official fix soon.
What's your openai version?
python -m langchain_core.sys_info
Sure!
System Information
------------------
> OS: Linux
> OS Version: #5.15.120.bsk.3 SMP Debian 5.15.120.bsk.3 Fri Dec 8 12:41:09 UTC
> Python Version: 3.12.10 (main, Apr 9 2025, 04:03:51) [Clang 20.1.0 ]
Package Information
-------------------
> langchain_core: 0.3.59
> langchain: 0.3.25
> langchain_community: 0.3.23
> langsmith: 0.3.26
> langchain_deepseek: 0.1.3
> langchain_openai: 0.3.12
> langchain_text_splitters: 0.3.8
> langgraph_sdk: 0.1.61
Optional packages not installed
-------------------------------
> langserve
Other Dependencies
------------------
> aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
> async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
> dataclasses-json<0.7,>=0.5.7: Installed. No version info available.
> httpx: 0.28.1
> httpx-sse<1.0.0,>=0.4.0: Installed. No version info available.
> jsonpatch<2.0,>=1.33: Installed. No version info available.
> langchain-anthropic;: Installed. No version info available.
> langchain-aws;: Installed. No version info available.
> langchain-azure-ai;: Installed. No version info available.
> langchain-cohere;: Installed. No version info available.
> langchain-community;: Installed. No version info available.
> langchain-core<1.0.0,>=0.3.47: Installed. No version info available.
> langchain-core<1.0.0,>=0.3.49: Installed. No version info available.
> langchain-core<1.0.0,>=0.3.51: Installed. No version info available.
> langchain-core<1.0.0,>=0.3.56: Installed. No version info available.
> langchain-core<1.0.0,>=0.3.58: Installed. No version info available.
> langchain-deepseek;: Installed. No version info available.
> langchain-fireworks;: Installed. No version info available.
> langchain-google-genai;: Installed. No version info available.
> langchain-google-vertexai;: Installed. No version info available.
> langchain-groq;: Installed. No version info available.
> langchain-huggingface;: Installed. No version info available.
> langchain-mistralai;: Installed. No version info available.
> langchain-ollama;: Installed. No version info available.
> langchain-openai;: Installed. No version info available.
> langchain-openai<1.0.0,>=0.3.9: Installed. No version info available.
> langchain-perplexity;: Installed. No version info available.
> langchain-text-splitters<1.0.0,>=0.3.8: Installed. No version info available.
> langchain-together;: Installed. No version info available.
> langchain-xai;: Installed. No version info available.
> langchain<1.0.0,>=0.3.24: Installed. No version info available.
> langsmith-pyo3: Installed. No version info available.
> langsmith<0.4,>=0.1.125: Installed. No version info available.
> langsmith<0.4,>=0.1.17: Installed. No version info available.
> numpy>=1.26.2;: Installed. No version info available.
> numpy>=2.1.0;: Installed. No version info available.
> openai-agents: Installed. No version info available.
> openai<2.0.0,>=1.68.2: Installed. No version info available.
> opentelemetry-api: Installed. No version info available.
> opentelemetry-exporter-otlp-proto-http: Installed. No version info available.
> opentelemetry-sdk: Installed. No version info available.
> orjson: 3.10.16
> packaging: 24.2
> packaging<25,>=23.2: Installed. No version info available.
> pydantic: 2.11.1
> pydantic-settings<3.0.0,>=2.4.0: Installed. No version info available.
> pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
> pydantic<3.0.0,>=2.7.4: Installed. No version info available.
> pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
> pytest: Installed. No version info available.
> PyYAML>=5.3: Installed. No version info available.
> requests: 2.32.3
> requests-toolbelt: 1.0.0
> requests<3,>=2: Installed. No version info available.
> rich: 14.0.0
> SQLAlchemy<3,>=1.4: Installed. No version info available.
> tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
> tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
> tiktoken<1,>=0.7: Installed. No version info available.
> typing-extensions>=4.7: Installed. No version info available.
> zstandard: 0.23.0
The OpenAI version is v1.70.0, which is not shown in the log above.
@hv0905 I will create a fix for it.
@hv0905 This has been fixed, so I think we can close the issue. Just upgrade to the latest versions of langchain_core and langchain_openai.
- This issue should've been resolved quite a while back.
- Users can upgrade to the latest version of langchain_core and their favorite chat model provider.
- If you encounter this issue again, please include the output of python -m langchain_core.sys_info.
- Please do NOT attempt to fix this in langchain-core by accepting None types. The fixes need to be in the chat model provider implementations.
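As a rough illustration of what such a provider-side fix could look like, here is a hypothetical sketch (not the actual langchain-openai code) that normalises None token counts before building UsageMetadata:

# Hypothetical provider-side sketch (not the actual langchain-openai code):
# coerce None token counts to ints before constructing UsageMetadata, so
# langchain-core can keep assuming integer fields.
from langchain_core.messages.ai import UsageMetadata

def build_usage_metadata(oai_token_usage: dict) -> UsageMetadata:
    input_tokens = oai_token_usage.get("prompt_tokens") or 0
    output_tokens = oai_token_usage.get("completion_tokens") or 0
    total_tokens = oai_token_usage.get("total_tokens") or (input_tokens + output_tokens)
    return UsageMetadata(
        input_tokens=input_tokens,
        output_tokens=output_tokens,
        total_tokens=total_tokens,
    )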