streamEvents doesn't return response_metadata
I'm not sure if this is a bug or my ignorance, but in the code below I'm using streamEvents, and no matter what I do, response_metadata is always empty. If I use .stream() I can get it, but then I struggle to keep the response streaming to the front end (may be a me issue). Is this a bug, or is there a way to grab this info?
Essentially, I need to stream the data to the client and then, once the response finishes, capture it so I can inspect things like errors or the finish reason.
```typescript
import { ChatVertexAI } from '@langchain/google-vertexai'
import { createReactAgent } from '@langchain/langgraph/prebuilt'
import { SystemMessage } from '@langchain/core/messages'

const llm = new ChatVertexAI({
  model: 'gemini-2.0-flash-001',
  temperature: 0,
  authOptions: {
    credentials: vertexProviderSettings.googleAuthOptions.credentials,
    projectId: vertexProviderSettings.project!
  },
  location: vertexProviderSettings.location
})

const agent = createReactAgent({
  llm,
  tools: [],
  /**
   * Modify the stock prompt in the prebuilt agent. See docs
   * for how to customize your agent:
   *
   * https://langchain-ai.github.io/langgraphjs/tutorials/quickstart/
   */
  messageModifier: new SystemMessage(AGENT_SYSTEM_TEMPLATE)
})

const eventStream = agent.streamEvents({ messages: body.messages }, { version: 'v2' })
```
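Roughly how I'm consuming the stream (simplified sketch; `sendToClient` is just a stand-in for the code that writes to the HTTP response):

```typescript
// Simplified sketch of the consuming loop; sendToClient is a placeholder
// for the real handler that writes each chunk to the HTTP response.
const sendToClient = (text: unknown) => { /* write to the response stream */ }

for await (const event of eventStream) {
  if (event.event === 'on_chat_model_stream') {
    // Forward the token content to the client as it arrives.
    sendToClient(event.data.chunk?.content)
  }
  // This is where I want to pick up response_metadata (finish reason,
  // errors, usage) once the model finishes.
}
```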
Just to add, this is from the Python docs, and it does not work with the JavaScript version.
Hello! I believe this to be a LangChain.js issue, since other LLM providers do output response metadata.
Transferring to LangChain.js (cc @afirstenberg)
@Godrules500 Can you try adding this?
```typescript
const eventStream = agent.streamEvents(
  { messages: body.messages },
  {
    stream_options: {
      include_usage: true
    },
    version: 'v2'
  }
);
```
Investigation Findings
I investigated this issue and found that the core LangChain.js streamEvents implementation correctly handles response_metadata.
How it works:
- In `_streamIterator()` (`chat_models.ts`), `chunk.generationInfo` is merged into `chunk.message.response_metadata`
- `AIMessageChunk.concat()` properly merges `response_metadata` via `mergeResponseMetadata()` (see the sketch after this list)
- The `on_chat_model_end` event in `streamEvents` includes the full message with `response_metadata`
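As a minimal sketch of that merge behavior (illustrative values, using `AIMessageChunk` from `@langchain/core`):

```typescript
import { AIMessageChunk } from '@langchain/core/messages'

// Two chunks as a streaming chat model might emit them: the text arrives
// first, and the generation info (finish reason, etc.) arrives on the
// final chunk (illustrative values).
const first = new AIMessageChunk({ content: 'Hello', response_metadata: {} })
const last = new AIMessageChunk({
  content: '!',
  response_metadata: { finish_reason: 'STOP' }
})

// concat() merges both the content and the response_metadata, so the
// accumulated message carries the metadata from the final chunk.
const merged = first.concat(last)
console.log(merged.content)           // "Hello!"
console.log(merged.response_metadata) // { finish_reason: "STOP" }
```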
Test verification:
I wrote a test that confirms response_metadata (including finish_reason, model_name, usage, etc.) flows through streamEvents correctly. See PR #9589.
Possible causes for your issue:
- Looking at the wrong event: `response_metadata` is available in the `on_chat_model_end` event, not in intermediate `on_chat_model_stream` events. Make sure you're checking:

  ```javascript
  for await (const event of eventStream) {
    if (event.event === 'on_chat_model_end') {
      console.log(event.data.output.response_metadata);
    }
  }
  ```

- LangGraph's `createReactAgent`: since you're using `createReactAgent` from LangGraph, there may be additional layers that affect how events are surfaced. The agent wraps the LLM, so you might need to look for nested events (see the sketch after this list).
- Vertex AI provider behavior: the Google Vertex AI provider does set `response_metadata`, but it may have different content than OpenAI's (e.g., `finish_reason` is in `generationInfo`, which gets merged into `response_metadata`).
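On the nested-events point, a rough sketch of inspecting the agent's event stream (assumes the `agent` and `body` from the original snippet; illustrative, not the only way to do it):

```typescript
const eventStream = agent.streamEvents({ messages: body.messages }, { version: 'v2' })

for await (const event of eventStream) {
  // Log every event name so you can see which runs (agent, chat model,
  // parsers) actually emit events when the LLM is wrapped by the agent.
  console.log(event.event, event.name)

  if (event.event === 'on_chat_model_stream') {
    // Intermediate chunks: this is what you forward to the client.
    // response_metadata is typically not populated yet here.
    console.log(event.data.chunk?.content)
  }

  if (event.event === 'on_chat_model_end') {
    // Final message from the wrapped LLM; this is where response_metadata
    // (finish reason, usage, model name) should show up.
    console.log(event.data.output.response_metadata)
  }
}
```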
Could you try:
- Checking the `on_chat_model_end` event specifically for `response_metadata`
- Using the LLM directly (without the agent) to verify whether `response_metadata` appears (see the sketch below this list)
- Logging all events to see what's available
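For the second point, a minimal sketch of checking the model on its own (same `llm` as in the original snippet; the prompt is just a placeholder):

```typescript
import { AIMessageChunk } from '@langchain/core/messages'

// Stream directly from the model and accumulate the chunks; the merged
// chunk should end up with the provider's response_metadata attached.
let finalChunk: AIMessageChunk | undefined
for await (const chunk of await llm.stream('Say hello in one word.')) {
  finalChunk = finalChunk === undefined ? chunk : finalChunk.concat(chunk)
}

console.log(finalChunk?.response_metadata) // e.g. finish reason / usage, per provider
```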
Hope this helps! Let me know if you're still seeing issues after checking these.
@nathannewyen thanks for contributing a test for this. Will go ahead and close.