Last chat delta received from `on_llm_new_delta` handler does not change to `complete` status
I'm using `LLMChain` from release v0.3.0-rc.1 in a LiveView, where the function is run in an async process. My code currently works well with the ChatOpenAI model. However, when I switch the model to ChatGoogleAI, the last chat delta received from the `on_llm_new_delta` handler is still in `:incomplete` status and never changes to `:complete`. Can anyone help with this issue? Thanks.
```elixir
def handle_params(_params, _uri, socket) do
  socket =
    socket
    |> assign(
      # assigned under :llm_chain so run_chain/1 and handle_info/2 read the same key
      :llm_chain,
      LLMChain.new!(%{
        llm:
          ChatGoogleAI.new!(%{
            model: "gemini-2.0-flash-exp",
            temperature: 0,
            request_timeout: 60_000,
            stream: true
          }),
        verbose: false
      })
    )
    |> assign(:async_result, %AsyncResult{})

  {:noreply, socket}
end
```
```elixir
defp run_chain(socket) do
  live_view_pid = self()

  chain_handler = %{
    on_tool_response_created: fn _chain, %Message{} = tool_message ->
      send(live_view_pid, {:tool_executed, tool_message})
    end
  }

  model_handler = %{
    on_llm_new_delta: fn _model, delta ->
      send(live_view_pid, {:chat_delta, delta})
    end
  }

  chain =
    socket.assigns.llm_chain
    |> LLMChain.add_callback(chain_handler)
    |> LLMChain.add_llm_callback(model_handler)

  socket
  |> assign(:async_result, AsyncResult.loading())
  |> start_async(:running_llm, fn ->
    case LLMChain.run(chain, mode: :while_needs_response) do
      {:ok, _updated_chain} ->
        :ok

      {:error, _updated_chain, %LangChain.LangChainError{message: reason}} ->
        {:error, reason}
    end
  end)
end
```
```elixir
def handle_info({:chat_delta, %LangChain.MessageDelta{} = delta}, socket) do
  updated_chain = LLMChain.apply_delta(socket.assigns.llm_chain, delta)
  {:noreply, assign(socket, :llm_chain, updated_chain)}
end
```
This sounds like a bug. I don't regularly use ChatGoogleAI. Is anyone else able to jump in who is using it?
Maybe I can check
Any word on this? Still broken
I think I found it. In `do_process_response`, in the clause commented `# Function Call in a MessageDelta`, it should be:

```elixir
%{
  role: unmap_role(content_data["role"]),
  content: text_content,
  complete: true,
  index: data["index"]
}
# <-- THIS LINE IS MISSING:
|> Utils.conditionally_add_to_map(:status, Map.get(data, "finishReason") |> finish_reason_to_status())
|> Utils.conditionally_add_to_map(:tool_calls, tool_calls_from_parts)
|> MessageDelta.new()
```
I need to provide more documentation for working with v0.4.0-rc.1, but changes were made to the `Utils` module in how streams of deltas are processed. Part of the goal was to catch the token usage metadata that's returned AFTER the deltas are completed when working with ChatOpenAI. The change takes a list of deltas and passes them through the `on_llm_new_delta` callback instead of firing it for each and every delta. The merge process changes a bit as well, and this is probably where the new docs are needed.
But there's a better chance of getting the final "complete" status this way.
With v0.4.0-rc.1, the way deltas get merged was changed to merge them in batches, because that's how they can be received sometimes. I've tested and verified it with ChatOpenAI and ChatAnthropic.
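If I'm reading that change right, handlers written against v0.3.x need updating, since the callback now fires with a batch (a list of deltas) rather than a single delta. Here's a hedged sketch of the adjusted handler shape, with a simulated invocation in place of a real model call — the exact callback arity and argument shape should be verified against the v0.4.0-rc.1 docs:

```elixir
live_view_pid = self()

model_handler = %{
  # v0.4.0-rc.1 reportedly fires the callback with a list of deltas
  # instead of once per delta (sketch; verify against the rc docs)
  on_llm_new_delta: fn _model, deltas when is_list(deltas) ->
    send(live_view_pid, {:chat_deltas, deltas})
  end
}

# simulate the model invoking the callback with a one-delta batch
model_handler.on_llm_new_delta.(nil, [%{content: "Hi", status: :incomplete}])

batch_size =
  receive do
    {:chat_deltas, deltas} -> length(deltas)
  after
    100 -> 0
  end
```

The corresponding `handle_info` clause would then pattern match on a list and apply the whole batch, rather than one delta at a time.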
If this is still a problem, then it's likely an issue with the specific chat model being used and PRs are welcome.