LangChain's agent does not support streaming output

Open HappyWHoo opened this issue 1 year ago • 4 comments

The agent is an AgentExecutor, and I'm unable to get streaming output working in Chainlit.

Can anyone help me?

HappyWHoo avatar Dec 14 '23 05:12 HappyWHoo

Are you not missing an input to the callback handler?

datapay-ai avatar Jan 01 '24 12:01 datapay-ai

I have the same issue: res = await llm_math.acall(message.content, callbacks=[cl.LangchainCallbackHandler()]) is deprecated and no longer valid. The AgentExecutor in LangChain now uses ainvoke, astream, etc., and the callbacks don't work.

I've tried hundreds of different combinations trying to figure this out. The current code I have that's not streaming:

res = await agent_executor.ainvoke({"input": message.content}, callbacks=[cl.LangchainCallbackHandler(stream_final_answer=True)])

using the model:

from langchain_openai import AzureChatOpenAI

model = AzureChatOpenAI(
    openai_api_version="2023-12-01-preview",
    azure_deployment="gpt-4-Prev",
    azure_endpoint=AZURE_OPENAI_ENDPOINT,
    api_key=AZURE_OPENAI_API_KEY,
    temperature=0,
    streaming=True,
)

oscar-lindholm avatar Jan 23 '24 12:01 oscar-lindholm
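
One thing that may be worth trying here (a minimal, untested sketch, not a confirmed fix): with the Runnable API, callbacks are usually passed inside the config mapping rather than as a separate keyword argument:

# Untested sketch: pass the Chainlit handler through config
# instead of as a bare keyword argument.
res = await agent_executor.ainvoke(
    {"input": message.content},
    config={"callbacks": [cl.LangchainCallbackHandler(stream_final_answer=True)]},
)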

I have the same problem: I can't stream the final answer with LlamaCpp.

gutmmm avatar Jan 26 '24 14:01 gutmmm

If somebody is looking for a solution to this issue, the following code works with a LangChain OpenAI agent:

import chainlit as cl
from langchain_core.messages import AIMessage, HumanMessage

@cl.on_chat_start
async def on_chat_start():
    # Define the agent
    sellbotix = Sellbotix(model_name="gpt-4-turbo-preview", temperature=0.7)
    runnable = await sellbotix.get_runnable()
    cl.user_session.set("sellbotix", runnable)
    chat_history = []
    cl.user_session.set("chat_history", chat_history)

@cl.on_message
async def main(message: cl.Message):
    sellbotix = cl.user_session.get("sellbotix") 
    chat_history = cl.user_session.get("chat_history")
    msg = cl.Message(content="")

    async for event in sellbotix.astream_events({"input": message.content, "chat_history": chat_history}, version="v1"):
        kind = event["event"]
        if kind == "on_chat_model_stream":
            content = event['data']["chunk"].content
            if content:
                await msg.stream_token(content)
        elif kind == "on_tool_start":
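            # Note: this opens and immediately closes a step when the tool
            # starts, so the tool's actual output is never captured here.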
            async with cl.Step(name=event['name']) as step:
                step.input = f"Starting tool: {event['name']} with inputs: {event['data'].get('input')}"
                step.output = f"Finishing tool: {event['name']} with inputs: {event['data'].get('input')}"

    chat_history.append(HumanMessage(content=message.content))
    chat_history.append(AIMessage(content=msg.content))
    cl.user_session.set("chat_history", chat_history)
    await msg.send()

Note that I don't use a callback, since the built-in callback isn't working. Instead I simply use the new astream_events() method to get a stream from the chat model and push the chunks into the msg object.

jxraynaud avatar Mar 27 '24 14:03 jxraynaud

Hey @jxraynaud, I have set cot to "full" in config.toml, but with the code above I can't see the tool calls LangChain makes in the UI. When I used the non-streaming syntax (just graph.invoke) from https://langchain-ai.github.io/langgraph/tutorials/customer-support/customer-support/#state-assistant, I could see which function calls were being made. Can you share some details on how to show those function calls while streaming?

ambiSk avatar Sep 03 '24 09:09 ambiSk
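
One possible way to surface tool calls while streaming (an untested sketch: the event names follow the astream_events "v1" schema, and runnable, inputs, and msg stand for the objects already set up in the snippet above) is to open a cl.Step on on_tool_start and close it on on_tool_end, keyed by the event's run_id:

# Untested sketch: render each tool call as its own finished step
# while still streaming the chat model's tokens.
steps = {}  # run_id -> cl.Step

async for event in runnable.astream_events(inputs, version="v1"):
    kind = event["event"]
    if kind == "on_chat_model_stream":
        content = event["data"]["chunk"].content
        if content:
            await msg.stream_token(content)
    elif kind == "on_tool_start":
        step = cl.Step(name=event["name"], type="tool")
        step.input = event["data"].get("input")
        await step.send()
        steps[event["run_id"]] = step
    elif kind == "on_tool_end":
        step = steps.pop(event["run_id"], None)
        if step is not None:
            step.output = event["data"].get("output")
            await step.update()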

This is my code, and it runs well:

import chainlit as cl
from langchain_core.runnables import RunnableConfig

@cl.on_message
async def main(message: cl.Message):
    ......
    # astream() returns an async generator, so it has to be iterated to run;
    # the Chainlit callback handler takes care of streaming to the UI.
    async for _ in agent.astream(
        input={"messages": message_history},
        config=RunnableConfig(
            configurable={"thread_id": user.id},
            recursion_limit=15,
            callbacks=[
                cl.AsyncLangchainCallbackHandler(
                    stream_final_answer=True,
                    force_stream_final_answer=True,
                )
            ],
        ),
    ):
        pass

Valdanitooooo avatar Sep 03 '24 10:09 Valdanitooooo
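
As far as I can tell, the two working approaches in this thread differ mainly in who does the rendering: the AsyncLangchainCallbackHandler route lets Chainlit translate LangChain's callbacks into UI steps and (with stream_final_answer=True) stream the final answer automatically, while the astream_events() loop does that translation by hand, which gives finer control over which events become steps.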