
LangchainTracer throwing ValueErrors

Open maciejwie opened this issue 1 year ago • 3 comments

Describe the bug A regression was introduced in 1.1.400 (it works fine in 1.1.306): some Langchain chains are no longer processed correctly, and errors such as the following are thrown:

Error in LangchainTracer.on_chain_end callback: ValueError('not enough values to unpack (expected 2, got 0)')
Error in LangchainTracer.on_chain_end callback: TypeError('cannot unpack non-iterable NoneType object')
Error in LangchainTracer.on_retriever_end callback: ValueError('too many values to unpack (expected 2)')
Error in LangchainTracer.on_chain_end callback: ValueError('too many values to unpack (expected 2)')

After adding some instrumentation in the callback, my project shows:

current_step <chainlit.step.Step object at 0x16a3cddf0>
run.outputs {'output': set()}
outputs {'output': set()}
output_keys ['output']
output set()
current_step <chainlit.step.Step object at 0x16a3cddf0>
2024-07-29 15:46:47 - Error in LangchainTracer.on_chain_end callback: ValueError('not enough values to unpack (expected 2, got 0)')
current_step <chainlit.step.Step object at 0x16a410610>
run.outputs {'output': None}
outputs {'output': None}
output_keys ['output']
output None
current_step <chainlit.step.Step object at 0x16a410610>
2024-07-29 15:46:47 - Error in LangchainTracer.on_chain_end callback: TypeError('cannot unpack non-iterable NoneType object')
current_step <chainlit.step.Step object at 0x16a3cd880>
run.outputs {'documents': [Document(real Document data)]}
outputs {'documents': [Document(real Document data)]}
output_keys ['documents']
output [Document(real Document data)]
current_step <chainlit.step.Step object at 0x16a3cd880>
2024-07-29 15:46:49 - Error in LangchainTracer.on_retriever_end callback: ValueError('too many values to unpack (expected 2)')
current_step <chainlit.step.Step object at 0x16a3ffee0>
run.outputs {'chat_history': []}
outputs {'chat_history': []}
output_keys ['chat_history']
output []
current_step <chainlit.step.Step object at 0x16a3ffee0>
2024-07-29 15:46:49 - Error in LangchainTracer.on_chain_end callback: ValueError('not enough values to unpack (expected 2, got 0)')
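The instrumentation suggests the tracer is unpacking each output value into a pair. A minimal sketch (assumption: the callback does something equivalent to `a, b = output`; this is not Chainlit's actual code) reproduces the same three exceptions from the logged values above:

```python
def try_unpack(output):
    """Attempt a two-value unpack, as the tracer appears to do."""
    try:
        a, b = output
        return "ok"
    except (ValueError, TypeError) as e:
        return f"{type(e).__name__}: {e}"

print(try_unpack(set()))   # ValueError: not enough values to unpack (expected 2, got 0)
print(try_unpack(None))    # TypeError: cannot unpack non-iterable NoneType object
print(try_unpack(["doc1", "doc2", "doc3"]))  # ValueError: too many values to unpack (expected 2)
```

Each shape seen in the logs (empty set, None, list of Documents) maps onto one of the logged errors.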

To Reproduce Steps to reproduce the behavior:

  1. Run this code:
from __future__ import annotations

from operator import itemgetter
from typing import Optional

import chainlit as cl
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
)
from langchain_core.runnables import Runnable, RunnableLambda, RunnablePassthrough
from langchain.memory import ConversationBufferWindowMemory
from langchain.schema.runnable.config import RunnableConfig
from langchain.schema import StrOutputParser
from langchain_openai import ChatOpenAI


@cl.on_chat_start
async def start():
    llm = ChatOpenAI()
    memory = ConversationBufferWindowMemory(
        memory_key="chat_history",
        return_messages=True,
        k=5,
    )
    # initialize empty memory
    memory.load_memory_variables({})

    template = "You are a helpful chatbot."

    prompt = ChatPromptTemplate.from_messages(
        [
            SystemMessagePromptTemplate.from_template(template),
            MessagesPlaceholder(variable_name="chat_history"),
            HumanMessagePromptTemplate.from_template("{question}"),
        ]
    )

    runnable = RunnablePassthrough.assign(
        chat_history=RunnableLambda(memory.load_memory_variables) | itemgetter("chat_history")
    ).assign(output=prompt | llm | StrOutputParser())

    # Store session data
    cl.user_session.set("runnable", runnable)
    cl.user_session.set("memory", memory)

@cl.on_message
async def run(message: cl.Message):
    # Retrieve session data
    runnable: Optional[Runnable] = cl.user_session.get("runnable")

    # Create message object for the response
    msg = cl.Message(content="")

    # Generate the response, and stream the output
    async for chunk in runnable.astream(
        {"question": message.content},
        config=RunnableConfig(callbacks=[cl.LangchainCallbackHandler()]),
    ):
        # Update the message with the response
        if chunk.get("output"):
            await msg.stream_token(chunk.get("output"))

    # Finalize message and update UI
    await msg.send()
  2. Send any message
  3. Look at the logs for the string Error in LangchainTracer.on_chain_end callback: ValueError('not enough values to unpack (expected 2, got 0)')
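Given the output shapes above, one possible fix direction is to guard the unpack instead of assuming every value is a two-item pair. This is a hypothetical sketch (`_normalize_output` is not a Chainlit function), not the library's actual implementation:

```python
def _normalize_output(value):
    """Hypothetical guard: only treat genuine 2-item sequences as pairs;
    pass empty sets, None, document lists, etc. through untouched."""
    if isinstance(value, (tuple, list)) and len(value) == 2:
        return tuple(value), True
    return value, False

print(_normalize_output(set()))       # (set(), False)
print(_normalize_output(None))        # (None, False)
print(_normalize_output(("k", "v")))  # (('k', 'v'), True)
```

With a guard like this, the callback could fall back to rendering the raw value rather than raising.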

Expected behavior LangchainTracer should process these chains without throwing ValueErrors.

Desktop (please complete the following information):

  • OS: MacOS
  • Langchain version: tried both 0.2.7 and 0.2.11 (latest)

maciejwie avatar Jul 29 '24 20:07 maciejwie