
Streaming=True not working when I integrate LangChain.

Open altafr opened this issue 1 year ago • 2 comments

```python
import os

from langchain import PromptTemplate, OpenAI, LLMChain
import chainlit as cl

# os.environ["OPENAI_API_KEY"] = "YOUR_OPEN_AI_API_KEY"

template = """Question: {question}

Answer: Let's think step by step."""

llm = OpenAI(temperature=0, streaming=True)


@cl.langchain_factory
def factory():
    prompt = PromptTemplate(template=template, input_variables=["question"])
    llm_chain = LLMChain(prompt=prompt, llm=llm, verbose=True)
    return llm_chain
```

altafr avatar Jun 04 '23 08:06 altafr

At the moment only the intermediate steps should be streamed, not the final response. Is that what you are seeing, or are even the intermediate steps not being streamed? See https://github.com/Chainlit/chainlit/issues/7#issuecomment-1569969095 for context.

willydouhard avatar Jun 04 '23 09:06 willydouhard

```python
import os

from langchain import PromptTemplate, OpenAI, LLMChain
import chainlit as cl

# os.environ["OPENAI_API_KEY"] = "YOUR_OPEN_AI_API_KEY"

template = """Question: {question}

Answer: Let's think step by step."""

llm = OpenAI(temperature=0, streaming=True)


@cl.langchain_factory
def factory():
    prompt = PromptTemplate(template=template, input_variables=["question"])
    llm_chain = LLMChain(prompt=prompt, llm=llm, verbose=True)
    return llm_chain
```

Since you are using an LLMChain, the streamed content appears inside the working box that pops up. If you ask it to tell you a 200-word story and then expand the box, you will see that the content within is streaming.

Please confirm if this is working for you so we can close this issue.

If instead you want to stream the content directly, below is a simple example of how to feed a message straight back to the UI using streaming.

```python
import chainlit as cl

from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

llm = ChatOpenAI(streaming=True)


@cl.on_message
def main(message: str):
    # The Message content must be set to blank before triggering the LLM.
    cl.Message(content="")
    llm([HumanMessage(content=message)])
```

The above workaround using cl.Message(content="") is required as of version 0.2.109 and will be fixed in a future release.
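For context, `streaming=True` works by invoking a callback for each token as it is generated, rather than returning the full completion at once. The sketch below illustrates that mechanism in plain Python with no LangChain or Chainlit dependency; the `on_llm_new_token` method name mirrors LangChain's callback hook, but the `TokenCollector` class and `fake_streaming_llm` function are purely illustrative stand-ins, not real library APIs:

```python
class TokenCollector:
    """Mimics a LangChain-style callback handler: receives tokens one by one."""

    def __init__(self):
        self.buffer = []

    def on_llm_new_token(self, token: str) -> None:
        # In Chainlit, this is the point where each token would be
        # pushed to the UI as it arrives.
        self.buffer.append(token)

    @property
    def text(self) -> str:
        return "".join(self.buffer)


def fake_streaming_llm(prompt: str, handler: TokenCollector) -> str:
    # Stand-in for an LLM called with streaming=True: it emits tokens
    # through the handler as they are "generated" instead of returning
    # the whole answer only at the end.
    for token in ["Let's ", "think ", "step ", "by ", "step."]:
        handler.on_llm_new_token(token)
    return handler.text


handler = TokenCollector()
result = fake_streaming_llm("Question: ...", handler)
print(result)  # Let's think step by step.
```

The key point is that the UI update happens inside the callback, which is why the Chainlit example above has to create the blank `cl.Message` before calling the LLM: the streamed tokens need a message to be appended to.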

gruckion avatar Jun 06 '23 12:06 gruckion

thanks

altafr avatar Jun 11 '23 12:06 altafr