Does langcorn support streaming output?
Short answer: no.
```python
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# Streaming already works at the LangChain level: tokens are printed to stdout as they are generated.
chat = ChatOpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)
resp = chat([HumanMessage(content="Write me a song about sparkling water.")])
```
Supporting streaming is possible, but right now it does not fit 99% of the common scenarios for this library. Let me know if you believe otherwise.
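For reference, if streaming were ever added, one way to expose it over HTTP would be FastAPI's `StreamingResponse` combined with LangChain's `AsyncIteratorCallbackHandler`. The sketch below is not langcorn's API; the `/stream` endpoint and its `q` query parameter are made up for illustration.

```python
# Minimal sketch (assumed endpoint, not part of langcorn): stream LLM tokens
# to the client as they are produced, using FastAPI + LangChain callbacks.
import asyncio

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from langchain.callbacks import AsyncIteratorCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

app = FastAPI()


@app.get("/stream")
async def stream(q: str):
    # One handler per request; it buffers tokens as the model emits them.
    handler = AsyncIteratorCallbackHandler()
    chat = ChatOpenAI(streaming=True, callbacks=[handler], temperature=0)

    # Run the LLM call in the background so tokens can be consumed while it runs.
    task = asyncio.create_task(chat.agenerate([[HumanMessage(content=q)]]))

    async def token_stream():
        async for token in handler.aiter():
            yield token
        await task  # surface any exception raised by the LLM call

    return StreamingResponse(token_stream(), media_type="text/plain")
```

The main cost of this approach is that every served chain needs a per-request callback handler and an async code path, which is the kind of complexity the answer above weighs against the typical request/response usage.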