[StreamlitCallbackHandler] - Not compatible with LangGraph
Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangChain rather than my code.
Example Code
import streamlit as st

from langchain.agents import AgentType, create_openai_functions_agent
from langchain.callbacks.streamlit import StreamlitCallbackHandler
from langchain_community.agent_toolkits import SQLDatabaseToolkit
from langchain_community.utilities.sql_database import SQLDatabase
from langchain_core.prompts.chat import (
    AIMessage,
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessage,
)
from langgraph.prebuilt import create_agent_executor

# NOTE: `engine`, `tables`, `llm`, `SQL_PREFIX`, `SQL_FUNCTIONS_SUFFIX`, and
# `app` (the LangGraph executor invoked below) are defined elsewhere and not shown.
st_cb = StreamlitCallbackHandler(st.container())

db = SQLDatabase(engine=engine, include_tables=tables)
toolkit = SQLDatabaseToolkit(db=db, llm=llm)
sql_tools = toolkit.get_tools()

messages = [
    SystemMessage(content=SQL_PREFIX),
    HumanMessagePromptTemplate.from_template("{input}"),
    AIMessage(content=SQL_FUNCTIONS_SUFFIX),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
]
input_variables = ["input", "agent_scratchpad"]
prompt = ChatPromptTemplate(input_variables=input_variables, messages=messages)

sql_agent_runnable = create_openai_functions_agent(llm, sql_tools, prompt)

result = app.invoke(
    {"input": "What is the table about?", "chat_history": []},
    {"callbacks": [st_cb]},
)
Error Message and Stack Trace (if applicable)
2024-02-10 11:04:02.381 Thread 'ThreadPoolExecutor-15_0': missing ScriptRunContext
Error in StreamlitCallbackHandler.on_llm_start callback: NoSessionContext()
Error in StreamlitCallbackHandler.on_llm_end callback: RuntimeError('Current LLMThought is unexpectedly None!')
Error in StreamlitCallbackHandler.on_tool_start callback: RuntimeError('Current LLMThought is unexpectedly None!')
Error in StreamlitCallbackHandler.on_tool_end callback: RuntimeError('Current LLMThought is unexpectedly None!')
Description
I am trying to use StreamlitCallbackHandler with LangGraph, just as I can successfully do with LangChain. From what I can observe, the internal format diverges drastically between LangChain and LangGraph. Does this mean that StreamlitCallbackHandler will not be compatible with LangGraph?
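For comparison, the LangChain-only setup that does render correctly with the handler looks roughly like the sketch below; it reuses sql_tools, sql_agent_runnable, and st_cb from the example code above and is a sketch rather than the exact working code:

from langchain.agents import AgentExecutor

# Classic LangChain executor wrapped around the same agent runnable.
agent_executor = AgentExecutor(agent=sql_agent_runnable, tools=sql_tools)

# Passing the Streamlit handler via the config renders the agent's thoughts
# in the Streamlit container as expected.
result = agent_executor.invoke(
    {"input": "What is the table about?"},
    {"callbacks": [st_cb]},
)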
System Info
langchain==0.1.0
langchain-community==0.0.12
langchain-core==0.1.14
langchain-experimental==0.0.49
langchain-openai==0.0.3
langgraph==0.0.20
It should still be compatible - I don't see langgraph used above - could you give an example of how you're including it in the graph?
I am also unable to get this working, and I'm not sure where exactly it would be best to add the Streamlit callback. Do you already have a small example of a LangGraph app with the Streamlit callback?
Bump. Also running into this - it doesn't seem compatible with LangGraph.
# ...
graph_config = RunnableConfig()
st_callback = StreamlitCallbackHandler(answer_container)
graph_config['callbacks'] = [st_callback]
graph_config['hyperparameters'] = st.session_state.graph_hyperparameters
graph_config['metadata'] = {
    "conversation_id": st.session_state.session_id
}

graph_input = {"input": st.session_state.input, "messages": st.session_state.convo_history}
graph = build_graph(use_open_routing=False)

st.json(graph_config)

async for event in graph.astream_events(
    input=graph_input,
    config=graph_config,
    version='v1',
):
    pass
Running without async via graph.invoke() also fails and shows:
2024-04-26 22:22:30.405 Thread 'ThreadPoolExecutor-1_0': missing ScriptRunContext
Error in StreamlitCallbackHandler.on_llm_start callback: NoSessionContext()
and, as above, running graph.astream_events() fails with:
2024-04-26 22:04:06.701 Thread 'asyncio_2': missing ScriptRunContext
Error in StreamlitCallbackHandler.on_llm_start callback: NoSessionContext()
I'm unable to find relevant documentation to remedy this. I'm also using code snippets that I know work, from https://github.com/langchain-ai/streamlit-agent/blob/main/streamlit_agent/mrkl_demo.py
But that example uses LangChain, not LangGraph, which may be the issue here.
I read that some people have found a solution (see the Streamlit issue), but I'm unable to implement it correctly.
Essentially, LangGraph is creating threads, which interferes with the Streamlit callback.
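As far as I understand it, the workaround in that Streamlit issue comes down to attaching the main script's run context to whatever worker thread ends up firing the callbacks. A minimal sketch of the mechanism for a thread you create yourself (LangGraph's internally created threads are the part I can't reach):

import threading

from streamlit.runtime.scriptrunner import add_script_run_ctx, get_script_run_ctx

# Capture the ScriptRunContext of the main Streamlit script thread.
ctx = get_script_run_ctx()

def run_agent():
    # Whatever triggers the StreamlitCallbackHandler callbacks goes here,
    # e.g. graph.invoke(graph_input, graph_config).
    ...

t = threading.Thread(target=run_agent)
# Attach the script context so Streamlit calls made from this thread
# (including the callback handler's UI updates) don't raise NoSessionContext.
add_script_run_ctx(t, ctx)
t.start()
t.join()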
I have met a similar issue when trying to use StreamlitCallbackHandler with Streamlit and LangChain:
2024-07-24 18:09:36.481 Thread 'asyncio_0': missing ScriptRunContext
After a deep dive, I found that asyncio is creating threads here, and those threads don't have a ScriptRunContext.
Based on the solution in https://github.com/streamlit/streamlit/issues/1326, I implemented MyThreadPoolExecutor, which extends ThreadPoolExecutor and calls add_script_run_ctx(t) after each thread is created:
import weakref
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures.thread import _worker
import threading

from streamlit.runtime.scriptrunner import add_script_run_ctx

_threads_queues = weakref.WeakKeyDictionary()


class MyThreadPoolExecutor(ThreadPoolExecutor):
    def _adjust_thread_count(self):
        # if idle threads are available, don't spin new threads
        if self._idle_semaphore.acquire(timeout=0):
            return

        # When the executor gets lost, the weakref callback will wake up
        # the worker threads.
        def weakref_cb(_, q=self._work_queue):
            q.put(None)

        num_threads = len(self._threads)
        if num_threads < self._max_workers:
            thread_name = '%s_%d' % (self._thread_name_prefix or self,
                                     num_threads)
            t = threading.Thread(name=thread_name, target=_worker,
                                 args=(weakref.ref(self, weakref_cb),
                                       self._work_queue,
                                       self._initializer,
                                       self._initargs))
            # Attach the Streamlit ScriptRunContext to the worker thread
            # before it starts, so callbacks running on it can touch the UI.
            add_script_run_ctx(t)
            t.start()
            self._threads.add(t)
            _threads_queues[t] = self._work_queue
and then set it as the default executor for the asyncio event loop:
asyncio.get_event_loop().set_default_executor(MyThreadPoolExecutor(max_workers=2))
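Roughly, the wiring looks like this sketch (run_app is just a placeholder for whatever coroutine drives the chain or graph with callbacks=[StreamlitCallbackHandler(...)]):

import asyncio

async def run_app():
    # e.g. await chain.ainvoke(inputs, config={"callbacks": [st_cb]})
    ...

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
# Route asyncio's run_in_executor work through the context-aware pool,
# so the threads it spawns carry the Streamlit ScriptRunContext.
loop.set_default_executor(MyThreadPoolExecutor(max_workers=2))
loop.run_until_complete(run_app())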
Finally, StreamlitCallbackHandler works fine!