langserve

Setting max_execution_time of Agent does not work for /stream

Open fatimatayeb opened this issue 1 year ago • 1 comment

Based on https://python.langchain.com/docs/modules/agents/how_to/max_time_limit, we can set a timeout for agents so that a message is returned to the user when the limit is hit. This works in LangChain.
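
Roughly, the setup looks like this (a minimal sketch based on the linked docs; agent and tools are placeholders assumed to be defined as on that page):

from langchain.agents import AgentExecutor

# agent and tools are assumed to be constructed as in the linked docs
agent_executor = AgentExecutor(
    agent=agent,
    tools=tools,
    max_execution_time=1,  # seconds; when hit, the agent stops early and returns a time-limit message
    verbose=True,
)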

However, when using this feature with LangServe, specifically with the /stream endpoint, it fails with an error (see trace below).

I tried using @app.exception_handler(Exception) from FastAPI; it works with the /invoke endpoint but not with /stream.
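
The handler I tried looked roughly like this (a minimal sketch; the handler name and the JSONResponse payload are only illustrative):

from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()

@app.exception_handler(Exception)
async def handle_unexpected_error(request: Request, exc: Exception):
    # Catches the TimeoutError for /invoke, but /stream still fails as shown in the trace below.
    return JSONResponse(status_code=500, content={"detail": str(exc)})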

How can we set up a timeout for the /stream endpoint?

Trace:

[chain/error] [1:chain:AgentExecutor] [1.00s] Chain run errored with error:
"TimeoutError()"
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/fastapi/applications.py", line 1115, in __call__
    await super().__call__(scope, receive, send)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/starlette/routing.py", line 69, in app
    await response(scope, receive, send)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/sse_starlette/sse.py", line 233, in __call__
    async with anyio.create_task_group() as task_group:
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/sse_starlette/sse.py", line 236, in wrap
    await func()
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/sse_starlette/sse.py", line 221, in stream_response
    async for data in self.body_iterator:
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/langserve/server.py", line 797, in _stream
    async for chunk in runnable.astream(
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/langchain/schema/runnable/base.py", line 2845, in astream
    async for item in self.bound.astream(
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/langchain/schema/runnable/base.py", line 501, in astream
    yield await self.ainvoke(input, config, **kwargs)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/langchain/chains/base.py", line 103, in ainvoke
    return await self.acall(
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/langchain/chains/base.py", line 379, in acall
    raise e
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/langchain/chains/base.py", line 373, in acall
    await self._acall(inputs, run_manager=run_manager)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/langchain/agents/agent.py", line 1213, in _acall
    async with asyncio_timeout(self.max_execution_time):
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/async_timeout/__init__.py", line 141, in __aexit__
    self._do_exit(exc_type)
  File "/home/llm/anaconda3/envs/fleet/lib/python3.10/site-packages/async_timeout/__init__.py", line 228, in _do_exit
    raise asyncio.TimeoutError
asyncio.exceptions.TimeoutError


fatimatayeb avatar Dec 28 '23 07:12 fatimatayeb

@fatimatayeb I am unable to reproduce this. The agent successfully stops for me on both the stream and stream_log endpoints.

I used this agent:

https://github.com/langchain-ai/langserve/blob/3c94bf39b498e5f91d354c4c438676bafcb46baa/examples/agent/server.py#L56-L56

And modified:

agent_executor = AgentExecutor(agent=agent, tools=tools, max_execution_time=0.1)

or

agent_executor = AgentExecutor(agent=agent, tools=tools, max_execution_time=3)

And also added a delay to the tool:

@tool
async def get_eugene_thoughts(query: str) -> list:
    """Returns Eugene's thoughts on a topic."""
    import asyncio

    # Simulate a slow tool so the agent hits max_execution_time.
    await asyncio.sleep(10)
    return retriever.get_relevant_documents(query)
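
For completeness, the /stream endpoint can be exercised with langserve's RemoteRunnable client, e.g. (a sketch; the server URL, mount path, and input payload are assumptions based on the example server):

import asyncio
from langserve import RemoteRunnable

remote = RemoteRunnable("http://localhost:8000/")  # assumes the agent is mounted at the root path

async def main():
    # Streams chunks until the agent finishes or stops due to max_execution_time.
    async for chunk in remote.astream({"input": "what does eugene think of cats?"}):
        print(chunk)

asyncio.run(main())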

Output that I am seeing:

[screenshot: streamed output showing the agent stopping due to the time limit]

eyurtsev avatar Jan 05 '24 17:01 eyurtsev