[BUG]: Tavily Search error after sending a `Generate Report` request

[Open] emocat17 opened this issue 5 months ago · 5 comments

After entering a prompt and clicking Research, the UI reports an error:

Image

The next-server console output is as follows:

> [email protected] dev C:\GitWorks\deer-flow\web
> dotenv -e ../.env -- next dev --turbo

   ▲ Next.js 15.3.0 (Turbopack)
   - Local:        http://localhost:3000
   - Network:      http://192.168.1.107:3000

 ✓ Starting...
Creating turbopack project { dir: 'C:\\GitWorks\\deer-flow\\web', testMode: true }
 ✓ Ready in 3.5s
 ○ Compiling / ...
 ✓ Compiled / in 4.7s
 GET / 200 in 6367ms
 ○ Compiling /chat ...
 ✓ Compiled /chat in 4s
 GET /chat 200 in 4138ms

The uv.exe console output is as follows:

2025-07-23 16:45:45,068 - __main__ - INFO - Starting DeerFlow API server on localhost:8000
INFO:     Will watch for changes in these directories: ['C:\\GitWorks\\deer-flow']
INFO:     Uvicorn running on http://localhost:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [10768] using StatReload
2025-07-23 16:45:52,982 - src.server.app - INFO - Allowed origins: ['http://localhost:3000']
INFO:     Started server process [17632]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:63049 - "GET /api/config HTTP/1.1" 200 OK
INFO:     127.0.0.1:63049 - "GET /api/config HTTP/1.1" 200 OK
INFO:     127.0.0.1:63173 - "OPTIONS /api/prompt/enhance HTTP/1.1" 200 OK
2025-07-23 16:48:01,512 - src.server.app - INFO - Enhancing prompt: 撰写XXXXX报告时间限定在2025年之后
2025-07-23 16:48:01,523 - src.prompt_enhancer.graph.enhancer_node - INFO - Enhancing user prompt...
2025-07-23 16:48:20,691 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 200 OK"
2025-07-23 16:48:20,719 - src.prompt_enhancer.graph.enhancer_node - INFO - Prompt enhancement completed successfully
INFO:     127.0.0.1:63173 - "POST /api/prompt/enhance HTTP/1.1" 200 OK
INFO:     127.0.0.1:63174 - "OPTIONS /api/chat/stream HTTP/1.1" 200 OK
INFO:     127.0.0.1:63174 - "POST /api/chat/stream HTTP/1.1" 200 OK
2025-07-23 16:48:26,731 - src.config.configuration - INFO - Recursion limit set to: 30
2025-07-23 16:48:26,735 - src.graph.nodes - INFO - Coordinator talking.
2025-07-23 16:48:27,916 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 200 OK"
2025-07-23 16:48:34,863 - src.graph.nodes - INFO - background investigation node is running.
2025-07-23 16:48:35,866 - src.graph.nodes - ERROR - Tavily search returned malformed response: HTTPError('400 Client Error: Bad Request for url: https://api.tavily.com/search')
2025-07-23 16:48:35,868 - src.graph.nodes - INFO - Planner generating full plan
2025-07-23 16:48:37,673 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 200 OK"
2025-07-23 16:48:54,312 - src.graph.nodes - INFO - Planner response: {
  "locale": "zh-CN",
  "has_enough_context": false,
  "thought": "用户要求撰写一份题为《XXXXXXXXXXXX》的深度研究报告,XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX。",
  "title": "XXXXX研究计划",
  "steps": [
    {
      "need_search": true,
      "title": "XXXXXXXXXXXXXXXXXX",
      "description": "XXXXXXXXXXXXXXXX",
      "step_type": "research"
    },
    {
      "need_search": true,
      "title": "XXXXXXXXXXXXXXX",
      "description": "XXXXXXXXXXXXXXXXXXXX",
      "step_type": "research"
    },
    {
      "need_search": true,
      "title": "XXXXXXXXXXXXXXX",
      "description": "XXXXXXXXXXXXXXXX",
      "step_type": "research"
    },
    {
      "need_search": true,
      "title": "XXXXXXXXXXXX",
      "description": "XXXXXXXXXXXXXXXXXX",
      "step_type": "research"
    }
  ]
}
2025-07-23 16:48:54,322 - src.graph.nodes - INFO - Research team is collaborating on tasks.
2025-07-23 16:48:54,337 - src.graph.nodes - INFO - Researcher node is researching.
2025-07-23 16:48:54,344 - src.tools.search - INFO - Tavily search configuration loaded: include_domains=[], exclude_domains=[]
2025-07-23 16:48:54,345 - src.graph.nodes - INFO - Researcher tools: [LoggedTavilySearchResultsWithImages(name='web_search', include_raw_content=True, include_images=True, api_wrapper=EnhancedTavilySearchAPIWrapper(tavily_api_key=SecretStr('**********')), include_image_descriptions=True), StructuredTool(name='crawl_tool', description='Use this to crawl a url and get a readable content in markdown format.', args_schema=<class 'langchain_core.utils.pydantic.crawl_tool'>, func=<function crawl_tool at 0x000001FEE8DA1DA0>)]
2025-07-23 16:48:54,368 - src.graph.nodes - INFO - Executing step: XXXXXXXXXX, agent: researcher
2025-07-23 16:48:54,368 - src.graph.nodes - INFO - Recursion limit set to: 30
2025-07-23 16:48:54,369 - src.graph.nodes - INFO - Agent input: {'messages': [HumanMessage(content='# Research Topic\n\nXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX\n\n## Locale\n\nzh-CN', additional_kwargs={}, response_metadata={}), HumanMessage(content='IMPORTANT: DO NOT include inline citations in the text. Instead, track all sources and include a References section at the end using link reference format. Include an empty line between each citation for better readability. Use this format for each reference:\n- [Source Title](URL)\n\n- [Another Source](URL)', additional_kwargs={}, response_metadata={}, name='system')]}
2025-07-23 16:48:56,373 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 200 OK"
2025-07-23 16:49:11,696 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 400 Bad Request"
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 403, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\applications.py", line 112, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\middleware\errors.py", line 187, in __call__
    raise exc
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\middleware\errors.py", line 165, in __call__
    await self.app(scope, receive, _send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\middleware\cors.py", line 93, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\middleware\cors.py", line 144, in simple_response
    await self.app(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\routing.py", line 714, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\routing.py", line 734, in app
    await route.handle(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\routing.py", line 76, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\routing.py", line 74, in app
    await response(scope, receive, send)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\responses.py", line 262, in __call__
    with collapse_excgroups():
         ^^^^^^^^^^^^^^^^^^^^
  File "E:\Anaconda\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\_utils.py", line 82, in collapse_excgroups
    raise exc
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\responses.py", line 266, in wrap
    await func()
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\starlette\responses.py", line 246, in stream_response
    async for chunk in self.body_iterator:
  File "C:\GitWorks\deer-flow\src\server\app.py", line 143, in _astream_workflow_generator
    async for agent, _, event_data in graph.astream(
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\pregel\__init__.py", line 2759, in astream
    async for _ in runner.atick(
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\pregel\runner.py", line 392, in atick
    _panic_or_proceed(
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\pregel\runner.py", line 499, in _panic_or_proceed
    raise exc
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\pregel\retry.py", line 128, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\utils\runnable.py", line 672, in ainvoke
    input = await asyncio.create_task(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\utils\runnable.py", line 440, in ainvoke
    ret = await self.afunc(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\src\graph\nodes.py", line 488, in researcher_node
    return await _setup_and_execute_agent_step(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\src\graph\nodes.py", line 474, in _setup_and_execute_agent_step
    return await _execute_agent_step(state, agent, agent_type)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\src\graph\nodes.py", line 392, in _execute_agent_step
    result = await agent.ainvoke(
             ^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\pregel\__init__.py", line 2892, in ainvoke
    async for chunk in self.astream(
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\pregel\__init__.py", line 2759, in astream
    async for _ in runner.atick(
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\pregel\runner.py", line 283, in atick
    await arun_with_retry(
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\pregel\retry.py", line 128, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\utils\runnable.py", line 672, in ainvoke
    input = await asyncio.create_task(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\utils\runnable.py", line 431, in ainvoke
    ret = await asyncio.create_task(coro, context=context)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langgraph\prebuilt\chat_agent_executor.py", line 763, in acall_model
    response = cast(AIMessage, await model_runnable.ainvoke(state, config))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 3089, in ainvoke
    input_ = await coro_with_context(part(), context, create_task=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 5444, in ainvoke
    return await self.bound.ainvoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 394, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 968, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 926, in agenerate
    raise exceptions[0]
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1083, in _agenerate_with_cache
    async for chunk in self._astream(messages, stop=stop, **kwargs):
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langchain_openai\chat_models\base.py", line 2493, in _astream
    async for chunk in super()._astream(*args, **kwargs):
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\langchain_openai\chat_models\base.py", line 1111, in _astream
    response = await self.async_client.create(**payload)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\openai\resources\chat\completions\completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\openai\_base_client.py", line 1748, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\GitWorks\deer-flow\.venv\Lib\site-packages\openai\_base_client.py", line 1555, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'code': 'invalid_parameter_error', 'param': None, 'message': 'Input error. Field required: input.messages.3.content', 'type': 'invalid_request_error'}, 'id': 'chatcmpl-205b18d2-918e-9fdb-a7fa-5f43c85325eb', 'request_id': '205b18d2-918e-9fdb-a7fa-5f43c85325eb'}
During task with name 'agent' and id '2cdbce90-7858-6280-2add-081004ebbdfc'
During task with name 'researcher' and id '89dc939d-dec0-77f7-1232-ec57ccee3119'

Could you take a look at the error messages and help me resolve this issue? Thanks a lot!

emocat17 · Jul 23 '25

The same error also occurs when deploying with Docker.

emocat17 · Jul 23 '25

> The same error also occurs when deploying with Docker.

Could you check whether your Docker container logged an automatic restart at the time this error occurred? When I manually closed the UI page today, it also reported ERROR: Exception in ASGI application, so I suspect the error is caused by an abnormal interruption. I haven't read the code carefully, and honestly I may not fully understand it, but Docker containers do tend to restart automatically when the program crashes.
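
To check that, a short docker-py snippet can print the restart count that Docker tracks per container (a minimal sketch; the container name `deer-flow` is an assumption):

```python
# Hypothetical check (container name assumed to be "deer-flow"):
# Docker records how many times it has auto-restarted a container,
# exposed as "RestartCount" in `docker inspect` output.
import docker

client = docker.from_env()
container = client.containers.get("deer-flow")
print("RestartCount:", container.attrs["RestartCount"])
print("Status:", container.status)
```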

Jessy-yu · Jul 23 '25

On my side this doesn't look like an abnormal interruption. It seems Tavily failed to return content properly, so the structured Tavily response message that the graph expects is missing:

ERROR - Tavily search returned malformed response: HTTPError('400 Client Error: Bad Request for url: https://api.tavily.com/search')
2025-07-23 16:48:35,868 - src.graph.nodes - INFO - Planner generating full plan

and

2025-07-23 16:48:54,368 - src.graph.nodes - INFO - Executing step: XXXXXXXXXX, agent: researcher
2025-07-23 16:48:54,368 - src.graph.nodes - INFO - Recursion limit set to: 30
2025-07-23 16:48:54,369 - src.graph.nodes - INFO - Agent input: {'messages': [HumanMessage(content='# Research Topic\n\nXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX\n\n## Locale\n\nzh-CN', additional_kwargs={}, response_metadata={}), HumanMessage(content='IMPORTANT: DO NOT include inline citations in the text. Instead, track all sources and include a References section at the end using link reference format. Include an empty line between each citation for better readability. Use this format for each reference:\n- [Source Title](URL)\n\n- [Another Source](URL)', additional_kwargs={}, response_metadata={}, name='system')]}
2025-07-23 16:48:56,373 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 200 OK"
2025-07-23 16:49:11,696 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 400 Bad Request"
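
If the researcher step then carries a message whose `content` ended up empty after the failed search, an OpenAI-compatible endpoint like DashScope will reject the request exactly as above (`Field required: input.messages.3.content`). A minimal defensive sketch (a hypothetical helper, not deer-flow code) that patches empty content before the agent re-invokes the model:

```python
# Hypothetical workaround sketch: replace empty message content with a
# placeholder before the request is sent, since DashScope rejects
# messages whose "content" field is missing or empty.
from langchain_core.messages import BaseMessage


def sanitize_messages(messages: list[BaseMessage]) -> list[BaseMessage]:
    for msg in messages:
        if not msg.content:  # None, "" or []
            msg.content = "(search tool returned no content)"
    return messages
```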

However, search results have already appeared on the right side of the web UI:

Image

It stays stuck on the screen shown in the image;

A version from about a month ago still ran fine; after pulling the latest code it no longer works.

TAVILY_API_KEY is also configured correctly:

Image
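
One way to rule out the key and account status is to hit the Tavily endpoint directly, outside the app (a minimal sketch; the JSON body follows Tavily's documented search API, and the query string is arbitrary):

```python
# Hypothetical standalone check: reproduce the 400 outside deer-flow.
# The body of a 400 response usually names the offending field or the
# account problem (invalid key, exhausted quota, etc.).
import os

import requests

resp = requests.post(
    "https://api.tavily.com/search",
    json={
        "api_key": os.environ["TAVILY_API_KEY"],
        "query": "test query",
        "max_results": 3,
    },
    timeout=30,
)
print(resp.status_code)
print(resp.text)
```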

emocat17 · Jul 23 '25

src.graph.nodes - ERROR - Tavily search returned malformed response: HTTPError('400 Client Error: Bad Request for url: https://api.tavily.com/search')

emocat17 · Aug 01 '25

Even when Tavily itself works, the error still occurs:

2025-08-03 18:01:03,963 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 200 OK"
2025-08-03 18:01:19,530 - httpx - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 400 Bad Request"
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/applications.py", line 112, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 187, in __call__
    raise exc
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 165, in __call__
    await self.app(scope, receive, _send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/middleware/cors.py", line 93, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/middleware/cors.py", line 144, in simple_response
    await self.app(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/routing.py", line 714, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in app
    await route.handle(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/routing.py", line 76, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/routing.py", line 74, in app
    await response(scope, receive, send)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/responses.py", line 262, in __call__
    with collapse_excgroups():
  File "/root/anaconda3/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
    raise exc
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/responses.py", line 266, in wrap
    await func()
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/starlette/responses.py", line 246, in stream_response
    async for chunk in self.body_iterator:
  File "/home/deer-flow/src/server/app.py", line 143, in _astream_workflow_generator
    async for agent, _, event_data in graph.astream(
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langgraph/pregel/__init__.py", line 2759, in astream
    async for _ in runner.atick(
  File "/home/deer-flow/src/graph/nodes.py", line 488, in researcher_node
    return await _setup_and_execute_agent_step(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/src/graph/nodes.py", line 474, in _setup_and_execute_agent_step
    return await _execute_agent_step(state, agent, agent_type)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/src/graph/nodes.py", line 392, in _execute_agent_step
    result = await agent.ainvoke(
             ^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langgraph/pregel/__init__.py", line 2892, in ainvoke
    async for chunk in self.astream(
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langgraph/pregel/__init__.py", line 2759, in astream
    async for _ in runner.atick(
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langgraph/prebuilt/chat_agent_executor.py", line 763, in acall_model
    response = cast(AIMessage, await model_runnable.ainvoke(state, config))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 3089, in ainvoke
    input_ = await coro_with_context(part(), context, create_task=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 5444, in ainvoke
    return await self.bound.ainvoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 394, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 968, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 926, in agenerate
    raise exceptions[0]
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1083, in _agenerate_with_cache
    async for chunk in self._astream(messages, stop=stop, **kwargs):
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 2493, in _astream
    async for chunk in super()._astream(*args, **kwargs):
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 1111, in _astream
    response = await self.async_client.create(**payload)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1748, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/deer-flow/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1555, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'code': 'invalid_parameter_error', 'param': None, 'message': '<400> InternalError.Algo.InvalidParameter: Range of input length should be [1, 129024]', 'type': 'invalid_request_error'}, 'id': 'chatcmpl-fba3cc95-cf21-9c4b-a468-84c11e739e29', 'request_id': 'fba3cc95-cf21-9c4b-a468-84c11e739e29'}
During task with name 'agent' and id '3cbeca8c-a8b3-61b1-b1d1-611c4d1ca7ab'
During task with name 'researcher' and id 'd09c6dec-3868-d09a-0c60-4fb44a878976'
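
This second 400 (`Range of input length should be [1, 129024]`) means the accumulated context sent to DashScope exceeded the model's input limit, likely because raw search/crawl results were fed back unbounded. A minimal mitigation sketch (a hypothetical helper, not deer-flow code; the character budget is an assumption, and the real limit may be counted in tokens):

```python
# Hypothetical mitigation sketch: cap oversized message content before it
# is sent back to the model. 100k chars is an assumed budget chosen to
# stay under DashScope's reported [1, 129024] input limit.
from langchain_core.messages import BaseMessage

MAX_CHARS = 100_000


def truncate_messages(messages: list[BaseMessage]) -> list[BaseMessage]:
    for msg in messages:
        if isinstance(msg.content, str) and len(msg.content) > MAX_CHARS:
            msg.content = msg.content[:MAX_CHARS] + "\n...[truncated]"
    return messages
```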

emocat17 · Aug 03 '25