[Bug]: Exceptions from Trio nursery (1 sub-exception) -- 'NoneType' object is not callable
Self Checks
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (Language Policy).
- [x] Non-English title submissions will be closed directly (Language Policy).
- [x] Please do not modify this template :) and fill in all the required fields.
RAGFlow workspace code commit ID
baf3b9be
RAGFlow image version
v0.17.2 full
Other environment information
MacOS M4 max
Actual behavior
The GraphRAG (Light) task is stuck, and the error is as follows:
2025-03-14 01:59:09,185 INFO 69079 set_progress(06d25baa003411f0ad676915f3ccf443), progress: -1, progress_msg: 01:59:09 [ERROR][Exception]: Exceptions from Trio nursery (1 sub-exception) -- 'NoneType' object is not callable
2025-03-14 01:59:09,185 ERROR 69079 handle_task got exception for task {"id": "06d25baa003411f0ad676915f3ccf443", "doc_id": "f4b39a06003311f0ad676915f3ccf443", "from_page": 100000000, "to_page": 100000000, "retry_count": 1, "kb_id": "253b4fc8002711f09a7fab44b9a86a2a", "parser_id": "paper", "parser_config": {"layout_recognize": "DeepDOC", "auto_keywords": 0, "auto_questions": 0, "raptor": {"use_raptor": false}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "event", "category"], "method": "light", "resolution": true, "community": true}}, "name": "\u300a\u5143\u520a\u6742\u5267\u4e09\u5341\u79cd\u300b\u5386\u53f2\u82f1\u96c4\u60b2\u5267\u7814\u7a76\u7efc\u8ff0_\u5468\u68a6\u7f18.pdf", "type": "pdf", "location": "\u300a\u5143\u520a\u6742\u5267\u4e09\u5341\u79cd\u300b\u5386\u53f2\u82f1\u96c4\u60b2\u5267\u7814\u7a76\u7efc\u8ff0_\u5468\u68a6\u7f18.pdf", "size": 479756, "tenant_id": "f018a1eec20811efaafa6a224012c556", "language": "Chinese", "embd_id": "text-embedding-v3@Tongyi-Qianwen", "pagerank": 0, "kb_parser_config": {"layout_recognize": "DeepDOC", "auto_keywords": 0, "auto_questions": 0, "raptor": {"use_raptor": false}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "event", "category"], "method": "light", "resolution": true, "community": true}}, "img2txt_id": "qwen-vl-plus@Tongyi-Qianwen", "asr_id": "paraformer-realtime-8k-v1@Tongyi-Qianwen", "llm_id": "qwen-max@Tongyi-Qianwen", "update_time": 1741888390140, "task_type": "graphrag"}
Traceback (most recent call last):
  File "/Users/zyy/Documents/dev_project/ragflow/rag/svr/task_executor.py", line 594, in handle_task
    await do_handle_task(task)
  File "/Users/zyy/Documents/dev_project/ragflow/rag/svr/task_executor.py", line 522, in do_handle_task
    await run_graphrag(task, task_language, with_resolution, with_community, chat_model, embedding_model, progress_callback)
  File "/Users/zyy/Documents/dev_project/ragflow/graphrag/general/index.py", line 75, in run_graphrag
    graph, doc_ids = await update_graph(
  File "/Users/zyy/Documents/dev_project/ragflow/graphrag/general/index.py", line 146, in update_graph
    ents, rels = await ext(doc_id, chunks, callback)
  File "/Users/zyy/Documents/dev_project/ragflow/graphrag/general/extractor.py", line 103, in __call__
    async with trio.open_nursery() as nursery:
  File "/Users/zyy/Documents/dev_project/ragflow/.venv/lib/python3.10/site-packages/trio/_core/_run.py", line 1058, in __aexit__
    raise combined_error_from_nursery
exceptiongroup.ExceptionGroup: Exceptions from Trio nursery (1 sub-exception)
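For context on the error message: when any child task started inside a trio nursery fails, trio re-raises the failure wrapped in an exceptiongroup.ExceptionGroup, so the progress message only shows the group summary ("Exceptions from Trio nursery (1 sub-exception)") while the real cause (here a TypeError: 'NoneType' object is not callable) is carried as a sub-exception. A minimal, self-contained sketch that reproduces the same error shape (illustrative only, not RAGFlow code):

```python
# Minimal reproduction of the error *shape* only; this is not RAGFlow code.
import trio

async def broken_child() -> None:
    callback = None
    callback()  # raises TypeError: 'NoneType' object is not callable

async def main() -> None:
    # strict_exception_groups=True matches newer trio defaults: even a single
    # failure is wrapped in an ExceptionGroup.
    async with trio.open_nursery(strict_exception_groups=True) as nursery:
        nursery.start_soon(broken_child)

try:
    trio.run(main)
except Exception as exc:
    print(exc)                  # Exceptions from Trio nursery (1 sub-exception)
    for sub in exc.exceptions:  # the real cause lives here
        print(repr(sub))        # TypeError("'NoneType' object is not callable")
```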
Expected behavior
No response
Steps to reproduce
1. Configure the document task: enable the knowledge graph with the Light method, and turn on Entity resolution & Community report generation.
2. Start the parsing task.
Additional information
It seems that the LLM's response is getting stuck. The current system model configuration is as follows:
2025-03-14 02:21:44,011 INFO 73328 set_progress(8e9d4a92003711f0ad676915f3ccf443), progress: 0.5857142857142857, progress_msg: 02:21:44 Entities extraction of chunk 1 6/7 done, 21 nodes, 22 edges, 18866 tokens.
2025-03-14 02:21:55,373 INFO 73328 task_consumer_0 reported heartbeat: {"name": "task_consumer_0", "now": "2025-03-14T02:21:55.368+08:00", "boot_at": "2025-03-14T02:17:54.389+08:00", "pending": 2, "lag": 0, "done": 0, "failed": 0, "current": {"552965b6003711f0ad676915f3ccf443": {"id": "552965b6003711f0ad676915f3ccf443", "doc_id": "397b844c002911f09a7fab44b9a86a2a", "from_page": 100000000, "to_page": 100000000, "retry_count": 1, "kb_id": "253b4fc8002711f09a7fab44b9a86a2a", "parser_id": "paper", "parser_config": {"layout_recognize": "DeepDOC", "auto_keywords": 0, "auto_questions": 0, "raptor": {"use_raptor": false}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "event", "category"], "method": "light", "resolution": true, "community": true}}, "name": "\u300a\u6742\u5267\u4e09\u96c6\u300b\u8f91\u520a\u53ca\u7248\u672c\u6d41\u53d8\u8003\u8bba.pdf", "type": "pdf", "location": "\u300a\u6742\u5267\u4e09\u96c6\u300b\u8f91\u520a\u53ca\u7248\u672c\u6d41\u53d8\u8003\u8bba.pdf", "size": 1961155, "tenant_id": "f018a1eec20811efaafa6a224012c556", "language": "Chinese", "embd_id": "text-embedding-v3@Tongyi-Qianwen", "pagerank": 0, "kb_parser_config": {"layout_recognize": "DeepDOC", "auto_keywords": 0, "auto_questions": 0, "raptor": {"use_raptor": false}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "event", "category"], "method": "light", "resolution": true, "community": true}}, "img2txt_id": "qwen-vl-plus@Tongyi-Qianwen", "asr_id": "paraformer-realtime-8k-v1@Tongyi-Qianwen", "llm_id": "qwen-max@Tongyi-Qianwen", "update_time": 1741889810062, "task_type": "graphrag"}, "8e9d4a92003711f0ad676915f3ccf443": {"id": "8e9d4a92003711f0ad676915f3ccf443", "doc_id": "397b844c002911f09a7fab44b9a86a2a", "from_page": 100000000, "to_page": 100000000, "retry_count": 0, "kb_id": "253b4fc8002711f09a7fab44b9a86a2a", "parser_id": "paper", "parser_config": {"layout_recognize": "DeepDOC", "auto_keywords": 0, "auto_questions": 0, "raptor": {"use_raptor": false}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "event", "category"], "method": "light", "resolution": true, "community": true}}, "name": "\u300a\u6742\u5267\u4e09\u96c6\u300b\u8f91\u520a\u53ca\u7248\u672c\u6d41\u53d8\u8003\u8bba.pdf", "type": "pdf", "location": "\u300a\u6742\u5267\u4e09\u96c6\u300b\u8f91\u520a\u53ca\u7248\u672c\u6d41\u53d8\u8003\u8bba.pdf", "size": 1961155, "tenant_id": "f018a1eec20811efaafa6a224012c556", "language": "Chinese", "embd_id": "text-embedding-v3@Tongyi-Qianwen", "pagerank": 0, "kb_parser_config": {"layout_recognize": "DeepDOC", "auto_keywords": 0, "auto_questions": 0, "raptor": {"use_raptor": false}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "event", "category"], "method": "light", "resolution": true, "community": true}}, "img2txt_id": "qwen-vl-plus@Tongyi-Qianwen", "asr_id": "paraformer-realtime-8k-v1@Tongyi-Qianwen", "llm_id": "qwen-max@Tongyi-Qianwen", "update_time": 1741889906438, "task_type": "graphrag"}}}
2025-03-14 02:21:56,308 WARNING 73328 set_progress(552965b6003711f0ad676915f3ccf443) got exception DoesNotExist
2025-03-14 02:21:56,308 ERROR 73328 handle_task got exception for task {"id": "552965b6003711f0ad676915f3ccf443", "doc_id": "397b844c002911f09a7fab44b9a86a2a", "from_page": 100000000, "to_page": 100000000, "retry_count": 1, "kb_id": "253b4fc8002711f09a7fab44b9a86a2a", "parser_id": "paper", "parser_config": {"layout_recognize": "DeepDOC", "auto_keywords": 0, "auto_questions": 0, "raptor": {"use_raptor": false}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "event", "category"], "method": "light", "resolution": true, "community": true}}, "name": "\u300a\u6742\u5267\u4e09\u96c6\u300b\u8f91\u520a\u53ca\u7248\u672c\u6d41\u53d8\u8003\u8bba.pdf", "type": "pdf", "location": "\u300a\u6742\u5267\u4e09\u96c6\u300b\u8f91\u520a\u53ca\u7248\u672c\u6d41\u53d8\u8003\u8bba.pdf", "size": 1961155, "tenant_id": "f018a1eec20811efaafa6a224012c556", "language": "Chinese", "embd_id": "text-embedding-v3@Tongyi-Qianwen", "pagerank": 0, "kb_parser_config": {"layout_recognize": "DeepDOC", "auto_keywords": 0, "auto_questions": 0, "raptor": {"use_raptor": false}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "event", "category"], "method": "light", "resolution": true, "community": true}}, "img2txt_id": "qwen-vl-plus@Tongyi-Qianwen", "asr_id": "paraformer-realtime-8k-v1@Tongyi-Qianwen", "llm_id": "qwen-max@Tongyi-Qianwen", "update_time": 1741889810062, "task_type": "graphrag"}
Traceback (most recent call last):
  File "/Users/zyy/Documents/dev_project/ragflow/rag/svr/task_executor.py", line 594, in handle_task
    await do_handle_task(task)
  File "/Users/zyy/Documents/dev_project/ragflow/rag/svr/task_executor.py", line 522, in do_handle_task
    await run_graphrag(task, task_language, with_resolution, with_community, chat_model, embedding_model, progress_callback)
  File "/Users/zyy/Documents/dev_project/ragflow/graphrag/general/index.py", line 75, in run_graphrag
    graph, doc_ids = await update_graph(
  File "/Users/zyy/Documents/dev_project/ragflow/graphrag/general/index.py", line 146, in update_graph
    ents, rels = await ext(doc_id, chunks, callback)
  File "/Users/zyy/Documents/dev_project/ragflow/graphrag/general/extractor.py", line 103, in __call__
    async with trio.open_nursery() as nursery:
  File "/Users/zyy/Documents/dev_project/ragflow/.venv/lib/python3.10/site-packages/trio/_core/_run.py", line 1058, in __aexit__
    raise combined_error_from_nursery
exceptiongroup.ExceptionGroup: Exceptions from Trio nursery (2 sub-exceptions)
After trying again, it looks normal. It's puzzling.
I also encountered the same problem.
@biofer In fact, we don't seem to have the same problem.
I'm encountering errors like "Exceptions from Trio nursery (5 sub-exceptions)" and "(3 sub-exceptions)". The errors seem to be caused by using an external API. Could it be a timeout in the external API connection? I have configured load balancing for the API across more than 10 accounts, and I'm only parsing one document of 9,974 characters (including Chinese characters and letters).
@TeslaZY @biofer Please paste the whole exception traceback (including the sub-exceptions) from the log file. What you posted contains only the top-level exception.
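A note for anyone who needs to do that: the sub-exceptions are attached to the group object itself, and the task executor log already renders them via the exceptiongroup traceback formatting. A small sketch (assuming Python 3.10 with the exceptiongroup backport that trio pulls in; log_exception_tree is a hypothetical helper, not RAGFlow's logger):

```python
# Sketch: surfacing the sub-exceptions hidden behind
# "Exceptions from Trio nursery (N sub-exceptions)" when logging manually.
import traceback

def log_exception_tree(exc: BaseException) -> None:
    # On Python 3.11+ (or 3.10 with the exceptiongroup backport),
    # traceback.format_exception renders the whole group tree, which is the
    # "+ Exception Group Traceback" layout seen in the task executor log.
    print("".join(traceback.format_exception(exc)))
    # Or walk the group manually:
    for i, sub in enumerate(getattr(exc, "exceptions", []), start=1):
        print(f"sub-exception {i}: {sub!r}")
```

In other words, the full cause chain is already in the task executor's log even when the progress message only shows the one-line summary.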
@yuzhichang I have the same bug; the log is as follows:
ragflow-server | 2025-03-14 13:23:36,523 INFO 29 set_progress(901eefb8008e11f0bdb82219abc1305d), progress: -1, progress_msg: 13:23:36 [ERROR][Exception]: Exceptions from Trio nursery (3 sub-exceptions) -- **ERROR**: Request timed out.
ragflow-server | 2025-03-14 13:23:36,524 ERROR 29 handle_task got exception for task {"id": "901eefb8008e11f0bdb82219abc1305d", "doc_id": "53edf222007611f0a5852219abc1305d", "from_page": 100000000, "to_page": 100000000, "retry_count": 0, "kb_id": "9a7c2202ffd811ef902fa6f607d3ab1a", "parser_id": "naive", "parser_config": {"layout_recognize": "DeepDOC", "chunk_token_num": 128, "delimiter": "##", "auto_keywords": 3, "auto_questions": 3, "html4excel": true, "raptor": {"use_raptor": true, "prompt": "\u8bf7\u603b\u7ed3\u4ee5\u4e0b\u6bb5\u843d\u3002 \u5c0f\u5fc3\u6570\u5b57\uff0c\u4e0d\u8981\u7f16\u9020\u3002 \u6bb5\u843d\u5982\u4e0b\uff1a\n {cluster_content}\n\u4ee5\u4e0a\u5c31\u662f\u4f60\u9700\u8981\u603b\u7ed3\u7684\u5185\u5bb9\u3002", "max_token": 256, "threshold": 0.15, "max_cluster": 64, "random_seed": 1538}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "geo", "event", "category"], "method": "light", "resolution": true, "community": true}}, "name": "xxx.txt", "type": "doc", "location": "xxx.txt", "size": 27711, "tenant_id": "1a544532f24f11ef9fc3a6f607d3ab1a", "language": "Chinese", "embd_id": "gte_Qwen2-7B-instruct___OpenAI-API@OpenAI-API-Compatible", "pagerank": 1, "kb_parser_config": {"layout_recognize": "DeepDOC", "chunk_token_num": 128, "delimiter": "##", "auto_keywords": 3, "auto_questions": 3, "html4excel": true, "raptor": {"use_raptor": true, "prompt": "\u8bf7\u603b\u7ed3\u4ee5\u4e0b\u6bb5\u843d\u3002 \u5c0f\u5fc3\u6570\u5b57\uff0c\u4e0d\u8981\u7f16\u9020\u3002 \u6bb5\u843d\u5982\u4e0b\uff1a\n {cluster_content}\n\u4ee5\u4e0a\u5c31\u662f\u4f60\u9700\u8981\u603b\u7ed3\u7684\u5185\u5bb9\u3002", "max_token": 256, "threshold": 0.15, "max_cluster": 64, "random_seed": 1538}, "graphrag": {"use_graphrag": true, "entity_types": ["organization", "person", "geo", "event", "category"], "method": "light", "resolution": true, "community": true}}, "img2txt_id": "", "asr_id": "", "llm_id": "Qwen2.5-7B-Instruct___OpenAI-API@OpenAI-API-Compatible", "update_time": 1741927275182, "task_type": "graphrag"}
ragflow-server | + Exception Group Traceback (most recent call last):
ragflow-server | | File "/ragflow/rag/svr/task_executor.py", line 594, in handle_task
ragflow-server | | await do_handle_task(task)
ragflow-server | | File "/ragflow/rag/svr/task_executor.py", line 522, in do_handle_task
ragflow-server | | await run_graphrag(task, task_language, with_resolution, with_community, chat_model, embedding_model, progress_callback)
ragflow-server | | File "/ragflow/graphrag/general/index.py", line 94, in run_graphrag
ragflow-server | | await resolve_entities(
ragflow-server | | File "/ragflow/graphrag/general/index.py", line 231, in resolve_entities
ragflow-server | | reso = await er(graph, callback=callback)
ragflow-server | | File "/ragflow/graphrag/entity_resolution.py", line 101, in __call__
ragflow-server | | async with trio.open_nursery() as nursery:
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_core/_run.py", line 1058, in __aexit__
ragflow-server | | raise combined_error_from_nursery
ragflow-server | | exceptiongroup.ExceptionGroup: Exceptions from Trio nursery (3 sub-exceptions)
ragflow-server | +-+---------------- 1 ----------------
ragflow-server | | Traceback (most recent call last):
ragflow-server | | File "/ragflow/graphrag/entity_resolution.py", line 176, in _resolve_candidate
ragflow-server | | response = await trio.to_thread.run_sync(lambda: self._chat(text, [{"role": "user", "content": "Output:"}], gen_conf))
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 447, in to_thread_run_sync
ragflow-server | | return msg_from_thread.unwrap()
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/outcome/_impl.py", line 213, in unwrap
ragflow-server | | raise captured_error
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 373, in do_release_then_return_result
ragflow-server | | return result.unwrap()
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/outcome/_impl.py", line 213, in unwrap
ragflow-server | | raise captured_error
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 392, in worker_fn
ragflow-server | | ret = context.run(sync_fn, *args)
ragflow-server | | File "/ragflow/graphrag/entity_resolution.py", line 176, in <lambda>
ragflow-server | | response = await trio.to_thread.run_sync(lambda: self._chat(text, [{"role": "user", "content": "Output:"}], gen_conf))
ragflow-server | | File "/ragflow/graphrag/general/extractor.py", line 66, in _chat
ragflow-server | | raise Exception(response)
ragflow-server | | Exception: **ERROR**: Request timed out.
ragflow-server | +---------------- 2 ----------------
ragflow-server | | Traceback (most recent call last):
ragflow-server | | File "/ragflow/graphrag/entity_resolution.py", line 176, in _resolve_candidate
ragflow-server | | response = await trio.to_thread.run_sync(lambda: self._chat(text, [{"role": "user", "content": "Output:"}], gen_conf))
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 447, in to_thread_run_sync
ragflow-server | | return msg_from_thread.unwrap()
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/outcome/_impl.py", line 213, in unwrap
ragflow-server | | raise captured_error
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 373, in do_release_then_return_result
ragflow-server | | return result.unwrap()
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/outcome/_impl.py", line 213, in unwrap
ragflow-server | | raise captured_error
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 392, in worker_fn
ragflow-server | | ret = context.run(sync_fn, *args)
ragflow-server | | File "/ragflow/graphrag/entity_resolution.py", line 176, in <lambda>
ragflow-server | | response = await trio.to_thread.run_sync(lambda: self._chat(text, [{"role": "user", "content": "Output:"}], gen_conf))
ragflow-server | | File "/ragflow/graphrag/general/extractor.py", line 66, in _chat
ragflow-server | | raise Exception(response)
ragflow-server | | Exception: **ERROR**: Request timed out.
ragflow-server | +---------------- 3 ----------------
ragflow-server | | Traceback (most recent call last):
ragflow-server | | File "/ragflow/graphrag/entity_resolution.py", line 176, in _resolve_candidate
ragflow-server | | response = await trio.to_thread.run_sync(lambda: self._chat(text, [{"role": "user", "content": "Output:"}], gen_conf))
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 447, in to_thread_run_sync
ragflow-server | | return msg_from_thread.unwrap()
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/outcome/_impl.py", line 213, in unwrap
ragflow-server | | raise captured_error
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 373, in do_release_then_return_result
ragflow-server | | return result.unwrap()
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/outcome/_impl.py", line 213, in unwrap
ragflow-server | | raise captured_error
ragflow-server | | File "/ragflow/.venv/lib/python3.10/site-packages/trio/_threads.py", line 392, in worker_fn
ragflow-server | | ret = context.run(sync_fn, *args)
ragflow-server | | File "/ragflow/graphrag/entity_resolution.py", line 176, in <lambda>
ragflow-server | | response = await trio.to_thread.run_sync(lambda: self._chat(text, [{"role": "user", "content": "Output:"}], gen_conf))
ragflow-server | | File "/ragflow/graphrag/general/extractor.py", line 66, in _chat
ragflow-server | | raise Exception(response)
ragflow-server | | Exception: **ERROR**: Request timed out.
ragflow-server | +------------------------------------
How do I configure the timeout?
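For context, the "**ERROR**: Request timed out." string is the chat-model wrapper reporting that the OpenAI-compatible client gave up waiting on the backend. As a rough illustration of where such a client-side timeout usually lives (this is the generic `openai` client API, not a documented RAGFlow setting; the base URL, key, and model name below are placeholders):

```python
# Rough illustration only: the generic OpenAI-compatible client timeout,
# NOT a RAGFlow configuration option. All values below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local OpenAI-compatible server
    api_key="not-needed",                  # placeholder for servers that ignore the key
    timeout=300.0,                         # seconds before the client reports a timeout
    max_retries=2,                         # retry transient failures instead of failing the task
)

resp = client.chat.completions.create(
    model="Qwen2.5-7B-Instruct",           # placeholder model name
    messages=[{"role": "user", "content": "Output:"}],
)
print(resp.choices[0].message.content)
```

If the backend simply cannot keep up, a larger timeout only delays the failure; reducing concurrency or using a faster model tends to help more.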
Sorry, I was running and debugging the local source code directly. I have now located the log; if the error reproduces again, I will upload it.
I'm using Gemma 3 12B and got the same error as the log posted above (Exceptions from Trio nursery (3 sub-exceptions) -- **ERROR**: Request timed out.).
I got the same error using Gemma 3 14B. Any update on this issue? @yuzhichang
@stevenguan08 The error log indicates that the LLM didn't reply in time. It's likely that the LLM service is overloaded. You can decrease MAX_CONCURRENT_CHATS (default 10) to relieve the load on it.
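To illustrate what that setting controls, here is a minimal sketch of gating concurrent LLM calls with a shared trio.CapacityLimiter (the names MAX_CONCURRENT_CHATS and chat_once below are illustrative, not necessarily RAGFlow's exact code):

```python
# Minimal sketch of capping concurrent LLM calls: at most N chat requests are
# in flight at once, so a slow backend is not flooded until it times out.
# Names are illustrative, not necessarily RAGFlow's exact code.
import os
import trio

MAX_CONCURRENT_CHATS = int(os.environ.get("MAX_CONCURRENT_CHATS", "10"))
chat_limiter = trio.CapacityLimiter(MAX_CONCURRENT_CHATS)

def chat_once(prompt: str) -> str:
    # Placeholder for the blocking LLM request that can raise
    # "**ERROR**: Request timed out." when the backend is overloaded.
    return f"echo: {prompt}"

async def limited_chat(prompt: str) -> None:
    async with chat_limiter:  # waits here once the cap is reached
        reply = await trio.to_thread.run_sync(chat_once, prompt)
        print(reply)

async def main() -> None:
    async with trio.open_nursery() as nursery:
        for i in range(50):
            nursery.start_soon(limited_chat, f"chunk {i}")

trio.run(main)
```

Lowering the cap trades throughput for fewer simultaneous requests; if timeouts persist even at very low values, the backend itself (hardware, context length) is usually the bottleneck.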
I believe this fix has been mentioned previously and it didn't work. I will try it once my parsing is done.
[Question]: The Extract knowledge graph General and Light Problem #5731
@stevenguan08 Please track your problem in a separate issue. Keep each GitHub issue tracking only one particular problem. If your problem is the same as an existing one, just give a thumbs-up to the original comment instead of commenting "me too".
@yuzhichang I use a local Ollama. I tried reducing the MAX_CONCURRENT_CHATS parameter to 6, but the same error occurs. If I reduce the value even more, the process freezes and progress stays below 1%.
@vab072 Your issue may have been fixed by #6340