
[BUG] unbearable speed

Open HeavySnowJakarta opened this issue 2 months ago • 11 comments

What version of eigent are you using?

0.0.71

System information

Windows 11 with the pre-built app (I did not compile it myself, as the build process fails)

all environments built-in

Problem description

Please forgive any emotional wording. Here is my prompt:

Produce a minimal report that consists only of the title LLM works properly. Deliverable: return a single plain-text line containing exactly LLM works properly with no additional text, no quotation marks, no leading/trailing spaces, and no extra lines.

A task like this usually takes seconds in a single LLM chat window. I understand that such a simple task may take a little longer for a stronger multi-agent system, but the cost and speed of the project on my computer seem truly unbearable. I am not asking it to do any web searches or call MCP servers; it is only expected to do the simplest reasoning and then output the report. That is all it has to do. It has now been working for more than 11 minutes and has spent over 19k tokens, yet it has left nothing in the folder that was supposed to contain the shortest possible report, in Markdown or any other format.

It also showed almost no information while working, apart from a few important steps. From the log file I can see that it was downloading things onto my computer and that some errors occurred, but I knew almost nothing about what was happening during generation. In any case, the work is unexpectedly slow and left no result. All I could do was wait, hoping it would show even the smallest dialog telling me the current status and progress; for a very long time, the only thing I saw after clicking the "start work" button was a few loading spinners.

I am sorry, but I have to say the experience feels really terrible to me, possibly because something is wrong with my environment. The speed is very slow, not only because the app is built on Electron, but also because it does a great deal of work during generation, whether reasonably or not.

Log

[info]  start check version { currentVersion: '0.0.71' }
[info]  version file not exist, will create new file
[info]  version changed, prepare to reinstall uv dependencies... {
  currentVersion: '0.0.71',
  savedVersion: 'none',
  reason: 'version file not exist'
}
[info]  version file updated { currentVersion: '0.0.71' }
[info] Running script at: C:\Users\bians\AppData\Local\Programs\Eigent\resources\app.asar\resources\scripts\install-uv.js
[info] Script output: Using uv version: 0.6.14

[info] Script output: Installing uv 0.6.14 for win32-x64...
[START] downloadUvBinary: https://github.com/astral-sh/uv/releases/download
Downloading uv 0.6.14 for win32-x64...
URL: https://github.com/astral-sh/uv/releases/download/0.6.14/uv-x86_64-pc-windows-msvc.zip

[error] Script error: Error installing uv for win32-x64: read ECONNRESET

[info] Script output: Removed empty directory: C:\Users\bians\.eigent\bin

[info] Script output: Downloading uv from gitcode.com
[START] downloadUvBinary: https://gitcode.com/CherryHQ/uv/releases/download

[info] Script output: Downloading uv 0.6.14 for win32-x64...
URL: https://gitcode.com/CherryHQ/uv/releases/download/0.6.14/uv-x86_64-pc-windows-msvc.zip

[info] Script output: Extracting uv-x86_64-pc-windows-msvc.zip to C:\Users\bians\.eigent\bin...

[info] Script output: Successfully installed uv 0.6.14 for win32-x64

[info] Script output: Downloading uv from gitcode.com #### true
Installation successful

[info] Script completed successfully
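In the log above, the uv download first fails on GitHub with `read ECONNRESET` and then succeeds via the gitcode.com mirror. The fallback behavior the installer appears to implement can be sketched like this (hypothetical names; this is not the real install-uv.js):

```javascript
// Try each mirror URL in order until one download succeeds; rethrow the
// last network error (e.g. read ECONNRESET, connect ETIMEDOUT) only when
// every mirror has failed.
async function downloadWithFallback(mirrors, fetchUrl) {
  let lastError;
  for (const url of mirrors) {
    try {
      return await fetchUrl(url);
    } catch (err) {
      lastError = err; // remember the failure and fall through to the next mirror
    }
  }
  throw lastError;
}
```

This matches what the log shows: the github.com URL throws a network error, and the gitcode.com URL is tried next and completes the install.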
[info] Running script at: C:\Users\bians\AppData\Local\Programs\Eigent\resources\app.asar\resources\scripts\install-bun.js
[info] Script output: Using bun version: 1.2.9

[info] Script output: Installing bun 1.2.9 for win32-x64 (baseline)...
Downloading bun 1.2.9 for win32-x64-baseline...
URL: https://github.com/oven-sh/bun/releases/download/bun-v1.2.9/bun-windows-x64-baseline.zip

[info] second-instance [
  'C:\\Users\\bians\\AppData\\Local\\Programs\\Eigent\\Eigent.exe',
  '--allow-file-access-from-files',
  'eigent://callback/?code=efd97f34847e016ad3c8684bc7d56d68dfd949c81991bd9fbd0c36c1f2f1ce4e'
]
[info] enter handleProtocolUrl eigent://callback/?code=efd97f34847e016ad3c8684bc7d56d68dfd949c81991bd9fbd0c36c1f2f1ce4e
[info] urlObj eigent://callback/?code=efd97f34847e016ad3c8684bc7d56d68dfd949c81991bd9fbd0c36c1f2f1ce4e
[info] code efd97f34847e016ad3c8684bc7d56d68dfd949c81991bd9fbd0c36c1f2f1ce4e
[info] share_token null
[info] urlObj.pathname /
[error] protocol code: efd97f34847e016ad3c8684bc7d56d68dfd949c81991bd9fbd0c36c1f2f1ce4e
[info]  start install dependencies...
[info] Running script at: C:\Users\bians\AppData\Local\Programs\Eigent\resources\app.asar\resources\scripts\install-bun.js
[info] Script output: Using bun version: 1.2.9

[info] Script output: Installing bun 1.2.9 for win32-x64 (baseline)...

[info] Script output: Downloading bun 1.2.9 for win32-x64-baseline...
URL: https://github.com/oven-sh/bun/releases/download/bun-v1.2.9/bun-windows-x64-baseline.zip

[error] Script error: Error installing bun for win32-x64-baseline: connect ETIMEDOUT 20.205.243.166:443

[info] Script output: Downloading bun from gitcode.com

[info] Script output: Downloading bun 1.2.9 for win32-x64-baseline...
URL: https://gitcode.com/CherryHQ/bun/releases/download/bun-v1.2.9/bun-windows-x64-baseline.zip

[info] Script output: Extracting bun-windows-x64-baseline.zip to C:\Users\bians\.eigent\bin...

[error] Script error: Error installing bun for win32-x64-baseline: connect ETIMEDOUT 20.205.243.166:443

[info] Script output: Downloading bun from gitcode.com

[info] Script output: Downloading bun 1.2.9 for win32-x64-baseline...

[info] Script output: URL: https://gitcode.com/CherryHQ/bun/releases/download/bun-v1.2.9/bun-windows-x64-baseline.zip

[info] Script output: Successfully installed bun 1.2.9 for win32-x64-baseline
Installation successful

[info] Script completed successfully
[error] Script error: Downloading cpython-3.10.16-windows-x86_64-none (21.4MiB)

[error] Script error: node:fs:1667
  const stats = binding.stat(
                        ^

Error: ENOENT: no such file or directory, stat 'C:\Users\bians\AppData\Local\Temp\bun-windows-x64-baseline.zip'
    at statSync (node:fs:1667:25)
    at t.statSync (node:electron/js2c/node_init:2:5397)
    at WriteStream.<anonymous> (file:///C:/Users/bians/AppData/Local/Programs/Eigent/resources/app.asar/resources/scripts/download.js:53:32)
    at WriteStream.emit (node:events:518:28)
    at emitCloseNT (node:internal/streams/destroy:147:10)
    at process.processTicksAndRejections (node:internal/process/task_queues:81:21) {
  errno: -4058,
  code: 'ENOENT',
  syscall: 'stat',
  path: 'C:\\Users\\bians\\AppData\\Local\\Temp\\bun-windows-x64-baseline.zip'
}

Node.js v20.18.3

[error] Script exited with code 1
[error] Script error:  Downloaded cpython-3.10.16-windows-x86_64-none

[error] Script error: Using CPython 3.10.16

[error] Script error: Creating virtual environment at: .venv

[error] Script error: Resolved 199 packages in 6ms

[error] Script error: Downloading camel-ai (1.3MiB)

[error] Script error: Downloading pydantic-core (1.9MiB)

[error] Script error: Downloading reportlab (1.9MiB)

[error] Script error: Downloading numpy (12.3MiB)

[error] Script error: Downloading pillow (2.4MiB)

[error] Script error: Downloading pdfminer-six (5.4MiB)

[error] Script error: Downloading lxml (3.8MiB)

[error] Script error: Downloading fonttools (2.2MiB)

[error] Script error: Downloading cryptography (3.4MiB)

[error] Script error: Downloading sympy (6.0MiB)

[error] Script error: Downloading av (29.9MiB)

[error] Script error: Downloading yt-dlp (3.0MiB)

[error] Script error: Downloading onnxruntime (10.6MiB)
Downloading nodejs-wheel-binaries (38.2MiB)
[error] Script error: 
Downloading pandas (9.9MiB)

[error] Script error: Downloading botocore (13.4MiB)

[error] Script error: Downloading babel (9.7MiB)

[error] Script error: Downloading pyarrow (25.0MiB)
Downloading pywin32 (9.1MiB)
Downloading grpcio (4.4MiB)

[error] Script error: Downloading speechrecognition (31.3MiB)

[error] Script error: Downloading magika (11.8MiB)

[error] Script error: Downloading google-api-python-client (12.6MiB)

[error] Script error: Downloading youtube-transcript-api (2.1MiB)

[error] Script error:  Downloaded camel-ai

[error] Script error:    Building docx==0.2.4

[error] Script error:    Building wikipedia==1.4.0

[error] Script error:    Building sgmllib3k==1.0.0

[error] Script error:  Downloaded pydantic-core

[error] Script error:  Downloaded reportlab

[error] Script error:    Building pylatex==1.4.2

[error] Script error:  Downloaded youtube-transcript-api

[error] Script error:  Downloaded pillow

[error] Script error:  Downloaded fonttools

[error] Script error:  Downloaded cryptography

[error] Script error:  Downloaded lxml

[error] Script error:  Downloaded yt-dlp

[error] Script error:  Downloaded grpcio

[error] Script error:  Downloaded pdfminer-six

[error] Script error:       Built sgmllib3k==1.0.0

[error] Script error:       Built wikipedia==1.4.0

[error] Script error:       Built docx==0.2.4

[error] Script error:       Built pylatex==1.4.2

[error] Script error:  Downloaded sympy

[error] Script error:  Downloaded pywin32

[error] Script error:  Downloaded babel

[error] Script error:  Downloaded onnxruntime

[error] Script error:  Downloaded magika

[error] Script error:  Downloaded numpy

[error] Script error:  Downloaded google-api-python-client
[error] Script error: 

[error] Script error:  Downloaded pandas

[error] Script error:  Downloaded botocore

[error] Script error:  Downloaded pyarrow

[error] Script error:  Downloaded av

[error] Script error:  Downloaded speechrecognition

[error] Script error:  Downloaded nodejs-wheel-binaries

[error] Script error: Prepared 189 packages in 3m 13s

[error] Script error: Installed 189 packages in 17.10s

[error] Script error:  + aiofiles==24.1.0
 + aiohappyeyeballs==2.6.1
 + aiohttp==3.12.15
 + aiosignal==1.4.0
 + annotated-types==0.7.0
[error] Script error: 
 + anthropic==0.49.0
 + anyio==4.11.0
 + asgiref==3.9.2
 + async-timeout==5.0.1
 + attrs==25.3.0
 + av==15.1.0
 + azure-ai-documentintelligence==1.0.2
 + azure-core==1.35.1
 + azure-identity==1.25.0
 + babel==2.17.0
 + beautifulsoup4==4.13.5
 + boto3==1.40.39
 + botocore==1.40.39
 + cachetools==5.5.2
 + camel-ai==0.2.76a7
 + certifi==2025.8.3
 + cffi==2.0.0
 + chardet==5.2.0
 + charset-normalizer==3.4.3
 + click==8.2.1
 + cobble==0.1.4
[error] Script error: 
 + colorama==0.4.6
[error] Script error: 
 + coloredlogs==15.0.1
 + cryptography==46.0.1
 + cssutils==2.11.1
 + currency-symbols==2.0.4
 + datasets==3.6.0
 + defusedxml==0.7.1
 + dill==0.3.8
 + distro==1.9.0
 + docstring-parser==0.17.0
 + docx==0.2.4
 + et-xmlfile==2.0.0
 + exa-py==1.15.6
 + exceptiongroup==1.3.0
 + fastapi==0.117.1
 + fastapi-babel==1.0.0
 + feedparser==6.0.12
 + ffmpeg-python==0.2.0
 + filelock==3.19.1
 + flatbuffers==25.9.23
 + fonttools==4.60.0
 + frozenlist==1.7.0
 + fsspec==2025.3.0
[error] Script error: 
 + future==1.0.0
 + google-api-core==2.25.1
 + google-api-python-client==2.166.0
 + google-auth==2.40.3
 + google-auth-httplib2==0.2.0
 + googleapis-common-protos==1.70.0
 + grpcio==1.75.1
 + h11==0.16.0
 + html5lib==1.1
[error] Script error: 
 + httpcore==1.0.9
 + httplib2==0.31.0
 + httptools==0.6.4
 + httpx==0.28.1
 + httpx-sse==0.4.1
 + huggingface-hub==0.35.1
 + humanfriendly==10.0
 + idna==3.10
 + imageio==2.37.0
 + importlib-metadata==8.7.0
 + inflection==0.5.1
 + isodate==0.7.2
 + jiter==0.11.0
 + jmespath==1.0.1
 + jsonschema==4.25.1
 + jsonschema-specifications==2025.9.1
 + loguru==0.7.3
 + lxml==6.0.2
 + magika==0.6.2
 + mammoth==1.10.0
 + markdownify==1.2.0
 + markitdown==0.1.3
 + mcp==1.15.0
 + mcp-server-fetch==2025.1.17
 + mcp-simple-arxiv==0.2.2
 + more-itertools==10.8.0
 + mpmath==1.3.0
 + msal==1.34.0
 + msal-extensions==1.3.1
 + multidict==6.6.4
 + multiprocess==0.70.16
 + nodejs-wheel==22.19.0
 + nodejs-wheel-binaries==22.19.0
 + numpy==2.2.0
 + oauthlib==3.3.1
 + olefile==0.47
 + onnxruntime==1.19.2
 + openai==1.109.1
 + openpyxl==3.1.5
 + opentelemetry-api==1.34.1
[error] Script error: 
 + opentelemetry-exporter-otlp==1.34.1
 + opentelemetry-exporter-otlp-proto-common==1.34.1
 + opentelemetry-exporter-otlp-proto-grpc==1.34.1
[error] Script error: 
 + opentelemetry-exporter-otlp-proto-http==1.34.1
 + opentelemetry-instrumentation==0.55b1
 + opentelemetry-instrumentation-asgi==0.55b1
 + opentelemetry-instrumentation-fastapi==0.55b1
 + opentelemetry-propagator-aws-xray==1.0.2
 + opentelemetry-proto==1.34.1
 + opentelemetry-sdk==1.34.1
 + opentelemetry-sdk-extension-aws==2.1.0
[error] Script error: 
 + opentelemetry-semantic-conventions==0.55b1
 + opentelemetry-util-http==0.55b1
 + ordered-set==4.1.0
 + packaging==25.0
 + pandas==1.5.3
 + pdfminer-six==20250506
 + pillow==10.4.0
 + platformdirs==4.4.0
 + propcache==0.3.2

[error] Script error:  + protego==0.5.0
 + proto-plus==1.26.1
 + protobuf==5.29.5
 + psutil==5.9.8
 + pyarrow==21.0.0
 + pyasn1==0.6.1
 + pyasn1-modules==0.4.2
 + pycparser==2.23
 + pydantic==2.11.9
 + pydantic-core==2.33.2
 + pydantic-i18n==0.4.5
 + pydantic-settings==2.11.0
 + pydash==8.0.5
 + pydub==0.25.1
 + pyjwt==2.10.1
 + pylatex==1.4.2
 + pyparsing==3.2.5
 + pyreadline3==3.5.4
 + pytesseract==0.3.13
 + python-dateutil==2.9.0.post0
 + python-dotenv==1.1.1
 + python-multipart==0.0.20
 + python-pptx==1.0.2
 + pytz==2025.2
 + pywin32==311
 + pyyaml==6.0.2
 + readabilipy==0.3.0
 + referencing==0.36.2
 + regex==2025.9.18
 + reportlab==4.4.4
 + requests==2.32.5
 + requests-oauthlib==1.3.1
 + rpds-py==0.27.1
 + rsa==4.9.1
 + s3transfer==0.14.0
 + scenedetect==0.6.7.1
 + sgmllib3k==1.0.0
 + six==1.17.0
 + slack-sdk==3.36.0
 + sniffio==1.3.1
 + socksio==1.0.0
 + soupsieve==2.8
 + speechrecognition==3.14.3
 + sse-starlette==3.0.2
 + starlette==0.48.0
 + sympy==1.14.0
 + tabulate==0.9.0
 + tiktoken==0.7.0
 + tqdm==4.67.1
 + traceroot==0.0.5
 + typing-extensions==4.15.0
 + typing-inspection==0.4.1

[error] Script error:  + uritemplate==4.2.0
 + urllib3==2.5.0
 + uvicorn==0.37.0
 + watchfiles==1.1.0
 + watchtower==3.4.0
 + webcolors==24.11.1
 + webencodings==0.5.1
 + websockets==15.0.1
 + wikipedia==1.4.0
 + win32-setctime==1.2.0
 + wrapt==1.17.3
 + xlrd==2.0.2
 + xls2xlsx==0.2.0
 + xlsxwriter==3.2.9
 + xxhash==3.5.0
 + yarl==1.20.1
 + youtube-transcript-api==1.0.3
 + yt-dlp==2024.12.23
 + zipp==3.23.0

[info] Script completed successfully
[info]  install dependencies complete
[info] Checking and starting backend service...
[info] Tool installed, starting backend service...
[info] Found available port: 5001
[info] Backend service started successfully { port: 5001 }
[info] BACKEND: Installed 7 packages in 1.11s
[info] BACKEND: 2025-11-05 18:21:35.413 | INFO     | main:<module>:14 - Starting Eigent Multi-Agent System API
2025-11-05 18:21:35.413 | INFO     | main:<module>:15 - Python encoding: utf-8
[info] BACKEND: 2025-11-05 18:21:35.413 | INFO     | main:<module>:16 - Environment: development
2025-11-05 18:21:35.416 | INFO     | main:<module>:19 - Loading routers with prefix: ''
[info] BACKEND: 2025-11-05 18:22:58.658 | INFO     | main:<module>:21 - All routers loaded successfully
[info] BACKEND: 2025-11-05 18:22:58.673 | INFO     | main:<module>:34 - Loguru configured with log file: C:\Users\bians/.eigent/runtime/log/app.log
2025-11-05 18:22:58.677 | INFO     | main:<module>:52 - PID write task created
2025-11-05 18:22:58.677 | INFO     | main:<module>:104 - Application initialization completed successfully
INFO:     Started server process [23432]
INFO:     Waiting for application startup.
[info] BACKEND: INFO:     Application startup complete.
[info] BACKEND: INFO:     Uvicorn running on http://127.0.0.1:5001 (Press CTRL+C to quit)
[info] BACKEND: 2025-11-05 18:22:59.119 | INFO     | main:write_pid_file:47 - PID file written: 23432
[info] BACKEND: INFO:     127.0.0.1:61705 - "POST /model/validate HTTP/1.1" 200 OK
[info] BACKEND: INFO:     127.0.0.1:56474 - "POST /model/validate HTTP/1.1" 200 OK
[info] BACKEND: INFO:     127.0.0.1:53039 - "POST /model/validate HTTP/1.1" 200 OK
[info] BACKEND: INFO:     127.0.0.1:62978 - "POST /model/validate HTTP/1.1" 200 OK
[info] BACKEND: INFO:     127.0.0.1:61622 - "POST /model/validate HTTP/1.1" 200 OK
[info] BACKEND: INFO:     127.0.0.1:55805 - "POST /model/validate HTTP/1.1" 200 OK
[info] BACKEND: INFO:     127.0.0.1:61691 - "POST /model/validate HTTP/1.1" 200 OK
[info] BACKEND: INFO:     127.0.0.1:65488 - "POST /model/validate HTTP/1.1" 200 OK
[error] BACKEND: 2025-11-05 22:27:57,153 - camel.models.model_manager - ERROR - Error processing with model: <camel.models.openai_compatible_model.OpenAICompatibleModel object at 0x0000022911F3CD00>
[error] BACKEND: 2025-11-05 22:27:57,155 - camel.camel.agents.chat_agent - ERROR - Model error: gemini-2.5-pro
Traceback (most recent call last):
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\camel\agents\chat_agent.py", line 2060, in _get_model_response
    response = self.model_backend.run(
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\camel\models\model_manager.py", line 239, in run
    raise exc
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\camel\models\model_manager.py", line 229, in run
    response = self.current_model.run(messages, response_format, tools)
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\camel\models\base_model.py", line 71, in wrapped_run
    return original_run(self, messages, *args, **kwargs)
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\camel\models\base_model.py", line 428, in run
    result = self._run(messages, response_format, tools)
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\camel\models\openai_compatible_model.py", line 202, in _run
    result = self._request_chat_completion(messages, tools)
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\camel\models\openai_compatible_model.py", line 281, in _request_chat_completion
    return self._client.chat.completions.create(
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\openai\_utils\_utils.py", line 286, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\openai\resources\chat\completions\completions.py", line 1147, in create
    return self._post(
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\openai\_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\openai\_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'gemini-2.5-pro is not a valid model ID', 'code': 400}, 'user_id': 'user_2xlXjrdbc5pmRsI4mMnkZiffs4o'}
[info] BACKEND: INFO:     127.0.0.1:64530 - "POST /model/validate HTTP/1.1" 200 OK
[info] Starting new task
[info] BACKEND: INFO:     127.0.0.1:51443 - "POST /chat HTTP/1.1" 200 OK
[error] BACKEND: 2025-11-05 22:48:08,758 - camel.camel.toolkits.mcp_toolkit - ERROR - Connection timeout after 20s. One or more MCP servers are not responding. Please check if the servers are running and accessible.
[error] BACKEND: 2025-11-05 22:48:08.774 | WARNING  | app.utils.agent:get_mcp_tools:1484 - Failed to connect MCP toolkit: MCPConnectionError('Connection timeout after 20s. One or more MCP servers are not responding. Please check if the servers are running and accessible.')
[info] Starting new task
[info] BACKEND: INFO:     127.0.0.1:56607 - "POST /chat HTTP/1.1" 200 OK
[info] before-quit
[info] quit python_process.pid: undefined
[info] window-all-closed
[info] before-quit
[info] quit python_process.pid: undefined
[info] Attempting to kill process on port: 5001
[info] Attempting to kill process on port: 5001
[info]  start check version { currentVersion: '0.0.71' }
[info]  read saved version { savedVersion: '0.0.71' }
[info]  version not changed, skip install dependencies { currentVersion: '0.0.71' }
[info] Checking and starting backend service...
[info] Tool installed, starting backend service...
[info] Found port from file: 5001
[error] Failed to kill process on port 9222: Error: Command failed: taskkill /F /PID 14676
    ERROR: The process "14676" not found.

    at genericNodeError (node:internal/errors:984:15)
    at wrappedFn (node:internal/errors:538:14)
    at ChildProcess.exithandler (node:child_process:422:12)
    at ChildProcess.emit (node:events:518:28)
    at maybeClose (node:internal/child_process:1104:16)
    at ChildProcess._handle.onexit (node:internal/child_process:304:5)
[info] Found available port: 5001
[info] Backend service started successfully { port: 5001 }
[info] BACKEND: 2025-11-05 22:56:03.435 | INFO     | main:<module>:14 - Starting Eigent Multi-Agent System API
2025-11-05 22:56:03.436 | INFO     | main:<module>:15 - Python encoding: utf-8
[info] BACKEND: 2025-11-05 22:56:03.436 | INFO     | main:<module>:16 - Environment: development
2025-11-05 22:56:03.436 | INFO     | main:<module>:19 - Loading routers with prefix: ''
[info] BACKEND: 2025-11-05 22:56:15.978 | INFO     | main:<module>:21 - All routers loaded successfully
[info] BACKEND: 2025-11-05 22:56:15.990 | INFO     | main:<module>:34 - Loguru configured with log file: C:\Users\bians/.eigent/runtime/log/app.log
[info] BACKEND: 2025-11-05 22:56:15.999 | INFO     | main:<module>:52 - PID write task created
2025-11-05 22:56:16.000 | INFO     | main:<module>:104 - Application initialization completed successfully
[info] BACKEND: INFO:     Started server process [26252]
[info] BACKEND: INFO:     Waiting for application startup.
[info] BACKEND: INFO:     Application startup complete.
[info] BACKEND: INFO:     Uvicorn running on http://127.0.0.1:5001 (Press CTRL+C to quit)
[info] BACKEND: 2025-11-05 22:56:16.109 | INFO     | main:write_pid_file:47 - PID file written: 26252
[info] Starting new task
[info] BACKEND: INFO:     127.0.0.1:55798 - "POST /chat HTTP/1.1" 200 OK
[error] BACKEND: 2025-11-05 22:59:20,436 - camel.camel.toolkits.mcp_toolkit - ERROR - Connection timeout after 20s. One or more MCP servers are not responding. Please check if the servers are running and accessible.
[error] BACKEND: 2025-11-05 22:59:20.438 | WARNING  | app.utils.agent:get_mcp_tools:1484 - Failed to connect MCP toolkit: MCPConnectionError('Connection timeout after 20s. One or more MCP servers are not responding. Please check if the servers are running and accessible.')
[info] BACKEND: Downloading pywin32
[info] BACKEND:  (9.1MiB)
[info] BACKEND: Downloading pydantic-core (1.9MiB)
[info] BACKEND: Downloading cryptography (3.4MiB)
[info] BACKEND:  Downloading pydantic-core
[info] BACKEND:  Downloading cryptography
[info] BACKEND:  Downloading
[info] BACKEND: pywin32
[info] BACKEND: 
[info] BACKEND: Installed 46 packages in 4.93s
[info] BACKEND: 2025-11-05 23:00:26.642 | INFO     | app.service.chat_service:summary_task:320 - summary_task: LLM Minimal Output Test|Verify LLM functionality by returning exactly one plain-text line containing LLM works properly, with no searches or tool calls and no extra characters, quotes, spaces, or lines.
[info] BACKEND: INFO:     127.0.0.1:50460 - "PUT /task/1762354557318-2929 HTTP/1.1" 201 Created
[info] BACKEND: INFO:     127.0.0.1:50460 - "POST /task/1762354557318-2929/start HTTP/1.1" 201 Created
[error] BACKEND: 2025-11-05 23:02:43.584 | ERROR    | app.utils.server.sync_step:send_to_api:48 -
[info] BACKEND: [33mWorker node 6fc30629-72ad-49a7-b6b9-d7414f6b7905 (Document Agent: A document processing assistant skilled in creating and modifying a wide range of file formats. It can generate text-based files/reports (Markdown, JSON, YAML, HTML), office documents (Word, PDF), presentations (PowerPoint), and data files (Excel, CSV).) get task 1762354557318-2929.1: Produce a minimal report that consists only of the title LLM works properly. Deliverable: return a single plain-text line containing exactly LLM works properly with no additional text, no quotation marks, no leading/trailing spaces, and no extra lines.[39m
[33mWorker node 6fc30629-72ad-49a7-b6b9-d7414f6b7905 (Document Agent: A document processing assistant skilled in creating and modifying a wide range of file formats. It can generate text-based files/reports (Markdown, JSON, YAML, HTML), office documents (Word, PDF), presentations (PowerPoint), and data files (Excel, CSV).) get task 1762354557318-2929: Test if llm works properly. You are not required to do any searches or tool callings. Your report shall include only a title, titled "LLM works properly".[39m
INFO:     127.0.0.1:54023 - "PUT /task/1762354557318-2929 HTTP/1.1" 201 Created
[error] BACKEND: 2025-11-05 23:02:48.773 | ERROR    | app.utils.workforce:eigent_start:115 - Error in workforce execution: The workforce is running. Cannot perform the operation start.
[info] BACKEND: INFO:     127.0.0.1:54023 - "POST /task/1762354557318-2929/start HTTP/1.1" 201 Created
[info] Reading file: C:\Users\bians\eigent\heavysnowjakarta\task_1762354557318-2929\terminal_logs
[error] BACKEND: 2025-11-05 23:13:16,140 - asyncio - ERROR - Task exception was never retrieved
future: <Task finished name='Task-114' coro=<Workforce.eigent_start() done, defined at C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\app\utils\workforce.py:105> exception=RuntimeError('The workforce is running. Cannot perform the operation start.')>
Traceback (most recent call last):
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\app\utils\workforce.py", line 113, in eigent_start
    await self.start()
  File "C:\Users\bians\AppData\Local\Programs\Eigent\resources\backend\.venv\lib\site-packages\camel\societies\workforce\utils.py", line 187, in wrapper
    raise RuntimeError(error_msg)
RuntimeError: The workforce is running. Cannot perform the operation start.
[error] Failed to read file: C:\Users\bians\eigent\heavysnowjakarta\task_1762354557318-2929\terminal_logs Error: EISDIR: illegal operation on a directory, read
    at async readFileHandle (node:internal/fs/promises:553:24)
    at async file:///C:/Users/bians/AppData/Local/Programs/Eigent/resources/app.asar/dist-electron/main/index.js:14358:17
    at async WebContents.<anonymous> (node:electron/js2c/browser_init:2:87023)

Additional context

No response

HeavySnowJakarta, Nov 05 '25 15:11