
On Linux, one-click tagging with the WD tagger hangs and never progresses

Open lidisi8520 opened this issue 10 months ago • 12 comments

As shown in the screenshot.

[Image]

With the WD tagger, after selecting the image path, one-click tagging freezes completely. There is no error and no message, so I have no idea what the cause is.

lidisi8520 avatar Feb 19 '25 05:02 lidisi8520

@lidisi8520 SD-Trainer does not install onnxruntime by default (see Commit 935f78f), which causes the WD tagger to hang during tagging. You can fix it yourself by installing onnxruntime-gpu with pip.
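The hang is really a silent import failure, so a quick way to tell whether your environment is usable is a check like the following (an illustrative sketch, not part of SD-Trainer itself):

```python
def onnxruntime_status() -> str:
    """Report whether onnxruntime can serve the WD tagger."""
    try:
        # This is the exact import the tagger performs in interrogator.py.
        from onnxruntime import InferenceSession  # noqa: F401
    except ImportError:
        return "missing"  # fix: pip install onnxruntime-gpu (or onnxruntime)
    import onnxruntime
    # A GPU build exposes CUDAExecutionProvider; a CPU-only build does not.
    providers = onnxruntime.get_available_providers()
    return "gpu" if "CUDAExecutionProvider" in providers else "cpu-only"

print(onnxruntime_status())
```

If this prints `missing`, tagging will stall exactly as described above.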

licyk avatar Feb 27 '25 14:02 licyk

Loading wd14-vit-v3 model file from SmilingWolf/wd-vit-tagger-v3
ERROR: Exception in ASGI application

  + Exception Group Traceback (most recent call last):
  |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 428, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
  |     return await self.app(scope, receive, send)
  |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/fastapi/applications.py", line 276, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
  |     raise exc
  |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/base.py", line 106, in __call__
  |     async with anyio.create_task_group() as task_group:
  |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 772, in __aexit__
  |     raise BaseExceptionGroup(
  | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Exception Group Traceback (most recent call last):
    |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/base.py", line 109, in __call__
    |     await response(scope, receive, send)
    |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/responses.py", line 270, in __call__
    |     async with anyio.create_task_group() as task_group:
    |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 772, in __aexit__
    |     raise BaseExceptionGroup(
    | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    +-+---------------- 1 ----------------
      | Traceback (most recent call last):
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/responses.py", line 273, in wrap
      |     await func()
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/base.py", line 134, in stream_response
      |     return await super().stream_response(send)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/responses.py", line 262, in stream_response
      |     async for chunk in self.body_iterator:
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/base.py", line 98, in body_stream
      |     raise app_exc
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/base.py", line 70, in coro
      |     await self.app(scope, receive_or_disconnect, send_no_error)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
      |     raise exc
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
      |     await self.app(scope, receive, sender)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
      |     raise e
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
      |     await self.app(scope, receive, send)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
      |     await route.handle(scope, receive, send)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
      |     await self.app(scope, receive, send)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/routing.py", line 69, in app
      |     await response(scope, receive, send)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/responses.py", line 174, in __call__
      |     await self.background()
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/background.py", line 43, in __call__
      |     await task()
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/background.py", line 28, in __call__
      |     await run_in_threadpool(self.func, *self.args, **self.kwargs)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/starlette/concurrency.py", line 41, in run_in_threadpool
      |     return await anyio.to_thread.run_sync(func, *args)
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
      |     return await get_async_backend().run_sync_in_worker_thread(
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2470, in run_sync_in_worker_thread
      |     return await future
      |   File "/home/lihan/anaconda3/envs/lora/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 967, in run
      |     result = context.run(func, *args)
      |   File "/home/lihan/lora-scripts/mikazuki/tagger/interrogator.py", line 364, in on_interrogate
      |     ratings, tags = interrogator.interrogate(image)
      |   File "/home/lihan/lora-scripts/mikazuki/tagger/interrogator.py", line 163, in interrogate
      |     self.load()
      |   File "/home/lihan/lora-scripts/mikazuki/tagger/interrogator.py", line 142, in load
      |     from onnxruntime import InferenceSession
      | ImportError: cannot import name 'InferenceSession' from 'onnxruntime' (unknown location)
      +------------------------------------

I installed onnxruntime-gpu but it still doesn't work. @licyk

jingjing010 avatar Mar 31 '25 09:03 jingjing010

@jingjing010 Your onnxruntime-gpu version is probably wrong. After activating the virtual environment, run the command below to check and repair it:

python -c "$(curl https://raw.githubusercontent.com/licyk/hub-action/refs/heads/main/tools/fix_onnxruntime_gpu.py)"

If that does not solve the problem, first uninstall onnxruntime-gpu

python -m pip uninstall onnxruntime-gpu -y

and then run the command above again to reinstall it.

licyk avatar Mar 31 '25 09:03 licyk

@licyk May I ask which version of onnxruntime-gpu is compatible? By default I got version 1.16.3.

jingjing010 avatar Mar 31 '25 10:03 jingjing010

@jingjing010 That depends on the CUDA and cuDNN versions bundled with your PyTorch. The fix command above reads those two versions, determines a compatible onnxruntime-gpu, and installs it, so just run that command.
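For intuition, the selection logic can be sketched as below. The version table here is a made-up illustration of the idea (onnxruntime-gpu wheels are built against specific CUDA/cuDNN versions), not the actual table the fix script ships:

```python
def pick_onnxruntime_gpu(cuda: str, cudnn: int) -> str:
    """Hypothetical mapping from PyTorch's CUDA/cuDNN to a pip requirement."""
    major = int(cuda.split(".")[0])
    if major >= 12 and cudnn >= 9:
        return "onnxruntime-gpu>=1.19"   # illustrative: CUDA 12 + cuDNN 9 builds
    if major >= 12:
        return "onnxruntime-gpu==1.17.1"  # illustrative: CUDA 12 + cuDNN 8 builds
    return "onnxruntime-gpu==1.16.3"      # illustrative: CUDA 11.x builds

# e.g. torch.version.cuda == "12.1" with cuDNN 9:
print(pick_onnxruntime_gpu("12.1", 9))
```

In other words, a wheel picked without consulting these two versions (like a bare `pip install onnxruntime-gpu`) can import but still fail to load, which is why the script does the detection for you.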

licyk avatar Mar 31 '25 11:03 licyk

@licyk I can't reach it 0.0

(venv) (lora) [lihan@vnode5 lora-scripts]$ python -c "$(curl https://raw.githubusercontent.com/licyk/hub-action/refs/heads/main/tools/fix_onnxruntime_gpu.py)"
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (7) Failed connect to raw.githubusercontent.com:443; Connection refused

jingjing010 avatar Mar 31 '25 11:03 jingjing010

@jingjing010 Your terminal has no proxy configured, so downloading the fix script failed. Check which proxy server your proxy software exposes; for example, v2rayN's is http://127.0.0.1:10808. The commands to set the proxy for the terminal are:

export HTTP_PROXY="http://127.0.0.1:10808"
export HTTPS_PROXY="http://127.0.0.1:10808"
export NO_PROXY="localhost,127.0.0.1,::1"
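These variables are honored by curl, pip, and Python's urllib alike. A quick sanity check from Python (the proxy address is just the v2rayN example above):

```python
import os
from urllib.request import getproxies

# Simulate the exports above; in a real shell they would already be set.
os.environ["HTTP_PROXY"] = "http://127.0.0.1:10808"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:10808"

# getproxies() reads the same *_PROXY environment variables curl consults,
# so this confirms the terminal session is actually configured.
proxies = getproxies()
print(proxies["http"], proxies["https"])
```

If the variables are missing or empty, `getproxies()` simply won't contain the `http`/`https` keys, and downloads will keep hitting the raw connection refusal.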

licyk avatar Mar 31 '25 12:03 licyk

@licyk The fix command fails when run in PowerShell. How should I handle this?

  File "<string>", line 80
    print(PyTorch
         ^
SyntaxError: '(' was never closed

Magichiffon avatar Apr 15 '25 10:04 Magichiffon

@Magichiffon Download the fix script first, then run it:

Invoke-WebRequest https://raw.githubusercontent.com/licyk/hub-action/refs/heads/main/tools/fix_onnxruntime_gpu.py -OutFile ./fix_onnxruntime_gpu.py
python ./fix_onnxruntime_gpu.py

licyk avatar Apr 15 '25 13:04 licyk

How do I fix this error on Windows?

alex9441 avatar May 28 '25 04:05 alex9441

@Magichiffon Download the fix script first, then run it:

Invoke-WebRequest https://raw.githubusercontent.com/licyk/hub-action/refs/heads/main/tools/fix_onnxruntime_gpu.py -OutFile ./fix_onnxruntime_gpu.py
python ./fix_onnxruntime_gpu.py

@alex9441 Follow the method above.

licyk avatar May 28 '25 04:05 licyk

@Magichiffon Download the fix script first, then run it:

Invoke-WebRequest https://raw.githubusercontent.com/licyk/hub-action/refs/heads/main/tools/fix_onnxruntime_gpu.py -OutFile ./fix_onnxruntime_gpu.py
python ./fix_onnxruntime_gpu.py

@alex9441 Follow the method above.

Running A启动脚本.bat prints: WARNING: Skipping onnxruntime-gpu as it is not installed.

But running gui.py from the IDE works happily, as fast as a rabbit. In any case, it runs now.

alex9441 avatar May 28 '25 05:05 alex9441