Error when calling PP-StructureV3 after installing hpi in PaddleX
Checklist:
- [X] Searched related past issues for an answer
- [X] Read through the FAQ
- [X] Read through the PaddleX documentation
- [X] Confirmed that the bug has not yet been fixed in the latest version
Problem description
- Deployed in ks using a Docker image
Image info:
[paddle:3.0.0-gpu-cuda11.8-cudnn8.9-trt8.6](http://ccr-2vdh3abv-pub.cnc.bj.baidubce.com/paddlepaddle/paddle:3.0.0-gpu-cuda11.8-cudnn8.9-trt8.6)
GPU info:
```
Fri Jun 13 02:52:02 2025
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.230.02             Driver Version: 535.230.02   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA A40                     Off | 00000000:AF:00.0 Off |                    0 |
|  0%   41C    P0              78W / 300W |   6397MiB / 46068MiB |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                             |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
+---------------------------------------------------------------------------------------+
```
- Command used for layout analysis (see the Python API sketch after the traceback below):
`paddlex --pipeline PP-StructureV3 --input https://paddle-model-ecology.bj.bcebos.com/paddlex/imgs/demo_image/demo_paper.png --use_hpip`
- Error:
File "/root/PaddleX/paddlex/inference/models/image_classsification/predictor.py", line 49, in _ init
self.preprocessors, self.infer, self.postprocessors= self._build()
File "/root/PaddleX/paddlex/inference/models/image_classification/predictor.py",line 82, in _build
infer= self.create_static_infer()
/return HPInfer(
File "/root/Paddlex/paddlex/inference/models/base/predictor/base_predictorin create_static_infer
File "/root/PaddleX/paddlex/utils/deps.py", line 148, in _ rapper
return old_init_func(self, *args, **kwargs)
File "/root/PaddleX/paddlex/inference/models/common/static_iinfer.py",line 575, in in init_
File "/root/PaddleX/paddlex/inference/models/common/static_infer.py", line 630, in _determine_backend_config
backend, backend_config = self._determine_backend_and_config()
raise RuntimeError(
RuntimeError: No inference backend and confiauration could be suggested, Reason: "PP-LCNJet x1 0 textline ori' is not a known model.
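For reference, a minimal Python-API sketch of the same call (the `create_pipeline(pipeline=..., use_hpip=...)` interface and `res.print()` are assumed from the PaddleX 3.0 docs; only the CLI command above was actually run):

```python
from paddlex import create_pipeline

# Minimal sketch, assuming the PaddleX 3.0 Python API.
# use_hpip=True enables the high-performance inference plugin, mirroring --use_hpip on the CLI.
pipeline = create_pipeline(pipeline="PP-StructureV3", use_hpip=True)

output = pipeline.predict(
    "https://paddle-model-ecology.bj.bcebos.com/paddlex/imgs/demo_image/demo_paper.png"
)
for res in output:
    res.print()  # print the structured layout-analysis result for each input
```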