Error from export_model when replacing PP-Vehicle's model with the official lightweight model

Open uniquecdmx opened this issue 9 months ago • 0 comments

Problem confirmation (Search before asking)

  • [X] I have searched the question and found no related answer.

Please ask your question

I ran the export script. The officially downloaded lightweight model comes only as a pdparams file, while the models referenced by PP-Vehicle are inference models containing pdmodel and pdiparams files, so I tried to export it with export_model.py and hit an error during the export. My plan for swapping the lightweight model into PP-Vehicle is:

1. Export an inference model from the pdparams pretrained weights.
2. Point the det and mot model paths in infer_cfg_ppvehicle.yml at the exported directories (see the config sketch below).

Is this the right approach for replacing PP-Vehicle's models with the lightweight version? If so, how do I fix the export error?
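For step 2, the PP-Vehicle pipeline reads the detector and tracker locations from infer_cfg_ppvehicle.yml. A minimal sketch of the edit, assuming the key names (DET, MOT, model_dir) match the config shipped with your PaddleDetection version and that the exported model ends up in the output_inference directory used below; the paths are placeholders:

    DET:
      model_dir: output_inference/ppyoloe_plus_crn_t_auxhead_320_60e_ppvehicle/   # directory containing the exported pdmodel/pdiparams
      batch_size: 1
    MOT:
      model_dir: output_inference/ppyoloe_plus_crn_t_auxhead_320_60e_ppvehicle/   # same exported directory if it is also used as the MOT detector
      batch_size: 1                                                               # keep any other existing keys (e.g. tracker settings) unchanged

The pipeline loads the inference model (pdmodel plus pdiparams) from model_dir, so the export in step 1 has to succeed first.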

(pp) E:\AI\PaddleDetection>python tools/export_model.py -c output_inference/ppyoloe_plus_crn_t_auxhead_320_60e_ppvehicle/ppyoloe_plus_crn_t_auxhead_320_60e_ppvehicle.yml -o weights=output_inference/ppyoloe_plus_crn_t_auxhead_320_60e_ppvehicle/model.pdparams trt=True --output_dir=output_inference -o TestReader.fuse_normalize=true

Warning: Unable to use numba in PP-Tracking, please install numba, for example(python3.7): pip install numba==0.56.4
Warning: Unable to use numba in PP-Tracking, please install numba, for example(python3.7): pip install numba==0.56.4
[05/08 16:59:52] ppdet.utils.checkpoint INFO: The shape [1] in pretrained weight aux_head.scales.0.scale is unmatched with the shape [] in model aux_head.scales.0.scale. And the weight aux_head.scales.0.scale will not be loaded
[05/08 16:59:52] ppdet.utils.checkpoint INFO: The shape [1] in pretrained weight aux_head.scales.1.scale is unmatched with the shape [] in model aux_head.scales.1.scale. And the weight aux_head.scales.1.scale will not be loaded
[05/08 16:59:52] ppdet.utils.checkpoint INFO: The shape [1] in pretrained weight aux_head.scales.2.scale is unmatched with the shape [] in model aux_head.scales.2.scale. And the weight aux_head.scales.2.scale will not be loaded
[05/08 16:59:52] ppdet.utils.checkpoint INFO: Finish loading model weights: output_inference/ppyoloe_plus_crn_t_auxhead_320_60e_ppvehicle/model.pdparams
[05/08 16:59:52] ppdet.engine INFO: Export inference config file to output_inference\ppyoloe_plus_crn_t_auxhead_320_60e_ppvehicle\infer_cfg.yml
Traceback (most recent call last):
  File "tools/export_model.py", line 118, in <module>
    main()
  File "tools/export_model.py", line 114, in main
    run(FLAGS, cfg)
  File "tools/export_model.py", line 80, in run
    trainer.export(FLAGS.output_dir, for_fd=FLAGS.for_fd)
  File "E:\AI\PaddleDetection\ppdet\engine\trainer.py", line 1229, in export
    static_model, pruned_input_spec = self._get_infer_cfg_and_input_spec(
  File "E:\AI\PaddleDetection\ppdet\engine\trainer.py", line 1181, in _get_infer_cfg_and_input_spec
    input_spec, static_model.forward.main_program,
  File "D:\Tools\Program\Anaconda3\envs\pp\lib\site-packages\paddle\jit\dy2static\program_translator.py", line 768, in main_program
    raise_error_template("main_program")()
  File "D:\Tools\Program\Anaconda3\envs\pp\lib\site-packages\paddle\jit\dy2static\program_translator.py", line 704, in _raise_error
    raise RuntimeError(error_template.format(func=func_str))
RuntimeError: Can't call main_program when full_graph=False. Use paddle.jit.to_static(full_graph=True) instead.
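The traceback ends inside paddle.jit.dy2static: trainer.py asks for static_model.forward.main_program, but the dynamic-to-static translation apparently ran with full_graph=False (that is what the RuntimeError reports), so main_program is unavailable. The error message itself points at the workaround. A minimal sketch of the idea, assuming the to_static call site in ppdet/engine/trainer.py resembles the one below; the variable names are illustrative, not the exact trainer code:

    # Illustrative sketch only: force whole-graph dynamic-to-static translation
    # so that .main_program becomes accessible, as the RuntimeError suggests.
    import paddle

    static_model = paddle.jit.to_static(
        model,                  # the detector built by the trainer (hypothetical name)
        input_spec=input_spec,  # InputSpec list derived from the export config (hypothetical name)
        full_graph=True,        # whole-graph mode; without it main_program raises the RuntimeError above
    )
    main_prog = static_model.forward.main_program  # now retrievable for pruning the input spec

Patching the to_static calls in ppdet/engine/trainer.py to pass full_graph=True (or upgrading to a PaddleDetection release that already does so) should let the export finish. The earlier INFO messages about the aux_head.scales shapes only mean those auxiliary-head weights are skipped; as the log shows, weight loading still completes.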

Environment: Windows 11, NVIDIA 3060 laptop GPU, CUDA 12.0, paddlepaddle 2.6.0.post120, Python 3.8.18

uniquecdmx, May 08 '24 09:05