HOW TO RUN INFERENCE WITH AN ONNX HRNET POSE ESTIMATION MODEL
I converted the HRNet pose estimation model (config: hrnet_w32_coco_256x192.py, checkpoint: hrnet_w32_coco_256x192-c78dce93_20200708.pth) to an ONNX model. Can you tell me how to run inference with the ONNX model?
A basic example can be found in the verification part of the script tools/deployment/pytorch2onnx.py: https://github.com/open-mmlab/mmpose/blob/master/tools/deployment/pytorch2onnx.py#L77
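For reference, here is a minimal sketch of running the exported model directly with ONNX Runtime. The input/output tensor names, the 256x192 preprocessing, and the naive heatmap decoding below are assumptions for a standard top-down COCO heatmap model, not the exact mmpose post-processing; check your exported graph (e.g. with Netron) and the config's pipeline for the precise details.

```python
# Sketch: single-image inference on the exported HRNet ONNX model.
import cv2
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("hrnet_w32_coco_256x192.onnx",
                            providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name
output_name = sess.get_outputs()[0].name

# Preprocess: resize the person crop to 256x192 (H x W) and normalize with
# ImageNet statistics, matching the training pipeline of the config.
img = cv2.imread("person_crop.jpg")  # placeholder path
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
img = cv2.resize(img, (192, 256)).astype(np.float32) / 255.0
mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
img = (img - mean) / std
inp = img.transpose(2, 0, 1)[None]  # NCHW, shape (1, 3, 256, 192)

# Forward pass: the output is expected to be keypoint heatmaps of shape
# (1, 17, 64, 48) for the COCO 17-keypoint model.
heatmaps = sess.run([output_name], {input_name: inp})[0]

# Naive decoding: argmax of each heatmap, rescaled to the 256x192 input crop
# (mmpose itself uses a more refined decoder with post-processing).
n, k, h, w = heatmaps.shape
flat = heatmaps.reshape(n, k, -1)
idx = flat.argmax(axis=2)
xs = (idx % w) * (192 / w)
ys = (idx // w) * (256 / h)
scores = flat.max(axis=2)
for kp in range(k):
    print(f"keypoint {kp}: x={xs[0, kp]:.1f}, y={ys[0, kp]:.1f}, score={scores[0, kp]:.3f}")
```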
You can also refer to MMDeploy for more information. We recommend using MMDeploy for exporting and deploying MMPose models. If you are using MMPose 0.x, please use MMDeploy 0.x as well (docs: https://mmdeploy.readthedocs.io/en/latest/04-supported-codebases/mmpose.html). If you are using MMPose 1.x (recommended), please use MMDeploy 1.x (docs: https://mmdeploy.readthedocs.io/en/dev-1.x/04-supported-codebases/mmpose.html)
What about dynamic input shapes?
MMDeploy supports model export with dynamic input sizes. An example (for a classification model) can be found here: https://github.com/open-mmlab/mmdeploy/blob/master/configs/mmcls/classification_dynamic.py
Please note that if a top-down pose model is used, the model itself may require a fixed input image size, and a dynamic input size may lead to crashes or incorrect results.
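As a rough illustration, a dynamic-axes deploy config for a pose model could follow the same pattern as the classification_dynamic.py example linked above. The base config file names below are assumptions; check the configs/mmpose folder of your MMDeploy version for the actual files.

```python
# Sketch: deploy config with a dynamic batch axis (file names are assumed).
_base_ = ['./pose-detection_static.py', '../_base_/backends/onnxruntime.py']

onnx_config = dict(
    # Mark the batch dimension as dynamic. As noted above, top-down pose
    # models usually expect a fixed spatial size, so keeping only the batch
    # axis dynamic is the safer choice.
    dynamic_axes={
        'input': {0: 'batch'},
        'output': {0: 'batch'},
    })
```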
How do I convert a .pth checkpoint to ONNX? Does the current version of mmpose no longer support this?
I looked at MMDeploy and read some of the configs for converting models to ONNX, but at https://github.com/open-mmlab/mmdeploy/tree/master/configs/mmpose I don't see a config for a dynamic input size. I want to run inference on batches of images, so I need an ONNX model with a dynamic batch size; so far I have only converted an ONNX model that takes a single image. Please help me!
@lamhust2008 If you are using the 0.x versions of MMPose and MMDeploy, please refer to #1090 for using dynamic shapes. For the 1.x versions, the following doc may be what you need: https://mmdeploy.readthedocs.io/en/1.x/02-how-to-run/write_config.html?highlight=dynamic#if-you-need-to-use-dynamic-axes
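Once the model has been exported with a dynamic batch axis, batched inference with ONNX Runtime is just a matter of stacking preprocessed crops. A minimal sketch, using the same assumed tensor names, 256x192 preprocessing, and placeholder file names as in the single-image example above:

```python
# Sketch: batched inference with an ONNX model exported with a dynamic batch axis.
import cv2
import numpy as np
import onnxruntime as ort

MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(img_bgr):
    """Resize a person crop to 256x192 (H x W) and normalize with ImageNet stats."""
    img = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB)
    img = cv2.resize(img, (192, 256)).astype(np.float32) / 255.0
    return ((img - MEAN) / STD).transpose(2, 0, 1)

# Placeholder file name for an ONNX model exported with a dynamic batch axis.
sess = ort.InferenceSession("hrnet_w32_dynamic_batch.onnx",
                            providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name

# Stack any number of preprocessed person crops along the batch dimension.
crops = [cv2.imread(p) for p in ["person1.jpg", "person2.jpg", "person3.jpg"]]
batch = np.stack([preprocess(c) for c in crops])  # shape (N, 3, 256, 192)

heatmaps = sess.run(None, {input_name: batch})[0]  # expected shape (N, 17, 64, 48)
print(heatmaps.shape)
```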