
Load ONNX model

HeChengHui opened this issue 2 years ago • 9 comments

I managed to export some models from the model zoo into ONNX format. However, I am having difficulty getting them to work with torchreid. In torchtools.py, I replaced torch.load() with checkpoint = onnx.load(fpath). This resulted in the following error:

File "yolov5_deepsort\reid models\deep-person-reid\torchreid\utils\torchtools.py", line 280, in load_pretrained_weights
    if 'state_dict' in checkpoint:
TypeError: argument of type 'ModelProto' is not iterable

Any advice?

HeChengHui avatar Apr 06 '22 02:04 HeChengHui

Hey @HeChengHui, any progress with loading the ONNX model? @KaiyangZhou, would you give some advice about loading an ONNX model, to speed up the process?

Thanks!

Rm1n90 avatar Jul 05 '22 08:07 Rm1n90

Please have a look at https://github.com/KaiyangZhou/deep-person-reid/issues?q=onnx and see if you can find anything useful.

I'll try to find some time to write some tutorial code, since this question has been asked many times.

KaiyangZhou avatar Jul 05 '22 12:07 KaiyangZhou

@KaiyangZhou Thanks, looking forward to the tutorial!

Rm1n90 avatar Jul 16 '22 18:07 Rm1n90

Too busy, sorry, don't count on me (my bad).

Does this help: https://pytorch.org/docs/stable/onnx.html#example-alexnet-from-pytorch-to-onnx?

KaiyangZhou avatar Jul 28 '22 13:07 KaiyangZhou

@KaiyangZhou Thanks for the update, no worries :) I think I'm able to convert the weight file to ONNX without issue. I'm confused about where I need to load the converted weight file. Would you please point out the place where I need to do it?

Thanks a lot!

Rm1n90 avatar Jul 29 '22 08:07 Rm1n90

I'm confused about where I need to load the converted weight file. Would you please point out the place where I need to do it?

First build the model with model = torchreid.models.build_model(). Then load the pretrained weights with torchreid.utils.load_pretrained_weights(model, weight_path). Please refer to the documentation for more details: https://kaiyangzhou.github.io/deep-person-reid/user_guide#fine-tune-a-model-pre-trained-on-reid-datasets. (I also just checked the docs, as my memory is a bit rusty.)

KaiyangZhou avatar Jul 29 '22 08:07 KaiyangZhou

Good news @KaiyangZhou, @Rm1n90, @HeChengHui!

I have a working multibackend (ONNX, OpenVINO and TFLite) class for the ReID models that I managed to export (mobilenet, resnet50 and osnet models) with my export script. My export pipeline is as follows: PT --> ONNX --> OpenVINO --> TFLite. The osnet models fail in the OpenVINO export; the mobilenet and resnet50 models go all the way through. Feel free to experiment with it; it is in working condition, as shown by my CI pipeline. Don't forget to drop a PR if you have any improvements! :smile:

mikel-brostrom avatar Aug 06 '22 17:08 mikel-brostrom

@mikel-brostrom That's great! I will work on TensorRT export and will submit a PR! Just a question: did you time the models in ONNX, OpenVINO and TFLite to see how long tracking takes compared to the PyTorch version?

Rm1n90 avatar Aug 09 '22 08:08 Rm1n90

Did you time the models in ONNX, OpenVINO and TFLite to see how long tracking takes compared to the PyTorch version?

Inference time for the different frameworks is highly dependent on the hardware you run them on. The export framework should be chosen for the specific deployment platform:

  • ONNX is an all-around format for all types of CPUs
  • OpenVINO should be the way to go for deep learning inference on Intel CPUs, Intel integrated GPUs, and Intel Vision Processing Units (VPUs)
  • TFLite is for mobile and IoT devices

mikel-brostrom avatar Aug 09 '22 09:08 mikel-brostrom