
How to get a checkpoint / deploy / serve a model for DeepSORT/Tracktor

Open · castrovictor opened this issue 2 years ago · 2 comments

Hello, I have a question: how can I use DeepSORT other than by executing tools/test.py? For example, mmdetection provides a checkpoint that can be converted to TensorRT. I am asking because it seems a checkpoint is needed here, but I do not know what checkpoint DeepSORT produces.

castrovictor · Apr 19 '22 06:04

The checkpoints of DeepSORT are composed of a detector checkpoint and a ReID checkpoint. We use them by wrapping each one in an init_cfg in the DeepSORT config. Please refer to this: https://github.com/open-mmlab/mmtracking/blob/master/configs/mot/deepsort/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py#L14
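To illustrate, a minimal sketch of that wrapping pattern, following the style of the linked config (the checkpoint paths below are placeholders, not the actual URLs used in the repo):

```python
# Sketch of how a DeepSORT config combines two pretrained checkpoints.
# Each sub-module loads its own weights via an `init_cfg` of type
# 'Pretrained'; there is no single "DeepSORT checkpoint" file.
model = dict(
    type='DeepSORT',
    detector=dict(
        # Detector weights (e.g. a Faster R-CNN checkpoint from mmdetection)
        init_cfg=dict(
            type='Pretrained',
            checkpoint='path/to/faster_rcnn_checkpoint.pth')),  # placeholder
    reid=dict(
        # ReID network weights, trained separately
        init_cfg=dict(
            type='Pretrained',
            checkpoint='path/to/reid_checkpoint.pth')))  # placeholder
```

Because the tracker itself (the Kalman filter and matching logic) has no learned weights, swapping in different detector or ReID checkpoints is just a matter of pointing these two `init_cfg` entries at other files.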

JingweiZhang12 · Apr 20 '22 02:04

> The checkpoints of DeepSORT are composed of a detector checkpoint and a ReID checkpoint. We use them by wrapping each one in an init_cfg in the DeepSORT config. Please refer to this: https://github.com/open-mmlab/mmtracking/blob/master/configs/mot/deepsort/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py#L14

Thanks for your response. I saw that, and I was able to make it work with different checkpoints. My question is about using/deploying the model. Say I have a video and I want to run DeepSORT on it: how do I do that? I saw model serving in the docs, but the script asks for a "checkpoint"; that's why I am confused.

Edit: I found out how to feed it a video; for some reason I did not see it yesterday. But I am still interested in knowing whether it is possible to serve DeepSORT with TorchServe as described in the link above.

castrovictor · Apr 20 '22 06:04
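For reference, the way to run a tracker on a video (the part the author found) is mmtracking's MOT demo script. A sketch of the invocation, assuming the repo's demo/demo_mot_vis.py script and using a placeholder input video name:

```shell
# Run DeepSORT on a video using mmtracking's demo script.
# The config already points at the detector and ReID checkpoints via
# init_cfg, so no separate --checkpoint argument is needed here.
python demo/demo_mot_vis.py \
    configs/mot/deepsort/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py \
    --input my_video.mp4 \
    --output results.mp4
```

This covers local inference; serving through TorchServe (the remaining open question in this thread) would additionally require packaging the model, which mmtracking's docs describe separately.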