mmpose
[Docs] Batched inference in existing models?
📚 The doc issue
I cannot find in the documentation whether it is possible to run inference on batches of images (or an np.ndarray of rank 4) using the MMPoseInferencer. Batched inference would be beneficial for users because of the obvious performance gains, but it is not clear whether and how this is possible.
Suggest a potential alternative/fix
A clear tutorial on how to do batched inference.
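As a hedged sketch of what such a tutorial could show: chunk the input list yourself and stack each chunk into a rank-4 array. The `iter_batches` helper and the dummy frames below are illustrative, not mmpose API; the `batch_size` keyword in the commented usage comes from mmengine's `BaseInferencer.__call__`, and it is an assumption that MMPoseInferencer inherits it unchanged, so please verify against your installed version.

```python
import numpy as np

def iter_batches(images, batch_size):
    """Yield successive fixed-size chunks from a sequence of images."""
    for i in range(0, len(images), batch_size):
        yield images[i:i + batch_size]

# Stacking a chunk of HxWxC images into a rank-4 (N, H, W, C) ndarray:
frames = [np.zeros((64, 48, 3), dtype=np.uint8) for _ in range(10)]
for batch in iter_batches(frames, 4):
    batch_array = np.stack(batch)  # shape (<=4, 64, 48, 3)

# Hypothetical MMPoseInferencer usage (assumes mmpose is installed; the
# `batch_size` keyword is inherited from mmengine's BaseInferencer and
# may differ between versions -- check before relying on it):
# from mmpose.apis import MMPoseInferencer
# inferencer = MMPoseInferencer('human')
# results = [r for r in inferencer(image_paths, batch_size=4)]
```

A tutorial built around this pattern could then compare per-image and batched throughput to make the performance gain concrete.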