
Can we have batch size > 1 for inference?

Open nymph332088 opened this issue 6 years ago • 4 comments

Hi, when I run tools/test_net.py, I realize there is no place to set up batch size during inference.

  1. Is the default batch size 1? I assume the images are processed one by one; please correct me if I am wrong.
  2. Can we easily change it to a value larger than one? I would like this feature because my Titan Xp GPU is not fully utilized: half of its memory sits idle.
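One reason batch size > 1 is not trivial for detection inference is that test images usually have different sizes, so they must be padded to a common, stride-aligned shape before they can be stacked into one tensor. A minimal sketch of that size computation (pure Python, not code from this repo; the stride value of 32 is a hypothetical FPN-style alignment):

```python
def pad_to_common_size(shapes, stride=32):
    """Given (h, w) shapes of the images in a batch, return the padded
    target (h, w): the per-batch maximum, rounded up to a stride multiple."""
    round_up = lambda x: ((x + stride - 1) // stride) * stride
    max_h = max(h for h, _ in shapes)
    max_w = max(w for _, w in shapes)
    return round_up(max_h), round_up(max_w)

# Two differently sized images end up in one 512x640 padded batch.
print(pad_to_common_size([(480, 640), (500, 333)]))  # → (512, 640)
```

Detectron-style codebases typically zero-pad each image to this target shape and keep the original sizes around so predictions can be rescaled afterward.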

Best regards

nymph332088 avatar Jun 21 '18 01:06 nymph332088

Theoretically, yes, but I don't plan to do it in the near future. It would require some effort.

roytseng-tw avatar Jun 26 '18 12:06 roytseng-tw

Hi @nymph332088, I have the same problem. The ideal way to run inference is with a mechanism similar to train_net_step.py, i.e. using a DataLoader, as for example in [this project](https://github.com/rwightman/pytorch-dpn-pretrained/blob/master/inference.py#L82). I am working on this now and may open a pull request in the coming month.
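Independent of any DataLoader machinery, the core of batched inference is just iterating over the image list in fixed-size chunks and running the model once per chunk. A hedged, framework-free sketch of that chunking (the `batches` helper and the names in the usage example are illustrative, not part of Detectron.pytorch):

```python
def batches(items, batch_size):
    """Yield successive chunks of `items` of length `batch_size`;
    the final chunk may be shorter."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

# Usage sketch: image_paths and run_model are hypothetical placeholders.
image_paths = ["a.jpg", "b.jpg", "c.jpg", "d.jpg", "e.jpg"]
for chunk in batches(image_paths, 2):
    # In real code: load/pad the images in `chunk`, then run_model(chunk)
    print(chunk)
```

A torch `DataLoader` adds collation, shuffling, and worker processes on top of exactly this pattern, which is why it is the natural mechanism to reuse from train_net_step.py.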

ZekunZh avatar Dec 05 '18 14:12 ZekunZh

Hi @ZekunZh, I've also run into the problem of needing to run inference on multiple images. Have you solved it?

13070151771 avatar Dec 25 '18 02:12 13070151771

@nymph332088 Hi, have you solved it? I've also run into the problem of needing to infer on multiple images.

13070151771 avatar Dec 25 '18 02:12 13070151771