Detectron.pytorch
Can we have batch size > 1 for inference?
Hi, when I run tools/test_net.py, I notice there is no place to set the batch size during inference.
- Is the default batch size 1? I assume the images are processed one by one; please correct me if I am wrong.
- Can we easily change the value to something larger than one? I would like to have this feature, because my Titan Xp GPU is not fully utilized, with half of its memory going to waste.
Best regards
Theoretically, yes, but I don't plan to do that in the near future. It requires some effort.
Hi @nymph332088, I have the same problem as you. The ideal approach for inference is to use a mechanism similar to train_net_step.py, with a Dataloader, for example as in [this project](https://github.com/rwightman/pytorch-dpn-pretrained/blob/master/inference.py#L82). I am working on this now and may submit a pull request in the coming month.
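A minimal sketch of that DataLoader-based idea, assuming images are already loaded as tensors. `ImageListDataset` and `pad_and_stack` are hypothetical names, not part of Detectron.pytorch: since detection images vary in size, the collate function zero-pads each batch to a common shape so one batched forward pass could replace the per-image loop.

```python
import torch
import torch.nn.functional as F
from torch.utils.data import Dataset, DataLoader

class ImageListDataset(Dataset):
    """Hypothetical dataset wrapping a list of preloaded image tensors."""
    def __init__(self, images):
        self.images = images

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        return self.images[idx]

def pad_and_stack(batch):
    # Detection inputs vary in size; zero-pad each image on the right
    # and bottom to the largest H and W in the batch so that they can
    # be stacked into a single (N, C, H, W) tensor.
    max_h = max(img.shape[1] for img in batch)
    max_w = max(img.shape[2] for img in batch)
    padded = [F.pad(img, (0, max_w - img.shape[2], 0, max_h - img.shape[1]))
              for img in batch]
    return torch.stack(padded)

# Five images of differing widths, as in a typical test set.
images = [torch.rand(3, 224, 224 + 16 * i) for i in range(5)]
loader = DataLoader(ImageListDataset(images), batch_size=2,
                    collate_fn=pad_and_stack)

with torch.no_grad():  # inference only, no gradients needed
    shapes = [tuple(batch.shape) for batch in loader]

print(shapes)
# [(2, 3, 224, 240), (2, 3, 224, 272), (1, 3, 224, 288)]
```

A real implementation would also have to adjust any per-image post-processing (e.g. box rescaling) to account for the padding, which is part of the effort mentioned above.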
Hi @ZekunZh, I also ran into the problem of needing multi-image inference. Have you solved it?
@nymph332088 Hi, have you solved it? I also ran into the problem of inferring on multiple images.