
Inference in Batch

glucasol opened this issue 8 months ago · 1 comment

Hi everyone, I have trained a Padim model and exported it to Torch and OpenVINO. I would like to know if it is possible to run inference on a batch of images (16, 32, and so on). If so, how could I do it?

glucasol · Oct 30 '23 13:10

Hello. It seems like the export has dynamic axes enabled for the batch dimension: https://github.com/openvinotoolkit/anomalib/blob/d7e7d86411d106369e9cca53d8c148cba659493a/src/anomalib/deploy/export.py#L157-L165 So I think you could pass a batched input, but you would need to modify the code, since the current inferencers process images file by file.
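For reference, here is a minimal sketch of feeding a batched input directly to the exported OpenVINO model via the OpenVINO runtime, bypassing the Anomalib inferencer. The model path, input resolution (256×256), and preprocessing are assumptions; adjust them to match your export and training config.

```python
import numpy as np
from openvino.runtime import Core

# Load the exported model (path is hypothetical -- point it at your exported .xml/.bin pair).
core = Core()
model = core.read_model("results/padim/openvino/model.xml")
compiled = core.compile_model(model, device_name="CPU")

# Build a batch of 16 preprocessed images, NCHW float32.
# The 256x256 resolution and any normalization are assumptions; match your training transforms.
batch = np.random.rand(16, 3, 256, 256).astype(np.float32)

# Single inference call over the whole batch (works only if the batch axis is dynamic).
results = compiled([batch])
anomaly_maps = results[compiled.output(0)]
print(anomaly_maps.shape)  # expected leading dimension: 16
```

This only covers the raw forward pass; to get results comparable to the file-by-file inferencer you would still need to replicate its pre-processing and metadata-based post-processing (normalization and thresholding of the anomaly maps).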

blaz-r · Nov 17 '23 08:11