
Allow batch sizes > 1 for classification model inference.

Open Shruthi42 opened this issue 3 years ago • 1 comment

At the moment, we set the batch size to 1 in the dataloaders when running inference for a classification model.

https://github.com/microsoft/InnerEye-DeepLearning/blob/daefdba6083775de7ca258d18ae315e57bcb54bd/InnerEye/ML/model_testing.py#L428
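For reference, here is a minimal sketch of what batched inference could look like with a plain PyTorch `DataLoader`. The dataset, model, and batch size below are placeholders for illustration only, not the actual InnerEye classes touched in `model_testing.py`:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in data: 100 samples with 10 features each.
features = torch.randn(100, 10)
dataset = TensorDataset(features)

# A batch size larger than 1 amortizes per-batch overhead and makes
# better use of the GPU during inference.
loader = DataLoader(dataset, batch_size=16, shuffle=False)

# Placeholder classification model.
model = torch.nn.Sequential(torch.nn.Linear(10, 1), torch.nn.Sigmoid())
model.eval()

all_outputs = []
with torch.no_grad():
    for (batch,) in loader:
        # Per-sample outputs are unchanged; batching only changes how
        # many samples are pushed through the model at once.
        all_outputs.append(model(batch))

# Concatenate back to one prediction per sample, in dataset order.
predictions = torch.cat(all_outputs, dim=0)
print(predictions.shape)  # torch.Size([100, 1])
```

Because `shuffle=False` keeps the dataset order, the concatenated predictions can still be matched one-to-one with the input subjects, so increasing the batch size should not change the reported results, only the runtime.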

AB#3998

Shruthi42 · May 04 '21 15:05


Hi, can I work on this?

hxri · Aug 25 '21 05:08