InnerEye-DeepLearning
Allow batch sizes > 1 for classification model inference.
At the moment, we set batch size = 1 in the dataloaders when running inference for a classification model, which slows down inference unnecessarily.
https://github.com/microsoft/InnerEye-DeepLearning/blob/daefdba6083775de7ca258d18ae315e57bcb54bd/InnerEye/ML/model_testing.py#L428
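A minimal sketch of what configurable-batch-size inference could look like, using a toy model and dataset in place of the InnerEye classification model; the `run_inference` helper and its parameters are hypothetical, not part of the InnerEye codebase:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-ins for the classification model and test dataset.
model = torch.nn.Linear(4, 2)
model.eval()
dataset = TensorDataset(torch.randn(10, 4))

def run_inference(model, dataset, batch_size=4):
    """Run inference with a configurable batch size instead of a
    hard-coded batch_size=1."""
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=False)
    outputs = []
    with torch.no_grad():
        for (batch,) in loader:
            logits = model(batch)  # shape: (batch_size, num_classes)
            outputs.append(logits)
    return torch.cat(outputs)      # shape: (len(dataset), num_classes)

preds = run_inference(model, dataset, batch_size=4)
print(preds.shape)  # torch.Size([10, 2])
```

Since `shuffle=False` and the outputs are concatenated in order, per-sample predictions line up with the dataset regardless of the batch size chosen.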
Hi! Can I work on this?