Ali Soultan
I defined `device = torch.device('cuda')`, the same as in the inference.json file, and notably, when I change it to `'cpu'` it works.
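A common way to rule out a plain device-availability problem is to fall back to CPU automatically instead of hard-coding `'cuda'`. This is a minimal torch-only sketch (the `Identity` model and tensor shape are placeholders, not your actual pipeline):

```python
import torch

# Pick CUDA only when it is actually available, so the same inference
# code runs unchanged on CPU-only machines.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Identity().to(device)          # placeholder for your network
x = torch.zeros(1, 1, 64, 64, device=device)    # placeholder input tensor
y = model(x)                                    # runs on whichever device was chosen
```

If inference succeeds on CPU but raises a RuntimeError only on CUDA, the error is usually either a tensor left on the wrong device (e.g. model on GPU, input on CPU) or a GPU out-of-memory condition, rather than a problem with the padding itself.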
The full error message, after using divisible padding with k = 64, raised after epoch 2:

```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
File :44
File /opt/conda/lib/python3.10/site-packages/monai/inferers/utils.py:229, in sliding_window_inference(inputs,...
```
Is there a way to validate the model output while using patch inference, since it usually boosts overall performance? Previously I was able to do so with a tool like sahi...
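In principle, validating patch-based output just means stitching the per-patch predictions back into a full-size prediction and scoring that against the label. The sketch below is a torch-only stand-in (the `patch_inference` and `dice` helpers are hypothetical names, and non-overlapping patches are assumed; MONAI's `sliding_window_inference` does the stitching with overlap and blending for you):

```python
import torch

def patch_inference(model, image, roi=64):
    """Run `model` on non-overlapping roi x roi patches and stitch the outputs
    back into a full-size prediction tensor."""
    _, _, H, W = image.shape          # assumes H and W are divisible by roi
    out = torch.zeros_like(image)
    for y in range(0, H, roi):
        for x in range(0, W, roi):
            patch = image[:, :, y:y + roi, x:x + roi]
            out[:, :, y:y + roi, x:x + roi] = model(patch)
    return out

def dice(pred, target, eps=1e-6):
    """Dice score between binarized prediction and target."""
    p = (pred > 0.5).float()
    t = (target > 0.5).float()
    inter = (p * t).sum()
    return (2 * inter + eps) / (p.sum() + t.sum() + eps)

# Toy check: an identity "model" should score Dice == 1 against its own input.
model = torch.nn.Identity()
img = (torch.rand(1, 1, 128, 128) > 0.5).float()
pred = patch_inference(model, img)
score = dice(pred, img)
```

With this pattern the metric is computed on the reassembled full image, so the validation score reflects what patch inference actually produces, which is what tools like sahi automate for detection.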