vmiller987
> Hey,
>
> This is definitely a tricky one—we haven’t been able to reproduce it on our end. Based on the text file you shared, it seems the issue...
> I'm not sure if this is completely relevant, but I managed to avoid a similar dead worker error by taking lines 38-40 in nnunetv2/training/dataloading/utils.py
>
> `np.load(data_npy, mmap_mode='r')`
> ...
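For anyone following along, the difference the quoted workaround makes can be sketched like this (the file path and array here are made up for illustration; the real files live under `nnUNet_preprocessed/`):

```python
import os
import tempfile
import numpy as np

# Stand-in for a preprocessed case file.
path = os.path.join(tempfile.mkdtemp(), "case.npy")
np.save(path, np.zeros((4, 8, 8), dtype=np.float32))

# Default behaviour: the whole array is read into RAM at once.
eager = np.load(path)

# Workaround: mmap_mode='r' maps the file read-only instead of
# copying it, so each dataloader worker keeps far less in memory.
lazy = np.load(path, mmap_mode='r')

print(type(eager))  # <class 'numpy.ndarray'>
print(type(lazy))   # <class 'numpy.memmap'>
```

Both objects index the same data; only the loading strategy differs.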
@mrokuss I did not provide you with a log for the command you wanted me to add. It does solve the wall of warnings, but does not resolve the background...
@FabianIsensee Thank you for the input! We're on opposite sides of the world, so I'm going to give a short answer now and work on a longer, more detailed answer tomorrow....
I am happy to report that my issues have been resolved, though the final solution makes me feel like a complete novice. 1) Preprocessing has been resolved (#2792). I...
I updated the title to include Python 3.11. Are you able to replicate the issue using this version of Python? I was able to replicate this bug with more than...
Unfortunately, it persists. I must add `export nnUNet_n_proc_DA=0` in order for it to train. As this issue persists with the Hippocampus dataset, does @mrokuss still want information on our datasets?...
`export nnUNet_n_proc_DA=0` Try setting this to zero. This solves it for me sometimes.
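For context, `nnUNet_n_proc_DA` controls how many background data-augmentation workers nnU-Net spawns. A rough sketch of why `0` helps (simplified illustration, not nnU-Net's actual code; function names are made up):

```python
import os

def get_num_da_workers(default: int = 12) -> int:
    """Read nnUNet_n_proc_DA from the environment; fall back to a default."""
    value = os.environ.get("nnUNet_n_proc_DA")
    return default if value is None else int(value)

def make_augmenter(num_workers: int) -> str:
    # With 0 workers, augmentation runs in the main process, which
    # sidesteps crashes and hangs in spawned worker processes entirely.
    if num_workers == 0:
        return "single-threaded augmenter (main process)"
    return f"multi-threaded augmenter with {num_workers} workers"

os.environ["nnUNet_n_proc_DA"] = "0"
print(make_augmenter(get_num_da_workers()))
# single-threaded augmenter (main process)
```

The trade-off is slower epochs, since augmentation no longer overlaps with GPU compute.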
> `export nnUNet_n_proc_DA=0`
>
> Try setting this to zero. This solves it for me sometimes.

This is only an issue on Python 3.11 for me. Switching to Python 3.12...
Could you be more specific about what the issue is with nnUNet's code for FIPS compliance? It runs on FIPS-enabled machines. #2749 was closed because they removed the dependency...