nnUNet
Online softmax deprecation and autograd
I have Python 3.12.11.
When running nnUNet, it looks like there are some deprecated calls:
/scratch/gpfs/nc1514/autotslabel/.venv/lib/python3.12/site-packages/fft_conv_pytorch/fft_conv.py:139: UserWarning: Using a non-tuple sequence for multidimensional indexing is deprecated and will be changed in pytorch 2.9; use x[tuple(seq)] instead of x[seq]. In pytorch 2.9 this will be interpreted as tensor index, x[torch.tensor(seq)], which will result either in an error or a different result (Triggered internally at /pytorch/torch/csrc/autograd/python_variable_indexing.cpp:345.)
output = output[crop_slices].contiguous()
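For context, the first warning comes from indexing a tensor with a plain Python list of slices; the fix the message itself suggests is to wrap that list in a tuple before indexing. A minimal sketch of the pattern (the tensor shape and slice values here are made up for illustration):

```python
import torch

x = torch.randn(2, 8, 16, 16)
# Indexing with a list of slices is what triggers the deprecation warning.
crop_slices = [slice(None), slice(None), slice(0, 12), slice(2, 14)]

# Deprecated (and will change meaning in PyTorch 2.9):
# output = x[crop_slices]

# Suggested fix: index with a tuple of slices instead.
output = x[tuple(crop_slices)].contiguous()
print(output.shape)  # torch.Size([2, 8, 12, 12])
```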
/scratch/gpfs/nc1514/autotslabel/.venv/lib/python3.12/site-packages/torch/_inductor/lowering.py:7242: UserWarning:
Online softmax is disabled on the fly since Inductor decides to
split the reduction. Cut an issue to PyTorch if this is an
important use case and you want to speed it up with online
softmax.
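As a stopgap until the upstream code changes, the indexing warning can be filtered so it does not flood the logs; this is only a sketch that hides the message, not a fix:

```python
import warnings

# Hide only this specific deprecation message; runtime behaviour is unchanged.
warnings.filterwarnings(
    "ignore",
    message="Using a non-tuple sequence for multidimensional indexing",
    category=UserWarning,
)
```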
Hi @nathanchenseanwalter,
thanks for raising these issues. I will make sure that this is taken care of. If you want, you can also fix this yourself and open a pull request to get some credit :)
Best, Yannick
Here's my fix :)
https://github.com/MIC-DKFZ/batchgeneratorsv2/pull/14