
Online softmax deprecation and autograd

Open · nathanchenseanwalter opened this issue 2 months ago · 2 comments

I am using Python 3.12.11.

When running nnUNet, it looks like there are some deprecated calls:

/scratch/gpfs/nc1514/autotslabel/.venv/lib/python3.12/site-packages/fft_conv_pytorch/fft_conv.py:139: UserWarning: Using a non-tuple sequence for multidimensional indexing is deprecated and will be changed in pytorch 2.9; use x[tuple(seq)] instead of x[seq]. In pytorch 2.9 this will be interpreted as tensor index, x[torch.tensor(seq)], which will result either in an error or a different result (Triggered internally at /pytorch/torch/csrc/autograd/python_variable_indexing.cpp:345.)
  output = output[crop_slices].contiguous()
/scratch/gpfs/nc1514/autotslabel/.venv/lib/python3.12/site-packages/torch/_inductor/lowering.py:7242: UserWarning: 
Online softmax is disabled on the fly since Inductor decides to
split the reduction. Cut an issue to PyTorch if this is an
important use case and you want to speed it up with online
softmax.
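
For reference, the first warning is about indexing a tensor with a list of slices. Below is a minimal sketch of the deprecated pattern and the tuple-based replacement that the warning itself suggests; the tensor shape and slice values are made up for illustration and are not the actual fft_conv_pytorch code:

```python
import torch

# Hypothetical tensor and crop slices; the real values live in
# fft_conv_pytorch/fft_conv.py, this just illustrates the pattern.
x = torch.randn(2, 3, 8, 8)
crop_slices = [slice(None), slice(None), slice(1, 7), slice(1, 7)]

# Deprecated: indexing with a list of slices. Warns today and will be
# reinterpreted as x[torch.tensor(seq)] in PyTorch 2.9.
# out = x[crop_slices].contiguous()

# Recommended by the warning: convert the sequence to a tuple first.
out = x[tuple(crop_slices)].contiguous()
print(out.shape)  # torch.Size([2, 3, 6, 6])
```

Wrapping the slice sequence in tuple() keeps the current behaviour and avoids the reinterpretation planned for PyTorch 2.9.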

nathanchenseanwalter · Oct 23 '25

Hi @nathanchenseanwalter,

thanks for raising these issues. I will make sure this is taken care of. If you want, you can also fix it yourself and open a pull request to get some credit :)

Best, Yannick

ykirchhoff · Oct 29 '25

Here's my fix :)

https://github.com/MIC-DKFZ/batchgeneratorsv2/pull/14

vmiller987 · Nov 13 '25