PUGAN-pytorch
cannot use EMD_loss
I tried to use `get_emd_loss` for training, but there seems to be a problem with backpropagation.
loss.backward()
  File "/home/miniconda3/envs/torch_3d/lib/python3.6/site-packages/torch/tensor.py", line 195, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/home/miniconda3/envs/torch_3d/lib/python3.6/site-packages/torch/autograd/__init__.py", line 99, in backward
    allow_unreachable=True)  # allow_unreachable flag
RuntimeError: Expected isFloatingType(grads[i].type().scalarType()) to be true, but got false. (Could this error message be improved? If so, please report an enhancement request to PyTorch.)
Can you specify what your training dataset is? Also, try converting all of your input tensors to float type with a simple `.float()`. The information you provided is limited, so I cannot reproduce this currently.
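A minimal sketch of the suggested fix, assuming the inputs were loaded in a non-float32 dtype (e.g. float64 from NumPy). `toy_loss` is a hypothetical stand-in for `get_emd_loss`, used only to show that the backward pass succeeds once everything in the graph is float32:

```python
import torch

# Hypothetical stand-in for get_emd_loss: any differentiable point-wise distance.
def toy_loss(pred, gt):
    return ((pred - gt) ** 2).mean()

# Point clouds loaded from disk often arrive as float64 (or int); custom
# CUDA ops compiled for float32 can then produce dtype errors on backward.
pred = torch.rand(1, 1024, 3, dtype=torch.float64)
gt = torch.rand(1, 1024, 3, dtype=torch.float64)

# Cast everything to float32 before building the graph, as suggested above.
pred = pred.float().requires_grad_(True)
gt = gt.float()

loss = toy_loss(pred, gt)
loss.backward()  # succeeds: all tensors and gradients are float32
print(pred.grad.dtype)
```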