KiU-Net-pytorch

bug: The size of tensor a (3) must match the size of tensor b (6) at non-singleton dimension 2

Open jiangjiaxi20 opened this issue 3 years ago • 12 comments

jiangjiaxi20 avatar Oct 26 '20 08:10 jiangjiaxi20

Can you please add more description for this error ? For which dataset and which model are you getting this bug ?

jeya-maria-jose avatar Oct 26 '20 15:10 jeya-maria-jose

I used the LiTS data to train the model and got the .pth checkpoint, then I ran val.py (I did not change any parameters in parameter.py except the paths). I then got this error at line 602 of models.py: "The size of tensor a (3) must match the size of tensor b (6) at non-singleton dimension 2". When I change "size" in parameter.py to 8 there is no error, but the results (pred.nii) look bad; other values of "size" lead to similar errors. I also had to change "size" in parameter.py to 24 when running train.py, because I got similar errors there too.

jiangjiaxi20 avatar Oct 27 '20 01:10 jiangjiaxi20
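The pattern reported above — the error disappearing only for particular values of "size" — is consistent with fractional `scale_factor` interpolation: `F.interpolate` floors the computed output size, so the cross-branch additions only line up when every spatial dimension divides evenly. A minimal workaround sketch (not from the repo; the tensor names and shapes here are hypothetical) is to pass the target size explicitly instead of a scale factor:

```python
import torch
import torch.nn.functional as F

# Hypothetical feature maps standing in for the two KiU-Net branches.
out = torch.randn(1, 1, 3, 16, 16)      # branch the interpolated result is added to
out1 = torch.randn(1, 1, 12, 256, 256)  # branch being resized

# scale_factor=(0.5, 0.0625, 0.0625) would give depth 6 here and fail the add;
# size=out.shape[2:] guarantees a shape match regardless of the input size.
resized = F.interpolate(out1, size=out.shape[2:], mode='trilinear',
                        align_corners=False)
merged = torch.add(out, resized)
print(merged.shape)  # torch.Size([1, 1, 3, 16, 16])
```

The same idea applies to the 2D `bilinear` calls; alternatively, padding or cropping the input volume to a multiple of the network's total stride avoids the rounding entirely.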

Yes, I have the same problem: the output size (1024) does not match the label size (512).

xuzhongyou avatar Jan 15 '21 12:01 xuzhongyou

Trying to run train.py in the LiTS folder after successfully running get_training_set.py in data_prepare, but got the following error when using kiunet_org as the net in model.py:

```
Traceback (most recent call last):
  File "train.py", line 71, in
    outputs = net(ct)
  File "/home/viplab/anaconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/viplab/anaconda3/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 159, in forward
    return self.module(*inputs[0], **kwargs[0])
  File "/home/viplab/anaconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/viplab/nas/KiU-Net-pytorch/LiTS/net/models.py", line 591, in forward
    out = torch.add(out,F.interpolate(F.relu(self.inte3_1bn(self.intere3_1(out1))),scale_factor=(0.5,0.0625,0.0625),mode ='trilinear'))
RuntimeError: The size of tensor a (3) must match the size of tensor b (6) at non-singleton dimension 2
```

atch841 avatar Mar 02 '21 08:03 atch841
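The trilinear mismatch above can be reproduced in isolation. A hedged sketch (the feature-map shapes here are hypothetical, chosen only to trigger the reported message; the real shapes come from kiunet_org's encoder/decoder stages): when one branch has depth 12, `scale_factor=0.5` yields depth 6, while the tensor it is added to has depth 3:

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-ins for the two branches at the failing torch.add.
u_branch = torch.randn(1, 1, 3, 2, 2)      # accumulator ("tensor a")
ki_branch = torch.randn(1, 1, 12, 32, 32)  # branch being downsampled

resized = F.interpolate(ki_branch, scale_factor=(0.5, 0.0625, 0.0625),
                        mode='trilinear', align_corners=False)
print(resized.shape)  # torch.Size([1, 1, 6, 2, 2])

try:
    torch.add(u_branch, resized)
except RuntimeError as e:
    print(e)
    # The size of tensor a (3) must match the size of tensor b (6)
    # at non-singleton dimension 2
```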

I have the same problem with running (almost) any of the models on my own data:

```
out1 = torch.add(out1,F.interpolate(F.relu(self.inte1_2bn(self.intere1_2(tmp))),scale_factor=(4,4),mode ='bilinear'))
RuntimeError: The size of tensor a (386) must match the size of tensor b (384) at non-singleton dimension 3
```

boneseva avatar Mar 10 '21 20:03 boneseva
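The 2D bilinear failure can be reproduced the same way. A sketch with a hypothetical 386×386 input and a plain max-pool standing in for the model's downsampling path: pooling floors 386/4 to 96, interpolating back with `scale_factor=4` gives 384, and the add fails:

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 1, 386, 386)        # hypothetical input, H/W not divisible by 4
down = F.max_pool2d(x, kernel_size=4)  # floor(386 / 4) = 96
up = F.interpolate(down, scale_factor=4, mode='bilinear', align_corners=False)
print(up.shape)  # torch.Size([1, 1, 384, 384])

try:
    torch.add(x, up)  # 386 vs 384 in the trailing dimensions
except RuntimeError as e:
    print(e)
    # The size of tensor a (386) must match the size of tensor b (384)
    # at non-singleton dimension 3
```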

I have the same problem. How can it be solved?

zx-123456 avatar Apr 07 '21 06:04 zx-123456

I have the same problem with kiunet_org.

momo1986 avatar Jun 15 '21 05:06 momo1986

I have the same issue. Has anybody solved it?

AasiaRehman avatar Dec 27 '21 13:12 AasiaRehman

The same problem arose when I used kiunet_org for training.

SorryMaker511 avatar Aug 08 '23 14:08 SorryMaker511

Did you guys solve the problem?

zhouyizhuo avatar Dec 01 '23 01:12 zhouyizhuo

> Trying to run train.py in LiTS folder after successfully run get_training_set.py in data_prepare, but got the following error when using kiunet_org as net in model.py: [...] RuntimeError: The size of tensor a (3) must match the size of tensor b (6) at non-singleton dimension 2

@atch841 @momo1986 Did you solve the problem?

zhouyizhuo avatar Dec 03 '23 13:12 zhouyizhuo

@SorryMaker511

zhouyizhuo avatar Dec 03 '23 13:12 zhouyizhuo