EDSR-PyTorch
When I train RCAN, something goes wrong: RuntimeError: Expected 4-dimensional input for 4-dimensional weight 3 3 1, but got 3-dimensional input of size [1, 184, 270] instead
I run:

    python main.py --template RCAN --save RCAN_BIX2_G10R20P48 --scale 2 --reset --save_results --patch_size 96

and then:

    Traceback (most recent call last):
      File "main.py", line 33, in
Same error on my machine
+1
How can I solve it? Please help.
Same error on my machine
You could refer to #184.
In my opinion, the error mentioned above is caused by a mismatch in the number of dimensions: the layer expects a 4-dimensional input, but a 3-dimensional one was given. You should step through the lines listed in the error message and monitor how the variables' dimensions change during training.
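For reference, here is a minimal sketch of that error class. This is not RCAN or the repo's code, just an assumed toy Conv2d: its weights are 4-D, so it wants input shaped [batch, channels, height, width], and a 3-D tensor like [1, 184, 270] is missing the batch dimension.

    import torch
    import torch.nn as nn

    # Toy layer standing in for a conv inside the network; any Conv2d
    # behaves the same way with respect to input rank.
    conv = nn.Conv2d(1, 64, kernel_size=3, padding=1)

    x = torch.randn(1, 184, 270)   # 3-D: no batch dimension
    # On PyTorch 1.1 (the version used in this thread), conv(x) raises
    # "Expected 4-dimensional input for 4-dimensional weight ...".

    x4 = x.unsqueeze(0)            # 4-D: [1, 1, 184, 270]
    print(conv(x4).shape)          # torch.Size([1, 64, 184, 270])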
I was successful with the fix from #184 referenced above. In particular, model/__init__.py line 133 onwards became:
    else:
        for p in zip(*x_chops):
            p = [p_.unsqueeze(0) for p_ in p]
            y = self.forward_chop(*p, shave=shave, min_size=min_size)
I really don't have a good explanation but it seems to work.
I guess the gist of it is that x_chops contains a tensor for each input (args is List[Tensor(B x C x H x W)]) to forward_chop. That tensor is cut up into quarters and concatenated along the batch dimension, so now x_chops is something like List[Tensor(B*4 x C x H/4 x W/4)]. Then the "clever" line
    for p in zip(*x_chops):

is equivalent to something like

    for i in range(B*4):
        p = [x_ch[i, ...] for x_ch in x_chops]
which, as you can see when it's not so "clever", is going to drop the first dimension on each element in x_chops. Which is a problem because p is the recursive input to forward_chop. :(
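To make that concrete, here is a standalone sketch with made-up shapes (B=2, C=3, an 8x8 input; none of this is the actual forward_chop code) showing how iterating over zip(*x_chops) drops the batch dimension and how the unsqueeze(0) from #184 puts it back:

    import torch

    B, C, H, W = 2, 3, 8, 8
    # Pretend the four patches have already been concatenated along the
    # batch dimension, as described above: List[Tensor(B*4 x C x H/4 x W/4)].
    x_chops = [torch.randn(B * 4, C, H // 4, W // 4)]

    for p in zip(*x_chops):
        # Iterating a tensor yields slices along dim 0, so each element
        # of p is 3-D: the batch dimension is gone.
        print([p_.shape for p_ in p])    # [torch.Size([3, 2, 2])]

        # The #184 fix: restore the batch dimension before the recursive call.
        p = [p_.unsqueeze(0) for p_ in p]
        print([p_.shape for p_ in p])    # [torch.Size([1, 3, 2, 2])]
        break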
I solved this problem by omitting "--chop" from the options (Python 3.6, PyTorch 1.1). The --chop flag enables the memory-efficient forward_chop path where this error occurs, so leaving it out avoids the bug at the cost of higher memory use at test time.