Pytorch-Attention-Guided-CycleGAN
Expected input channels mismatch
I get this error whenever I try to train the network with `train.py`:
```
Traceback (most recent call last):
  File ".\train.py", line 237, in <module>
    all()
  File ".\train.py", line 141, in all
    attnMapA = toZeroThreshold(AttnA(realA))
  File "C:\Users\User1\AppData\Local\Programs\Python\Python36\lib\site-packages\torch\nn\modules\module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "C:\Users\User1\Documents\Pytorch-Attention-Guided-CycleGAN\models.py", line 141, in forward
    return self.model(x)
  File "C:\Users\User1\AppData\Local\Programs\Python\Python36\lib\site-packages\torch\nn\modules\module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "C:\Users\User1\AppData\Local\Programs\Python\Python36\lib\site-packages\torch\nn\modules\container.py", line 92, in forward
    input = module(input)
  File "C:\Users\User1\AppData\Local\Programs\Python\Python36\lib\site-packages\torch\nn\modules\module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "C:\Users\User1\AppData\Local\Programs\Python\Python36\lib\site-packages\torch\nn\modules\conv.py", line 320, in forward
    self.padding, self.dilation, self.groups)
RuntimeError: Given groups=1, weight of size [32, 3, 7, 7], expected input[1, 4, 256, 256] to have 3 channels, but got 4 channels instead
```
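For what it's worth, this error usually means the dataloader is producing 4-channel (RGBA) tensors — e.g. from PNGs with an alpha channel — while the first conv layer (weight `[32, 3, 7, 7]`) only accepts 3 channels. A minimal sketch of a workaround, assuming the dataset loads images with PIL (`load_rgb` is a hypothetical helper, not part of this repo):

```python
from PIL import Image

def load_rgb(path):
    """Load an image and drop any alpha channel.

    RGBA PNGs become a [1, 4, 256, 256] batch after ToTensor(), but the
    first conv expects 3 input channels; convert("RGB") discards the
    alpha plane so the tensor is [1, 3, 256, 256] instead.
    """
    return Image.open(path).convert("RGB")
```

Calling `.convert("RGB")` inside the dataset's `__getitem__` (before any `transforms.ToTensor()`) should make the input shape match the model, if alpha channels are indeed the cause.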
It's also important to mention that I had to slightly modify the code for Windows multiprocessing compatibility. I wrapped the whole script in an `all()` function that gets run like this, as stated in the PyTorch Windows FAQ:

```
if __name__ == '__main__':
    all()
```
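In case it matters for reproducing this: the guard is needed because Windows starts worker processes with spawn, which re-imports the main module. A minimal, self-contained sketch of the pattern (using stdlib `multiprocessing` rather than the repo's DataLoader, just to illustrate):

```python
import multiprocessing as mp

def square(x):
    return x * x

def all():
    # Any code that launches worker processes must live behind the
    # __main__ guard: with spawn, each child re-imports this module,
    # and unguarded top-level code would start workers recursively.
    with mp.Pool(2) as pool:
        return pool.map(square, range(4))

if __name__ == '__main__':
    print(all())
```

As far as I can tell this change shouldn't affect the channel count, since it only moves the existing code into a function.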