pytorch_convNd
Why is transposed convolution faster?
This is the result for a 2D convolution:
ConvNd time: 6209.3720703125
ConvGT time: 13.088768005371094
convND error: 3.196690840923111e-07 %
ConvTransposeNd time: 32.824127197265625
ConvTransposeGT time: 21.486591339111328
convTransposeND error: 4.2337585881568884e-08 %
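For context, here is a minimal sketch of how single-call wall-clock timings and a max-relative-error percentage like the numbers above could be produced. The naive pure-Python convolution and the exact error formula are my own stand-ins for illustration, not the library's actual code:

```python
import time

def naive_conv2d(image, kernel):
    """Naive 'valid' 2D convolution over nested lists (illustration only)."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            for a in range(kh):
                for b in range(kw):
                    out[i][j] += image[i + a][j + b] * kernel[a][b]
    return out

def time_ms(fn, *args):
    """Wall-clock time of a single call, in milliseconds."""
    t0 = time.perf_counter()
    fn(*args)
    return (time.perf_counter() - t0) * 1000.0

def max_rel_error_percent(result, reference):
    """Max absolute difference relative to the reference's max magnitude, in %."""
    flat_r = [x for row in result for x in row]
    flat_g = [x for row in reference for x in row]
    num = max(abs(x - y) for x, y in zip(flat_r, flat_g))
    den = max(abs(y) for y in flat_g)
    return num / den * 100.0

image = [[1.0] * 32 for _ in range(32)]
kernel = [[1.0] * 3 for _ in range(3)]
out = naive_conv2d(image, kernel)
print(f"naive conv time: {time_ms(naive_conv2d, image, kernel):.3f} ms")
print(f"error vs itself: {max_rel_error_percent(out, out)} %")
```

Note that for GPU tensors a fair comparison would also need `torch.cuda.synchronize()` around the timed call, since CUDA kernels launch asynchronously.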
And this is the result for a 5D convolution:
ConvNd time: 1249.503173828125
torch.Size([1, 1, 2, 2, 2, 2, 2])
ConvTransposeNd time: 39.06355285644531
torch.Size([1, 1, 11, 11, 11, 11, 11])
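For reference, those printed shapes are consistent with the standard output-size formulas, assuming stride 1, no padding, and a kernel of size 10 along each of the five spatial dimensions (the actual hyperparameters aren't shown above):

```python
def conv_out_size(n, k, stride=1, pad=0):
    # floor((n + 2*pad - k) / stride) + 1
    return (n + 2 * pad - k) // stride + 1

def conv_transpose_out_size(n, k, stride=1, pad=0):
    # (n - 1) * stride - 2*pad + k
    return (n - 1) * stride - 2 * pad + k

# 11 -> 2 under convolution, 2 -> 11 under transposed convolution
print(conv_out_size(11, 10))            # 2
print(conv_transpose_out_size(2, 10))   # 11
```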
I wonder why the transposed convolution is comparable in speed to the built-in PyTorch method, while the forward convolution is much slower?