context_encoder_pytorch
Why is the size of real_center [64, 3, 128, 128]?
I think the size of real_center should be [64, 3, 64, 64], but after running the code I noticed that the size is [64, 3, 128, 128].
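
For comparison, here is a minimal sketch of how the 64x64 center crop is usually taken from a 128x128 batch in a Context Encoder style setup. The variable names (real_cpu, real_center) and the sizes (batch_size=64, image_size=128) are assumptions for illustration, not a quote of this repository's train script.

```python
import torch

# Assumed setup: full images are 128x128 and real_center is meant to be
# the central 64x64 patch of each image.
batch_size, image_size = 64, 128
crop_size = image_size // 2   # 64
offset = image_size // 4      # 32, so the crop is centered

# Stand-in for one batch from the dataloader: [64, 3, 128, 128]
real_cpu = torch.randn(batch_size, 3, image_size, image_size)

# Central 64x64 crop -> expected real_center shape [64, 3, 64, 64]
real_center = real_cpu[:, :, offset:offset + crop_size,
                       offset:offset + crop_size]

print(real_cpu.shape)     # torch.Size([64, 3, 128, 128])
print(real_center.shape)  # torch.Size([64, 3, 64, 64])
```

Under these assumptions, printing the cropped tensor gives [64, 3, 64, 64], while [64, 3, 128, 128] is the shape of the full-image batch it is cropped from.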