InvertibleGrayscale
When testing on my dataset, the following exception is thrown:

Shape mismatch in tuple component 0. Expected [256,256,3], got [480,640,3] [[{{node input/batch/fifo_queue_enqueue}}]]

The network only accepts inputs of size 256×256 and throws this exception for any other image size. However, your research paper clearly states that arbitrary resolutions are supported at test time, while 256×256 is only a training choice: "All input images are cropped and resized to 256×256 resolution during training, but images of arbitrary resolutions can be processed during the testing."
To test on images of other resolutions, you have to modify the hard-coded image resolution variables in './model.py' (lines 7~8). Note that the image width and height both have to be multiples of 4, because the encoder uses two ×2 downsampling modules.
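Since the width and height must be multiples of 4, an image whose dimensions do not satisfy this needs to be padded (or resized) before the resolution variables are updated. The helper below is a sketch, not part of the repository; the function and variable names are hypothetical:

```python
import math

def round_up_to_multiple(x, base=4):
    """Round x up to the nearest multiple of `base` (4 here, because
    the encoder downsamples by 2 twice)."""
    return int(math.ceil(x / base) * base)

# Example: a 481x639 image would need to be padded/resized to 484x640
# before setting the (hypothetical) resolution variables in './model.py':
#   IMG_HEIGHT = round_up_to_multiple(481)  # 484
#   IMG_WIDTH  = round_up_to_multiple(639)  # 640
```

For a 480×640 test image both dimensions are already multiples of 4, so only the two variables in './model.py' need to change.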