
The test results of pix2pix are too bad.

Open inhyukpark2 opened this issue 2 years ago • 4 comments

I am trying to convert RGB images to an infrared image style by training pix2pix.

I have found that it works very well during training.

However, when I run the test with the same model and the same parameters, the results are very poor. What could be the reason?

Please help me.

(The first picture is the original, the second is the image saved during training, and the third is the actual test result.)

[Images: epoch675_real_A, epoch675_fake_B, 101306_fake_B]

inhyukpark2 avatar Jul 31 '22 11:07 inhyukpark2

Could you share the training and test command lines with us? Did you use the same flags (e.g., --preprocess)?

junyanz avatar Oct 04 '22 21:10 junyanz

Another thing might be evaluating the model with eval() mode turned on and off (link). Could you run the test mode with and without the --eval option and see if that makes a difference?

taesungp avatar Oct 04 '22 21:10 taesungp
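For context, here is a minimal sketch of why toggling --eval can change test results. It is not code from this repo; it just uses a plain nn.BatchNorm2d with made-up statistics to show that train mode normalizes with the current batch's statistics, while eval mode uses the running averages accumulated during training. With a batch size of 1 at test time, the two can differ noticeably.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
bn = nn.BatchNorm2d(3)

# Simulate training: accumulate running statistics over several batches.
bn.train()
for _ in range(10):
    bn(torch.randn(8, 3, 16, 16) * 2.0 + 1.0)

x = torch.randn(1, 3, 16, 16) * 2.0 + 1.0  # a single "test image"

y_train_mode = bn(x)  # train mode: normalized with this batch's own stats
bn.eval()
y_eval_mode = bn(x)   # eval mode: normalized with the running averages

# The two outputs generally differ; this is the train/test gap
# that running test.py with and without --eval probes.
print((y_train_mode - y_eval_mode).abs().max().item())
```

Whichever mode matches the statistics your generator actually saw during training will usually give results closer to the saved training images.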

> Another thing might be evaluating the model with eval() mode turned on and off (link). Could you run the test mode with and without the --eval option and see if that makes a difference?

I am also working on the same project, converting RGB to IR, and I have the same observation as above. In my case, however, running the test without --eval does not help much. My results are not as bad as those above, but there is certainly a significant loss of detail in the fake images at test time. I have attached some example images below.

While training, the fake image (left) and real image (right) look like this: [image]

And during testing, the fake image (left) and real image (right) look like this: [image]

Is there any other way for me to improve these results? I am currently using part of the KAIST dataset.

ShubhamAbhayDeshpande avatar Mar 09 '23 15:03 ShubhamAbhayDeshpande

The model might be overfitting the training set. To prevent overfitting, you can either use a larger dataset or apply more aggressive data augmentation (see the --preprocess option for more details).

junyanz avatar Mar 14 '23 20:03 junyanz