
White-Box model results are not good

Open yxt132 opened this issue 5 years ago • 14 comments

Great work re-implementing the white-box model in PyTorch. However, after testing, I found the results are not as good as the official version from the authors; there is still a large gap. What do you think could be the reason? Do we need to train the model longer, or is it something else?

yxt132 avatar Nov 10 '20 06:11 yxt132

Yes, I also think the white-box model is not as good as the official version.

  1. I found the re-implemented model produces results with wild colors (you can see the top-right area). The original repo has the same problem. animegan_test2_out

  2. The re-implemented model's anime style is not obvious enough.

When re-implementing, I spent a lot of time ensuring that the color_shift and guided_filter operations have the same behavior. I think the training steps are the same as in the official version.
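For reference, color_shift is roughly a random weighted grayscale used for the texture representation. A minimal NumPy sketch is below; the sampling ranges are only approximations of the official code, so check train_code for the exact bounds.

```python
import numpy as np

def color_shift(image, rng=np.random.default_rng()):
    """Random weighted grayscale: mix R, G, B with randomly perturbed
    luminance-like weights so the texture branch ignores exact colors.
    image: HxWx3 float array in RGB order."""
    # Approximate ranges centered on the standard luminance coefficients
    # (0.299, 0.587, 0.114); see the official train_code for exact bounds.
    r_w = rng.uniform(0.199, 0.399)
    g_w = rng.uniform(0.487, 0.687)
    b_w = rng.uniform(0.014, 0.214)
    gray = r_w * image[..., 0] + g_w * image[..., 1] + b_w * image[..., 2]
    gray /= (r_w + g_w + b_w)
    return np.repeat(gray[..., None], 3, axis=-1)
```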

Now I am trying to train both the official version and my version with the same hyperparameters. Training this model spends a lot of time on the superpixel step, so I need some time to test.

If you can help find which part of the code has a problem, I would be very grateful.

zhen8838 avatar Nov 10 '20 08:11 zhen8838

Thanks for your quick response! I have not started the training yet; I will let you know if I figure something out. By the way, which superpixel method did you use in your training? I wonder how much impact the superpixel method has on the results.

yxt132 avatar Nov 10 '20 15:11 yxt132

My default superpixel method during training is consistent with the one mentioned in the author's paper. He uses an adaptive-brightness superpixel method to increase the brightness of the output image, and the parameters I use are consistent with the official code (sigma=1.2, seg_num=200); you can check my config file.
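Roughly, that superpixel step flattens each segment to its mean color (the adaptive-brightness variant additionally boosts each segment's brightness). Here is a minimal sketch with scikit-image; the felzenszwalb parameters and the mapping of seg_num onto them are assumptions, so check the config and official code for the real values.

```python
import numpy as np
from skimage import segmentation

def superpixel_flatten(image, sigma=1.2, scale=200, min_size=50):
    """Flatten an image into superpixel regions filled with their mean color.
    image: HxWx3 float array in [0, 1]."""
    labels = segmentation.felzenszwalb(image, scale=scale, sigma=sigma, min_size=min_size)
    out = np.zeros_like(image)
    for lab in np.unique(labels):
        mask = labels == lab
        out[mask] = image[mask].mean(axis=0)  # mean color of the segment
    return out
```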

zhen8838 avatar Nov 10 '20 16:11 zhen8838

Results from the official code using the selective_adacolor superpixel method after 15999 training steps: 15999_face_photo 15999_face_result 15999_scenery_photo 15999_scenery_result

zhen8838 avatar Nov 12 '20 13:11 zhen8838

Not bad. I noticed some strange colors in the generated pictures though.

image

How is the PyTorch version's training going? Any progress?

yxt132 avatar Nov 13 '20 03:11 yxt132

| test image | official code | pytorch version |
| --- | --- | --- |
| actress2 | actress2 | actress2_out |
| china6 | china6 | china6_out |
| food6 | food6 | food6_out |
| food16 | food16 | food16_out |
| liuyifei4 | liuyifei4 | liuyifei4_out |
| london1 | london1 | london1_out |
| mountain4 | mountain4 | mountain4_out |
| mountain5 | mountain5 | mountain5_out |
| national_park1 | national_park1 | national_park1_out |
| party5 | party5 | party5_out |
| party7 | party7 | party7_out |

zhen8838 avatar Nov 13 '20 04:11 zhen8838

Well done! It seems the PyTorch version's results are smoother than the official TF version's. What changes did you make in the PyTorch version? I actually like the PyTorch version's results better. Can you update your repo and release the updated trained weights? Again, great work!

yxt132 avatar Nov 13 '20 06:11 yxt132

Hi. I added new weights to Google Drive; you can find them in the README. I also uploaded the TensorFlow version weights, named whitebox-tf.zip.

zhen8838 avatar Nov 14 '20 02:11 zhen8838

I found that the strange colors are caused by the guided filter, but I haven't found a better method to solve it yet.

zhen8838 avatar Nov 14 '20 02:11 zhen8838

The author said you could train without the guided filter and add the guided filter during inference.

yxt132 avatar Nov 14 '20 02:11 yxt132

OK, I will try it if time permits.

zhen8838 avatar Nov 14 '20 02:11 zhen8838

One thing you could try for the colors is a color transfer algorithm, like this one from PyImageSearch.
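That PyImageSearch post does Reinhard-style color transfer, which roughly means matching per-channel mean and standard deviation in LAB space. A minimal sketch (not the post's exact code), assuming BGR uint8 inputs:

```python
import cv2
import numpy as np

def color_transfer(source, target):
    """Shift target's color statistics toward source's by matching
    per-channel mean/std in LAB space (Reinhard-style)."""
    src = cv2.cvtColor(source, cv2.COLOR_BGR2LAB).astype(np.float32)
    tgt = cv2.cvtColor(target, cv2.COLOR_BGR2LAB).astype(np.float32)

    src_mean, src_std = src.mean(axis=(0, 1)), src.std(axis=(0, 1))
    tgt_mean, tgt_std = tgt.mean(axis=(0, 1)), tgt.std(axis=(0, 1))

    # Normalize the target's statistics, then rescale to the source's.
    out = (tgt - tgt_mean) / (tgt_std + 1e-6) * src_std + src_mean
    out = np.clip(out, 0, 255).astype(np.uint8)
    return cv2.cvtColor(out, cv2.COLOR_LAB2BGR)
```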

Also, regarding cartoon noise, the guided filter should help as a post-processing step when using lower values of epsilon (ε), like in White-box's cartoonize.py.
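To illustrate the idea (this is not the repo's own TF guided filter): an inference-only post-process can run an off-the-shelf guided filter with the photo as guide and a small eps. The radius/eps values below are placeholders; check cartoonize.py for the ones actually used.

```python
import cv2
import numpy as np

# Requires opencv-contrib-python for cv2.ximgproc.
photo = cv2.imread("input.jpg").astype(np.float32) / 255.0        # guide image
stylized = cv2.imread("stylized.jpg").astype(np.float32) / 255.0  # network output

# Smaller eps keeps more edge/color detail; larger eps smooths more.
out = cv2.ximgproc.guidedFilter(guide=photo, src=stylized, radius=1, eps=5e-3)
cv2.imwrite("postprocessed.jpg", np.clip(out * 255.0, 0, 255).astype(np.uint8))
```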

GustavoStahl avatar Nov 17 '20 15:11 GustavoStahl

@GustavoStahl thanks, I had missed that the test_code ε value is not equal to the train_code ε. But I don't think the color transfer algorithm is needed; this model should keep the original colors as much as possible and only increase the brightness. Regarding the degree of texture, I think it can be adjusted with g_gray_weight, like this.

zhen8838 avatar Nov 18 '20 02:11 zhen8838

> I found that the strange colors are caused by the guided filter, but I haven't found a better method to solve it yet.

Adding np.clip() after the guided filter would solve the artifacts.
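For example (assuming the guided-filter output is a float image nominally in [0, 255]; the names here are illustrative):

```python
import numpy as np

def to_uint8(filtered):
    """Clamp the guided-filter output to the valid range before casting;
    uint8 wrap-around of out-of-range values shows up as color artifacts."""
    return np.clip(filtered, 0, 255).astype(np.uint8)
```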

huangfuyang avatar Jun 03 '21 09:06 huangfuyang