faststyle

Strange results from training

Open · kthordarson opened this issue 6 years ago · 1 comment

Hello. When using slow style, I get really nice results after 100-200 iterations, but if I train a model with the same style image, I never get results that look anything like slow style's output. Even after 40k iterations, my pictures look like random garbage: they use only the colors from the style image and bear no resemblance to the content image. How can I train a model that produces similar results?

kthordarson avatar Apr 25 '18 04:04 kthordarson

Did the starry night example train properly? Keep in mind that slow_style and the model produced by train won't give identical output. slow_style produces the best results but is slow; it's useful for prototyping a bit before committing to the ~8-hour wait for training.

Also, note this repo is pretty old, and there's been a lot of work in the area since; for example, methods that accept an arbitrary style image at test time, and GANs that can perform stylization.

If you're certain you're feeding in your reference style image and training correctly, you can try playing with the various hyperparameters exposed as Python arguments (e.g. the weight per VGG layer in the perceptual loss, or the relative weight between content and style).
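To make the hyperparameter advice above concrete, here is a minimal NumPy sketch of the kind of perceptual loss being tuned: a content term (feature MSE) plus a style term (per-layer weighted MSE between Gram matrices). The layer names, dictionary layout, and default weights are illustrative assumptions, not this repo's actual API; in the real code the features come from a pretrained VGG network.

```python
import numpy as np

def gram_matrix(feats):
    """Gram matrix of an (H, W, C) feature map, normalized by its size."""
    h, w, c = feats.shape
    f = feats.reshape(h * w, c)
    return f.T @ f / (h * w * c)

def perceptual_loss(content_feats, style_feats, generated_feats,
                    style_layer_weights, content_weight=1.0, style_weight=1e4):
    """Weighted sum of a content loss and a per-layer style loss.

    content_feats: (H, W, C) features of the content image at one layer.
    style_feats / generated_feats: dicts mapping layer name -> (H, W, C) features.
    style_layer_weights: dict mapping layer name -> scalar weight (the
    "weight per VGG layer" knob mentioned above).
    """
    # Content loss: MSE between generated and content features at one layer.
    content_loss = np.mean((generated_feats["content"] - content_feats) ** 2)

    # Style loss: weighted MSE between Gram matrices, summed over layers.
    style_loss = 0.0
    for layer, w in style_layer_weights.items():
        g_gen = gram_matrix(generated_feats[layer])
        g_sty = gram_matrix(style_feats[layer])
        style_loss += w * np.mean((g_gen - g_sty) ** 2)

    # content_weight / style_weight is the "relative weight between content
    # and style" knob; garbage-colored output often means style dominates.
    return content_weight * content_loss + style_weight * style_loss
```

If training output keeps the style's colors but loses all content structure, the usual first move is to raise the content weight (or lower the style weight) and retrain.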

ghwatson avatar Jan 19 '19 17:01 ghwatson