Dmitry Ulyanov
Hi, yes, it looks like a mistake. Thanks for spotting it. I will also change `sum` to `mean` in the paper, thanks again! Best, Dmitry
I will fix it in several days, when I have time to make sure everything still works.
In fact, the KL divergence will not be zero even in the perfect case, but when the KL is minimal, Q is approximately uniform on the sphere, which is what we want.
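To make the logic concrete in a toy discrete setting (this is not the estimator used on the sphere, just an illustration of the general property): KL(Q || U) is non-negative and reaches its minimum exactly when Q matches the uniform reference, so pushing it down pushes Q toward uniform.

```python
# Toy illustration only: discrete KL divergence to a uniform reference.
import numpy as np

def kl_to_uniform(q):
    q = np.asarray(q, dtype=float)
    u = np.full_like(q, 1.0 / len(q))         # uniform reference distribution
    return float(np.sum(q * np.log(q / u)))   # KL(Q || U), assumes q > 0

print(kl_to_uniform([0.25, 0.25, 0.25, 0.25]))  # 0.0  -> Q is exactly uniform
print(kl_to_uniform([0.70, 0.10, 0.10, 0.10]))  # > 0  -> Q is far from uniform
```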
Hi, did you try `eval` for instance norm?
One possibility is to modify the `eval` function (https://github.com/DmitryUlyanov/texture_nets/blob/master/InstanceNormalization.lua#L94) so that the inner batchnorm is set to eval mode. For a quick-and-dirty try, just add `self.bn:evaluate()` at https://github.com/DmitryUlyanov/texture_nets/blob/master/InstanceNormalization.lua#L48. Using this `eval` mode you...
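For reference, here is a rough PyTorch-style sketch of the same idea (an illustration only, not the Lua code from this repo): instance normalization done by reshaping `(N, C, H, W)` to `(1, N*C, H, W)` and applying a batchnorm, plus a helper that switches the wrapped batchnorm into eval mode so it normalizes with accumulated running statistics instead of per-image statistics.

```python
import torch
import torch.nn as nn

class InstanceNormViaBN(nn.Module):
    # Hypothetical illustration: one batchnorm "channel" per (sample, channel)
    # pair, so in training mode each image is normalized with its own statistics.
    def __init__(self, num_features, batch_size):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_features * batch_size, affine=False)

    def forward(self, x):
        n, c, h, w = x.shape
        y = self.bn(x.reshape(1, n * c, h, w))
        return y.reshape(n, c, h, w)

    def use_running_stats(self):
        # The hack from the comment above: put the wrapped batchnorm into eval
        # mode, so it uses running statistics rather than per-image statistics.
        self.bn.eval()
```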
Hi, since it is a hack, it does not work for every style image, but it helps sometimes.
The file `convert-fast-neural-style.py` does not belong to this repo, so I cannot help you...
It seems you need to have `instance norm` defined in `nn/Legacy` in pytorch. I have never tried to load Torch models into pytorch, so I don't know how to do...
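If it helps, one thing you could try (I have not tested it myself) is the `torchfile` Python package, which reads a `.t7` file into plain Python objects so you can inspect the layers and copy weights by hand; it does not rebuild `nn` modules, so a custom layer like instance norm would still have to be re-implemented on the PyTorch side.

```python
# Untested sketch: inspect a Torch .t7 checkpoint from Python.
# 'model.t7' is a placeholder path.
import torchfile  # pip install torchfile

obj = torchfile.load('model.t7')
print(type(obj))  # nested structure of modules / weight tensors to copy by hand
```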
Well, it finishes...
Hi, of all the contributions of the paper, only instance normalization is implemented in this repo. I will try to clean up the code for the diversity part and make it...
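For anyone unfamiliar with the term, this is the operation instance normalization performs, written as a short PyTorch-style sketch (an illustration, not this repo's Lua implementation): each channel of each sample is normalized with its own spatial mean and variance.

```python
import torch

def instance_norm(x, eps=1e-5):
    # x: (N, C, H, W). Each (sample, channel) slice is normalized with its own
    # spatial statistics, unlike batchnorm, which shares statistics across the batch.
    mean = x.mean(dim=(2, 3), keepdim=True)
    var = x.var(dim=(2, 3), unbiased=False, keepdim=True)
    return (x - mean) / torch.sqrt(var + eps)
```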