
use_batch_norm

weizequan opened this issue · 1 comment

In train_loader, you set pair_constraint=not(args.use_batch_norm). Why is pair_constraint set to False when batch normalization is used?

Also, in your implementation, the images aren't normalized to [0.0, 1.0]?

weizequan avatar Dec 08 '17 14:12 weizequan

Hi,

In train_loader, you set pair_constraint=not(args.use_batch_norm). Why is pair_constraint set to False when batch normalization is used?

First of all, the original publication doesn't use batch normalization. If you want to use it, you have to avoid putting the cover and the stego of the same cover within the same batch; otherwise the network will try to use the batch mean and variance to cheat. Basically, you can't use batch normalization if you have dependencies within the batch.
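A tiny numeric sketch of that "cheating" effect (illustrative only, not the repo's code; shapes and the 0.1 embedding shift are made up). When each cover sits next to its own stego in the batch, the batch-centering step of batch norm exposes the embedding signal directly, independent of image content:

```python
import numpy as np

rng = np.random.default_rng(0)

# 4 hypothetical per-image feature vectors for covers;
# stego = cover + a small constant embedding perturbation.
cover = rng.normal(0.0, 1.0, size=(4, 8))
stego = cover + 0.1

# Pair-constrained batch: each cover and its own stego share the batch.
paired_batch = np.concatenate([cover, stego])

# Batch-norm-style centering with the batch mean.
centered = paired_batch - paired_batch.mean(axis=0)

# After centering, the gap between stego and cover residuals is exactly
# the embedding shift -- the label is readable from batch statistics
# alone, whatever the image content is.
gap = centered[4:].mean() - centered[:4].mean()
print(gap)
```

With pair_constraint disabled, a batch mixes covers and stegos of unrelated images, so the batch mean no longer encodes each sample's own counterpart and this shortcut disappears.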

Also, in your implementation, the images aren't normalized to [0.0, 1.0]?

If you look at the preprocessing part, it's made for [0, 255] images (the output will naturally be very close to a [-1, 1] distribution).
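To see why raw [0, 255] inputs still give small activations, here is a hedged sketch with an illustrative zero-mean high-pass kernel (not the repo's actual SRM filter bank). Because the kernel's weights sum to zero along rows and columns, smooth image content cancels out and only small local residuals survive:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative zero-mean high-pass kernel, normalized by its center weight.
kernel = np.array([[-1.0,  2.0, -1.0],
                   [ 2.0, -4.0,  2.0],
                   [-1.0,  2.0, -1.0]]) / 4.0


def conv2d_valid(x, k):
    """Naive 'valid' 2-D correlation, enough for this demo."""
    h, w = k.shape
    out = np.zeros((x.shape[0] - h + 1, x.shape[1] - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + h, j:j + w] * k).sum()
    return out


# A smooth [0, 252] gradient image plus small pixel noise.
img = np.tile(np.arange(64, dtype=np.float64) * 4.0, (64, 1))
img += rng.uniform(-0.25, 0.25, size=img.shape)

res = conv2d_valid(img, kernel)
print(res.min(), res.max())  # residuals stay well inside [-1, 1]
```

The gradient part is annihilated exactly (the kernel's column and row sums are zero), and the leftover noise residual is bounded by the kernel's absolute-weight sum times the noise amplitude, so no division by 255 is needed.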

Caenorst avatar Dec 09 '17 11:12 Caenorst