In train_loader, you set pair_constraint=not(args.use_batch_norm). Why is the pair constraint disabled when batch normalization is used?
First of all, the original publication doesn't use batch normalization. If you want to use it, you have to avoid putting the cover and the stego made from that same cover in the same batch; otherwise the network will use the batch mean and variance to cheat. Basically, you can't use batch normalization if there are dependencies within the batch.
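To make the two sampling modes concrete, here is a minimal sketch of what pair_constraint might control (this is a hypothetical reconstruction, not the repo's actual loader): with the constraint on, each batch contains cover/stego pairs built from the same cover indices, so batch statistics mix dependent samples; with it off, covers and stegos are drawn independently, which is what batch normalization needs.

```python
import random

def make_batches(n_covers, batch_size, pair_constraint, seed=0):
    """Sketch of pair-constrained vs. independent batch sampling.

    Each cover index i is assumed to have a matching stego i.
    pair_constraint=True  -> cover i and stego i share a batch.
    pair_constraint=False -> samples are drawn independently.
    """
    rng = random.Random(seed)
    if pair_constraint:
        # Half as many cover indices; emit (cover, stego) pairs.
        idx = rng.sample(range(n_covers), batch_size // 2)
        batch = []
        for i in idx:
            batch.append(("cover", i))
            batch.append(("stego", i))
        return batch
    # Independent sampling: any mix of covers and stegos,
    # no guaranteed dependency within the batch.
    return [(rng.choice(["cover", "stego"]), rng.randrange(n_covers))
            for _ in range(batch_size)]
```

With the pair constraint, BatchNorm's per-batch mean and variance carry information about which stego images are present, since every stego has its cover in the same batch; independent sampling removes that shortcut.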
In your implementation, aren't the images normalized to [0.0, 1.0]?
If you look at the preprocessing part, it's made for [0, 255] images (the output will naturally be very close to a [-1, 1] distribution).
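As an illustration of why no explicit rescaling is needed, here is a sketch assuming the preprocessing layer is a fixed high-pass filter such as the standard KV kernel from the SRM family (the exact kernel used in the repo may differ). On a smooth [0, 255] image, the filter removes the low-frequency content and leaves small, zero-centered residuals:

```python
import numpy as np

# KV high-pass kernel (a common choice in steganalysis preprocessing;
# this specific kernel is an assumption, not taken from the repo).
KV = np.array([[-1,  2,  -2,  2, -1],
               [ 2, -6,   8, -6,  2],
               [-2,  8, -12,  8, -2],
               [ 2, -6,   8, -6,  2],
               [-1,  2,  -2,  2, -1]], dtype=np.float64) / 12.0

def highpass_residual(img):
    """Valid 2-D convolution of a [0, 255] grayscale image with KV."""
    h, w = img.shape
    out = np.zeros((h - 4, w - 4))
    for i in range(h - 4):
        for j in range(w - 4):
            out[i, j] = np.sum(img[i:i+5, j:j+5] * KV)
    return out

# Smooth synthetic "image": a horizontal gradient plus mild noise,
# kept inside [0, 255].
rng = np.random.default_rng(0)
img = np.linspace(10.0, 245.0, 64)[None, :].repeat(64, axis=0)
img += rng.normal(0.0, 0.5, img.shape)
res = highpass_residual(img)
```

The gradient is annihilated by the kernel (its column sums are zero), so the residuals are centered on zero with a small spread regardless of the [0, 255] input range, which is why the network's input distribution ends up close to [-1, 1] without dividing by 255.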