
use_batch_norm #2

Open
weizequan opened this issue Dec 8, 2017 · 1 comment

Comments

@weizequan

In train_loader, pair_constraint=not(args.use_batch_norm). Why is pair_constraint set to False when batch normalization is used?

In your implementation, the images are not normalized to [0.0, 1.0]?

@Caenorst
Owner

Caenorst commented Dec 9, 2017

Hi,

In train_loader, pair_constraint=not(args.use_batch_norm). Why is pair_constraint set to False when batch normalization is used?

First of all, the original publication doesn't use batch normalization. If you want to use it, you have to avoid putting a cover and the stego made from that same cover within the same batch; otherwise the network will try to use the batch mean and variance to cheat. Basically, you can't use batch normalization if you have dependencies within the batch.
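To make the dependency problem concrete, here is a minimal sketch of batch assembly with and without the pair constraint (the function and tensor names are illustrative, not the repo's actual API):

```python
import torch

def make_batch(covers, stegos, idx, pair_constraint):
    """Build one training batch from cover/stego image tensors.

    pair_constraint=True  -> each cover and its own stego land in the same
                             batch (fine without batch norm).
    pair_constraint=False -> covers and stegos are sampled independently, so
                             the batch statistics carry no pair information
                             (needed when batch norm is enabled).
    """
    if pair_constraint:
        # Same indices for both halves: the batch mean/variance now encode
        # the cover/stego difference, which batch norm layers can exploit.
        return torch.cat([covers[idx], stegos[idx]])
    # Independent indices: no within-batch dependency for batch norm to use.
    other = torch.randperm(len(stegos))[: len(idx)]
    return torch.cat([covers[idx], stegos[other]])
```

With pair_constraint=False, the two halves of the batch share no per-image dependency, so the statistics computed by a layer like nn.BatchNorm2d offer no shortcut signal.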

In your implementation, the images are not normalized to [0.0, 1.0]?

If you look at the preprocessing part, it's made for [0, 255] images (the output will naturally be very close to a [-1, 1] distribution).
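For illustration, here is a hedged sketch of that kind of preprocessing using the standard KV high-pass kernel from the steganalysis literature (an assumption; the repo's actual filter bank may differ, e.g. a set of SRM kernels):

```python
import torch
import torch.nn.functional as F

# Standard 5x5 KV high-pass kernel (illustrative choice, not necessarily
# the filter used in this repository).
KV = torch.tensor([[-1.,  2.,  -2.,  2., -1.],
                   [ 2., -6.,   8., -6.,  2.],
                   [-2.,  8., -12.,  8., -2.],
                   [ 2., -6.,   8., -6.,  2.],
                   [-1.,  2.,  -2.,  2., -1.]]) / 12.0

def preprocess(images):
    """images: (N, 1, H, W) float tensor holding raw [0, 255] pixel values."""
    kernel = KV.view(1, 1, 5, 5)
    # High-pass residuals of natural images are small relative to the raw
    # pixel range, so the output already sits close to a [-1, 1]
    # distribution without any explicit [0, 1] normalization.
    return F.conv2d(images, kernel, padding=2)
```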
