discriminator loss goes to infinity #29
Hmm, maybe 10k images is not enough. Are you doing any augmentations?
No augmentations.
Check the gradients and output values at every step of the way to see where it starts to go wrong.
Thanks. It already starts to jump dramatically at step 500.
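One way to act on the "check the gradients and output values at every step" advice is to fail fast the moment anything goes non-finite. A minimal sketch in PyTorch (the helper name `check_finite` and the toy discriminator are illustrative, not part of the repo):

```python
import torch
import torch.nn as nn

def check_finite(model: nn.Module, step: int) -> None:
    """Raise early if any weight or gradient has gone NaN/inf."""
    for name, p in model.named_parameters():
        if not torch.isfinite(p).all():
            raise RuntimeError(f"step {step}: non-finite weight in {name}")
        if p.grad is not None and not torch.isfinite(p.grad).all():
            raise RuntimeError(f"step {step}: non-finite grad in {name}")

# Toy stand-in for the discriminator, just to show the call pattern
disc = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
loss = disc(torch.randn(4, 8)).mean()
loss.backward()
check_finite(disc, step=0)  # passes silently while training is healthy
```

Calling this right after `loss.backward()` each step pinpoints the first step (and the first layer) where values blow up, which is cheaper than inspecting full tensor dumps.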
@065294847 what is your batch size and learning rate?
Just the defaults: batch size 4, lr 3e-4.
Try it with a dataset that's known to work and converge, just to rule that out. But try to log the grads and weights across steps to get a better understanding.
Thanks, I'll experiment a bit and report back.
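Logging grads and weights across steps can be as simple as recording per-parameter L2 norms into a list (or a TensorBoard writer). A sketch with a hypothetical `log_norms` helper and a toy model:

```python
import torch
import torch.nn as nn

def log_norms(model: nn.Module, history: list, step: int) -> None:
    """Record per-parameter weight and gradient L2 norms for this step."""
    record = {"step": step}
    for name, p in model.named_parameters():
        record[f"{name}.weight_norm"] = p.detach().norm().item()
        if p.grad is not None:
            record[f"{name}.grad_norm"] = p.grad.norm().item()
    history.append(record)

disc = nn.Linear(8, 1)                                   # toy stand-in
opt = torch.optim.Adam(disc.parameters(), lr=3e-4)
history = []
for step in range(3):
    opt.zero_grad()
    loss = disc(torch.randn(16, 8)).pow(2).mean()
    loss.backward()
    log_norms(disc, history, step)                       # after backward, before step
    opt.step()
```

Plotting these norms over steps usually makes a divergence around step 500 obvious: either the discriminator's grad norms spike first, or its weights drift steadily upward beforehand.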
@065294847 try lowering your learning rate, or increase your effective batch size (either by raising the batch size or by accumulating gradients over several steps).
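If memory rules out a larger batch, gradient accumulation raises the effective batch size without it. A minimal sketch (the `grad_accum_every` name and the lowered `lr=1e-4` are illustrative values, not the repo's defaults):

```python
import torch
import torch.nn as nn

model = nn.Linear(8, 1)                            # toy stand-in
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
grad_accum_every = 4   # 4 micro-batches of 4 -> effective batch size 16

opt.zero_grad()
for micro_step in range(grad_accum_every):
    batch = torch.randn(4, 8)                      # micro-batch of 4
    loss = model(batch).pow(2).mean()
    (loss / grad_accum_every).backward()           # scale so grads average, not sum
opt.step()
opt.zero_grad()
```

Dividing the loss by `grad_accum_every` keeps the accumulated gradient equal in scale to a single large-batch gradient, so the learning rate doesn't need rescaling.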
@065294847 get more data too, 10k is nothing... try 100k or a million. If that isn't possible, do some basic augmentations.
Yes of course, just wanted to do a quick test first :) |
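For "basic augmentations" on an image batch, random horizontal flips plus pad-and-random-crop go a long way. A self-contained sketch in plain PyTorch (the `augment` helper is hypothetical; torchvision's transforms would do the same job):

```python
import torch
import torch.nn.functional as F

def augment(images: torch.Tensor, pad: int = 4, p_flip: float = 0.5) -> torch.Tensor:
    """Random horizontal flip + pad-and-random-crop back to the original size.

    images: (batch, channels, height, width)
    """
    b, _, h, w = images.shape
    out = images.clone()
    flip = torch.rand(b) < p_flip                   # per-sample flip decision
    out[flip] = torch.flip(out[flip], dims=[-1])    # mirror along width
    out = F.pad(out, (pad, pad, pad, pad), mode="reflect")
    top = int(torch.randint(0, 2 * pad + 1, (1,)))
    left = int(torch.randint(0, 2 * pad + 1, (1,)))
    return out[:, :, top:top + h, left:left + w]

batch = torch.rand(4, 3, 32, 32)
aug = augment(batch)   # same shape, randomly flipped and shifted
```

With only 10k images, these cheap geometric augmentations effectively multiply the dataset and make the discriminator's job harder, which tends to delay exactly the kind of divergence described here.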
Hi,
I'm trying to train the CViViT on a set of 10,000 images. The VAE loss keeps going down, but the discriminator loss keeps rising to infinity. It's easy to fool :)
Any idea what the problem is?