Thank you very much for your repo; it helped me a lot. I just have two small doubts:
In the backpropagation implementation of the BN layer, is the derivative with respect to the batch mean (X_mean) missing a term? Since self.stddev_inv is computed from the variance, it also depends on the mean.
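For reference, a minimal NumPy sketch of a full batch-norm backward pass (not the repo's code; names like `stddev_inv` are only assumed to line up with it). The comments mark the two places where the batch mean enters the chain rule:

```python
import numpy as np

def batchnorm_backward(dout, x, gamma, eps=1e-5):
    """Full chain-rule backward pass for batch norm (sketch, not the repo's code)."""
    N = x.shape[0]
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    stddev_inv = 1.0 / np.sqrt(var + eps)
    x_hat = (x - mu) * stddev_inv

    dgamma = np.sum(dout * x_hat, axis=0)
    dbeta = np.sum(dout, axis=0)

    dx_hat = dout * gamma
    # Gradient w.r.t. the variance (stddev_inv depends on the mean through var).
    dvar = np.sum(dx_hat * (x - mu), axis=0) * -0.5 * stddev_inv**3
    # Gradient w.r.t. the mean has two terms:
    #  1) the direct dependence through (x - mu),
    #  2) the indirect dependence through the variance -- the term the question
    #     asks about. Note this second term sums (x - mu) over the batch and is
    #     therefore algebraically zero, which is why some implementations omit it.
    dmu = np.sum(dx_hat * -stddev_inv, axis=0) + dvar * np.mean(-2.0 * (x - mu), axis=0)
    dx = dx_hat * stddev_inv + dvar * 2.0 * (x - mu) / N + dmu / N
    return dx, dgamma, dbeta
```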
In the dropout implementation, does p mean something different from the p in the original dropout? In the original dropout, p is the proportion of deactivated units, but that does not seem to be the case here. Also, is the expected signal strength after dropout still the same as before?
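For comparison, a minimal sketch of inverted dropout, assuming p denotes the drop probability (this convention is an assumption, not necessarily the repo's). Scaling the surviving activations by 1 / (1 - p) keeps the expected signal strength the same at train and test time:

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True):
    """Inverted dropout sketch; here p is assumed to be the drop probability."""
    if not training:
        # No mask and no rescaling at inference time.
        return x, None
    # Each unit survives with probability (1 - p); survivors are scaled up
    # by 1 / (1 - p) so the expected activation is unchanged.
    mask = (np.random.rand(*x.shape) > p) / (1.0 - p)
    return x * mask, mask
```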
zy23456 changed the title from "Back propagation at the BN layer" to "BN layer and dropout" on Jun 6, 2020.