In the post "Recurrent Neural Networks in Tensorflow I"
it writes that
"If the network learns no dependencies, it will correctly assign a probability of 62.5% to 1, for an expected cross-entropy loss of about 0.66.If the network learns only the first dependency (3 steps back) but not the second dependency, it will correctly assign a probability of 87.5%, 50% of the time, and correctly assign a probability of 62.5% the other 50% of the time, for an expected cross entropy loss of about 0.52.If the network learns both dependencies, it will be 100% accurate 25% of the time, correctly assign a probability of 50%, 25% of the time, and correctly assign a probability of 75%, 50% of the time, for an expected cross extropy loss of about 0.45.".
How are the probabilities 0.875 and 0.625 obtained? Is it right to split the samples into the cases where x(t-3) is 1 and where it is not?

Cross-entropy should be the negative sum of true * log(predicted). Since there is no trained model here, what probability is the network assumed to assign to 1 in each case?
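Here is a minimal sketch that reproduces the quoted numbers, assuming the data-generation rule described in the post (P(Y=1) is 50% at baseline, +50% if X[t-3] is 1, -25% if X[t-8] is 1, with X uniform random), so the four (X[t-3], X[t-8]) combinations are equally likely. Under that rule, 0.875 = (1.00 + 0.75)/2 comes from averaging over the unlearned X[t-8] dependency:

```python
import numpy as np

# True P(Y=1) for the four equally likely (X[t-3], X[t-8]) cases under the
# post's rule: 50% base, +50% if X[t-3] == 1, -25% if X[t-8] == 1.
p_true = {(0, 0): 0.50, (1, 0): 1.00, (0, 1): 0.25, (1, 1): 0.75}

def loss(p, q):
    """Cross-entropy when the true P(Y=1) is p and the model predicts q."""
    return -sum(t * np.log(u) for t, u in ((p, q), (1 - p, 1 - q)) if t > 0)

# No dependencies learned: always predict the marginal P(Y=1) = 0.625.
marginal = np.mean(list(p_true.values()))                          # 0.625
print(np.mean([loss(p, marginal) for p in p_true.values()]))       # ~0.66

# Only X[t-3] learned: predict P(Y=1 | X[t-3]), i.e. the average over the two
# X[t-8] cases: (1.00 + 0.75)/2 = 0.875 if X[t-3] == 1, else
# (0.50 + 0.25)/2 = 0.375 (so 0.625 is assigned to the correct class, 0).
cond = {1: 0.875, 0: 0.375}
print(np.mean([loss(p, cond[k[0]]) for k, p in p_true.items()]))   # ~0.52

# Both dependencies learned: predict the true conditional in every case.
print(np.mean([loss(p, p) for p in p_true.values()]))              # ~0.45
```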
In the post "Recurrent Neural Networks in Tensorflow I"
it writes that
"If the network learns no dependencies, it will correctly assign a probability of 62.5% to 1, for an expected cross-entropy loss of about 0.66.If the network learns only the first dependency (3 steps back) but not the second dependency, it will correctly assign a probability of 87.5%, 50% of the time, and correctly assign a probability of 62.5% the other 50% of the time, for an expected cross entropy loss of about 0.52.If the network learns both dependencies, it will be 100% accurate 25% of the time, correctly assign a probability of 50%, 25% of the time, and correctly assign a probability of 75%, 50% of the time, for an expected cross extropy loss of about 0.45.".
How to get that probability “0.875, 0.625”? As the sample has been devided into x(t-3) is 1 and not 1, is it right to do so ?
The cross entropy should be the sum of real * log(expected). There is no model now, so the probability of correctly assigning 1gives to it?
The text was updated successfully, but these errors were encountered: