
question about the cross entropy #3

Open
phoenixzxj opened this issue Nov 12, 2018 · 0 comments

Comments

@phoenixzxj

In the post "Recurrent Neural Networks in Tensorflow I" it says:

"If the network learns no dependencies, it will correctly assign a probability of 62.5% to 1, for an expected cross-entropy loss of about 0.66. If the network learns only the first dependency (3 steps back) but not the second dependency, it will correctly assign a probability of 87.5%, 50% of the time, and correctly assign a probability of 62.5% the other 50% of the time, for an expected cross-entropy loss of about 0.52. If the network learns both dependencies, it will be 100% accurate 25% of the time, correctly assign a probability of 50%, 25% of the time, and correctly assign a probability of 75%, 50% of the time, for an expected cross-entropy loss of about 0.45."

How do you get the probabilities 0.875 and 0.625? The samples are split into the cases where x(t-3) is 1 and where it is not; is it right to do that?
The cross entropy should be -sum(real * log(predicted)). Since there is no model yet, what probability of correctly assigning 1 is given to it?
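For reference, here is a rough sketch of how I tried to reproduce the quoted numbers. It assumes the generation rule described in the post (P(Y_t = 1) starts at 50%, is increased by 50% if X_{t-3} = 1, and decreased by 25% if X_{t-8} = 1), and it just averages the binary cross-entropy over the equally likely cases:

```python
import numpy as np

def binary_cross_entropy(p_true, p_pred):
    # Expected cross-entropy when the true P(Y=1) is p_true and the
    # model predicts p_pred for Y=1.
    return -(p_true * np.log(p_pred) + (1 - p_true) * np.log(1 - p_pred))

# Case 1: no dependencies learned.
# The marginal P(Y=1) is 0.5 + 0.5*0.5 - 0.25*0.5 = 0.625,
# and the best constant prediction is also 0.625.
loss_none = binary_cross_entropy(0.625, 0.625)            # ~0.66

# Case 2: only the X_{t-3} dependency learned.
# When X_{t-3} = 1 (half the time), P(Y=1) = 0.5 + 0.5 - 0.25*0.5 = 0.875.
# When X_{t-3} = 0, P(Y=1) = 0.5 - 0.25*0.5 = 0.375, i.e. P(Y=0) = 0.625.
loss_first = 0.5 * binary_cross_entropy(0.875, 0.875) \
           + 0.5 * binary_cross_entropy(0.375, 0.375)     # ~0.52

# Case 3: both dependencies learned -- four equally likely (X_{t-3}, X_{t-8}) cases.
# (1,0): P(Y=1) = 1.0, so the loss is exactly 0 for that case.
loss_both = 0.25 * 0.0 \
          + 0.25 * binary_cross_entropy(0.75, 0.75) \
          + 0.25 * binary_cross_entropy(0.50, 0.50) \
          + 0.25 * binary_cross_entropy(0.25, 0.25)       # ~0.45

print(loss_none, loss_first, loss_both)
```

Averaged this way I get roughly 0.66, 0.52, and 0.45, which matches the post, but I am not sure this is the intended derivation.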
