Update README
Haleshot committed Aug 16, 2024
1 parent a048d72 commit a67cb4c
Showing 1 changed file with 3 additions and 1 deletion.
easy/Log_Softmax/README.md (4 changes: 3 additions & 1 deletion)
@@ -34,7 +34,9 @@

$$\text{softmax}(x_i) = \frac{e^{x_i}}{\sum_{j=1}^n e^{x_j}}$$

However, computing the softmax first and then taking its logarithm can be numerically unstable: exponentiating large inputs can overflow, and very small softmax outputs can underflow to zero before the logarithm is taken. To prevent this, we use the log-softmax function, which shifts the input by subtracting its maximum value:

-`\text{log_softmax}(x_i) = x_i - \max(x) - \log\left(\sum_{j=1}^n e^{x_j - \max(x)}\right)`
+$$
+\text{log\_softmax}(x_i) = x_i - \max(x) - \log\left(\sum_{j=1}^n e^{x_j - \max(x)}\right)
+$$

This formulation avoids the overflow that can occur when exponentiating large numbers. The log-softmax function is particularly useful in machine learning for computing log-probabilities in a numerically stable way, especially in combination with cross-entropy loss functions.
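
For illustration, here is a minimal NumPy sketch of the shifted formulation above. The function name `log_softmax` and the 1-D input are assumptions; the exact signature expected by the exercise is not shown in this diff.

```python
import numpy as np

def log_softmax(x: np.ndarray) -> np.ndarray:
    # Subtract the maximum so the largest exponent is exp(0) = 1,
    # preventing overflow for large inputs.
    shifted = x - np.max(x)
    return shifted - np.log(np.sum(np.exp(shifted)))

# Scores that would overflow a naive exp(x) are handled without issue:
print(log_softmax(np.array([1000.0, 1001.0, 1002.0])))
# approx. [-2.4076, -1.4076, -0.4076]
```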

