Update README
Haleshot committed Aug 16, 2024
1 parent bc37d67 commit f9aa102
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion easy/Log_Softmax/README.md
@@ -35,7 +35,7 @@ $$\text{softmax}(x_i) = \frac{e^{x_i}}{\sum_{j=1}^n e^{x_j}}$$
However, directly applying the logarithm to the softmax function can lead to numerical instability, especially when dealing with large numbers. To prevent this, we use the log-softmax function, which incorporates a shift by subtracting the maximum value from the input vector:

$$
- \text{log-softmax}(x_i) = x_i - \max(x) - \log\left(\sum_{j=1}^n e^{x_j - \max(x)}\right)
+ \text{log\_softmax}(x_i) = x_i - \max(x) - \log\left(\sum_{j=1}^n e^{x_j - \max(x)}\right)
$$

This formulation helps to avoid overflow issues that can occur when exponentiating large numbers. The log-softmax function is particularly useful in machine learning for calculating probabilities in a stable manner, especially when used with cross-entropy loss functions.
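For illustration only (not part of this commit), a minimal NumPy sketch of the stabilized log-softmax described above might look like the following; the function name and example values are assumptions:

```python
import numpy as np

def log_softmax(x: np.ndarray) -> np.ndarray:
    """Numerically stable log-softmax: shift by max(x) before exponentiating."""
    shifted = x - np.max(x)                       # subtracting the max prevents overflow in exp
    return shifted - np.log(np.sum(np.exp(shifted)))

# Example: large inputs that would overflow a naive exp-based softmax
scores = np.array([1000.0, 1001.0, 1002.0])
print(log_softmax(scores))                        # approx. [-2.4076, -1.4076, -0.4076]
```

Exponentiating the shifted values keeps every argument of `exp` at or below zero, so the sum stays finite even for very large inputs.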
