Merge pull request tensorflow#13056 from maspwr/mnist-doc-1
Update mnist beginners softmax variables
caisq authored Sep 19, 2017
2 parents 1726550 + bc1d426 commit 5747ec3
Showing 1 changed file with 2 additions and 2 deletions:
tensorflow/docs_src/get_started/mnist/beginners.md
@@ -180,11 +180,11 @@ You can think of it as converting tallies
 of evidence into probabilities of our input being in each class.
 It's defined as:
 
-$$\text{softmax}(x) = \text{normalize}(\exp(x))$$
+$$\text{softmax}(evidence) = \text{normalize}(\exp(evidence))$$
 
 If you expand that equation out, you get:
 
-$$\text{softmax}(x)_i = \frac{\exp(x_i)}{\sum_j \exp(x_j)}$$
+$$\text{softmax}(evidence)_i = \frac{\exp(evidence_i)}{\sum_j \exp(evidence_j)}$$
 
 But it's often more helpful to think of softmax the first way: exponentiating
 its inputs and then normalizing them. The exponentiation means that one more

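Not part of the commit itself, but for reference, here is a minimal NumPy sketch of the definition the edited doc describes: exponentiate the evidence, then normalize so the outputs sum to 1. The `softmax` helper and sample values below are illustrative, not from the TensorFlow source.

```python
import numpy as np

def softmax(evidence):
    """Exponentiate each evidence value, then normalize so the outputs sum to 1."""
    exps = np.exp(evidence - np.max(evidence))  # shift by the max for numerical stability
    return exps / np.sum(exps)

# One extra unit of evidence multiplies a class's unnormalized weight by e.
print(softmax(np.array([2.0, 1.0, 0.1])))  # approx. [0.659, 0.242, 0.099]
```

Subtracting the maximum before exponentiating leaves the result unchanged, since softmax is invariant to adding a constant to every input, while avoiding overflow for large evidence values.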