1 parent 7cade8c commit bb2aa41
doc/mlp.txt
@@ -90,8 +90,8 @@ The set of parameters to learn is the set :math:`\theta =
 \{W^{(2)},b^{(2)},W^{(1)},b^{(1)}\}`. Obtaining the gradients
 :math:`\partial{\ell}/\partial{\theta}` can be achieved through the
 **backpropagation algorithm** (a special case of the chain-rule of derivation).
-Thankfully, since Theano performs automatic differentation, we will not need to
-cover this in the tutorial !
+Thankfully, since Theano performs automatic differentiation, we will not need to
+cover this in the tutorial!
 
 
Going from logistic regression to MLP
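The edited passage says Theano's automatic differentiation spares the tutorial from covering backpropagation by hand. As a rough illustration of the chain-rule computation that `T.grad` would perform automatically, here is a NumPy sketch for a one-hidden-layer MLP; the shapes, the tanh activation, and the squared-error loss are illustrative assumptions, not taken from the commit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP: x -> h = tanh(W1 x + b1) -> y = W2 h + b2, squared-error loss.
# (Hypothetical sizes chosen for illustration only.)
W1 = rng.standard_normal((3, 2)); b1 = rng.standard_normal(3)
W2 = rng.standard_normal((1, 3)); b2 = rng.standard_normal(1)
x = rng.standard_normal(2); t = rng.standard_normal(1)

def loss(W1, b1, W2, b2):
    h = np.tanh(W1 @ x + b1)
    y = W2 @ h + b2
    return 0.5 * np.sum((y - t) ** 2)

# Backpropagation: apply the chain rule layer by layer, output to input.
h = np.tanh(W1 @ x + b1)
y = W2 @ h + b2
dy = y - t                       # dL/dy for the squared-error loss
dW2 = np.outer(dy, h); db2 = dy  # gradients for the output layer
dh = W2.T @ dy                   # propagate the error back through W2
da = dh * (1 - h ** 2)           # through tanh (derivative is 1 - tanh^2)
dW1 = np.outer(da, x); db1 = da  # gradients for the hidden layer

# Sanity check: dW1 should match a central finite-difference estimate.
eps = 1e-6
num = np.zeros_like(W1)
for i in range(W1.shape[0]):
    for j in range(W1.shape[1]):
        Wp = W1.copy(); Wp[i, j] += eps
        Wm = W1.copy(); Wm[i, j] -= eps
        num[i, j] = (loss(Wp, b1, W2, b2) - loss(Wm, b1, W2, b2)) / (2 * eps)
assert np.allclose(dW1, num, atol=1e-5)
```

In Theano, the same gradients would come from a single symbolic call rather than this manual pass, which is exactly the point the corrected sentence makes.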