
Make derivative of ReLU consistent with TensorFlow (der should be 0 when x is 0).
dsmilkov committed May 15, 2016
1 parent 844e3f2 commit 718a6c8
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion nn.ts
@@ -118,7 +118,7 @@ export class Activations {
};
public static RELU: ActivationFunction = {
output: x => Math.max(0, x),
-  der: x => x < 0 ? 0 : 1
+  der: x => x <= 0 ? 0 : 1
};
public static SIGMOID: ActivationFunction = {
output: x => 1 / (1 + Math.exp(-x)),
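The effect of the change can be sketched in isolation. This is a minimal, hypothetical reconstruction of the `ActivationFunction` shape implied by the diff, showing that after the fix `der(0)` returns 0 rather than 1:

```typescript
// Hypothetical minimal sketch of the ActivationFunction interface
// implied by the diff in nn.ts; not the full original file.
interface ActivationFunction {
  output: (x: number) => number;
  der: (x: number) => number;
}

const RELU: ActivationFunction = {
  output: x => Math.max(0, x),
  // After this commit the derivative is 0 for x <= 0 (including x = 0),
  // and 1 otherwise, matching TensorFlow's convention for the ReLU
  // subgradient at the non-differentiable point x = 0.
  der: x => x <= 0 ? 0 : 1
};

console.log(RELU.der(-1), RELU.der(0), RELU.der(1)); // 0 0 1
```

ReLU is not differentiable at x = 0, so either 0 or 1 is a valid subgradient there; the commit simply picks 0 to agree with TensorFlow's choice.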
