From f9a67784b02a62a41aa68fc485b0f13d505d3c17 Mon Sep 17 00:00:00 2001
From: Daniel Golden
Date: Wed, 3 Sep 2014 13:18:48 -0700
Subject: [PATCH] Correct reference to lenet_train_test.prototxt

Not lenet.prototxt
---
 examples/mnist/readme.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/examples/mnist/readme.md b/examples/mnist/readme.md
index a51fc1c0d3b..13354916ab8 100644
--- a/examples/mnist/readme.md
+++ b/examples/mnist/readme.md
@@ -26,7 +26,7 @@ If it complains that `wget` or `gunzip` are not installed, you need to install t
 Before we actually run the training program, let's explain what will happen. We will use the [LeNet](http://yann.lecun.com/exdb/publis/pdf/lecun-01a.pdf) network, which is known to work well on digit classification tasks. We will use a slightly different version from the original LeNet implementation, replacing the sigmoid activations with Rectified Linear Unit (ReLU) activations for the neurons.
 
-The design of LeNet contains the essence of CNNs that are still used in larger models such as the ones in ImageNet. In general, it consists of a convolutional layer followed by a pooling layer, another convolution layer followed by a pooling layer, and then two fully connected layers similar to the conventional multilayer perceptrons. We have defined the layers in `$CAFFE_ROOT/examples/mnist/lenet.prototxt`.
+The design of LeNet contains the essence of CNNs that are still used in larger models such as the ones in ImageNet. In general, it consists of a convolutional layer followed by a pooling layer, another convolution layer followed by a pooling layer, and then two fully connected layers similar to the conventional multilayer perceptrons. We have defined the layers in `$CAFFE_ROOT/examples/mnist/lenet_train_test.prototxt`.
 
 ## Define the MNIST Network
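The conv/pool/conv/pool/fc/fc stack that the patched paragraph describes can be sketched as a shape calculation. This is a minimal illustration, assuming the layer parameters of the standard Caffe MNIST example (5x5 kernels, 2x2 stride-2 pooling, 20 and 50 filters, a 500-unit inner-product layer); treat the numbers as assumptions if your prototxt differs.

```python
# Track the spatial size of LeNet's feature maps on a 28x28 MNIST input.
# Filter counts (20, 50) and the 500-unit fully connected layer follow the
# common Caffe example configuration and are illustrative, not normative.

def conv_out(size, kernel, stride=1):
    """Output side length of a 'valid' (no-padding) convolution."""
    return (size - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    """Output side length of a non-overlapping pooling layer."""
    return (size - kernel) // stride + 1

def lenet_shapes(input_size=28):
    """Return (layer name, channels, side length) for each stage, plus the
    flattened input size of the first fully connected layer."""
    s = conv_out(input_size, 5)            # conv1: 5x5 kernels -> 24x24
    shapes = [("conv1", 20, s)]
    s = pool_out(s)                        # pool1: 2x2, stride 2 -> 12x12
    shapes.append(("pool1", 20, s))
    s = conv_out(s, 5)                     # conv2: 5x5 kernels -> 8x8
    shapes.append(("conv2", 50, s))
    s = pool_out(s)                        # pool2: 2x2, stride 2 -> 4x4
    shapes.append(("pool2", 50, s))
    flat = 50 * s * s                      # 800 inputs feed the first fc layer
    shapes.append(("ip1", 500, None))      # fully connected, ReLU after it
    shapes.append(("ip2", 10, None))       # one output per digit class
    return shapes, flat

if __name__ == "__main__":
    shapes, flat = lenet_shapes()
    for name, channels, side in shapes:
        print(name, channels, side)
    print("flattened inputs to ip1:", flat)
```

Walking the shapes this way is a quick sanity check when editing `lenet_train_test.prototxt`: if a kernel or stride change makes the flattened size disagree with the first inner-product layer, Caffe will reject the net at setup time.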