
Commit d0c8197

Merge branch 'master' of github.com:scotthuang1989/my_tensorflow_example_code
2 parents 96e7165 + b962ef1 commit d0c8197

File tree

1 file changed: +297 -0 lines changed

@@ -0,0 +1,297 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A replica of the Keras blog post [Keras as a simplified interface to TensorFlow](https://blog.keras.io/keras-as-a-simplified-interface-to-tensorflow-tutorial.html).\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We should start by creating a TensorFlow session and registering it with Keras. This means that Keras will use the session we registered to initialize all variables that it creates internally."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import tensorflow as tf\n",
"sess = tf.Session()\n",
"\n",
"from keras import backend as K\n",
"K.set_session(sess)"
]
},
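{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick sanity check that the registration took effect (a minimal sketch; `K.get_session()` returns the registered session as long as no other session has been made the TensorFlow default):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# After K.set_session(sess), Keras hands back the session we registered\n",
"print(K.get_session() is sess)  # expected: True"
]
},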
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's get started with our MNIST model. We can start building a classifier exactly as we would in TensorFlow:\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# this placeholder will contain our input digits, as flat vectors\n",
"img = tf.placeholder(tf.float32, shape=(None, 784))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can then use Keras layers to speed up the model definition process:\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from keras.layers import Dense\n",
"\n",
"# Keras layers can be called on TensorFlow tensors:\n",
"x = Dense(128, activation='relu')(img)  # fully-connected layer with 128 units and ReLU activation\n",
"x = Dense(128, activation='relu')(x)\n",
"preds = Dense(10, activation='softmax')(x)  # output layer with 10 units and a softmax activation"
]
},
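{
"cell_type": "markdown",
"metadata": {},
"source": [
"Because each layer above was called on a TensorFlow tensor, its output is itself an ordinary TensorFlow tensor that can be inspected with standard TF tooling. A minimal check (the exact tensor names will vary from run to run):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# The Keras layers return plain TensorFlow tensors:\n",
"print(x)      # e.g. Tensor(\"dense_2/Relu:0\", shape=(?, 128), dtype=float32)\n",
"print(preds)  # e.g. Tensor(\"dense_3/Softmax:0\", shape=(?, 10), dtype=float32)"
]
},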
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We define the placeholder for the labels and the loss function we will use:\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"labels = tf.placeholder(tf.float32, shape=(None, 10))\n",
"\n",
"from keras.objectives import categorical_crossentropy\n",
"loss = tf.reduce_mean(categorical_crossentropy(labels, preds))"
]
},
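{
"cell_type": "markdown",
"metadata": {},
"source": [
"For reference, `categorical_crossentropy` computes the standard per-sample cross-entropy between the one-hot labels and the softmax output, so the loss above is roughly equivalent to the plain-TF sketch below (the epsilon clip is an assumption made here for numerical stability; Keras applies a similar clip internally):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Rough plain-TF equivalent of the Keras loss above (a sketch, not the exact backend code)\n",
"eps = 1e-7  # assumed clipping constant to avoid log(0)\n",
"manual_loss = tf.reduce_mean(\n",
"    -tf.reduce_sum(labels * tf.log(tf.clip_by_value(preds, eps, 1.0)), axis=1))"
]
},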
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's train the model with a TensorFlow optimizer:\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from tensorflow.examples.tutorials.mnist import input_data\n",
"mnist_data = input_data.read_data_sets('MNIST_data', one_hot=True)\n",
"\n",
"train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)\n",
"\n",
"# Initialize all variables\n",
"init_op = tf.global_variables_initializer()\n",
"sess.run(init_op)\n",
"\n",
"# Run training loop\n",
"with sess.as_default():\n",
"    for i in range(100):\n",
"        batch = mnist_data.train.next_batch(50)\n",
"        train_step.run(feed_dict={img: batch[0],\n",
"                                  labels: batch[1]})"
]
},
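{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a side note, `train_step.run(...)` works above because `sess.as_default()` makes `sess` the default session for `Operation.run`. Equivalently, the loop can call `sess.run` explicitly; a minimal sketch of the same loop:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Equivalent training loop without relying on a default session\n",
"for i in range(100):\n",
"    batch = mnist_data.train.next_batch(50)\n",
"    sess.run(train_step, feed_dict={img: batch[0],\n",
"                                    labels: batch[1]})"
]
},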
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can now evaluate the model:\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from keras.metrics import categorical_accuracy as accuracy\n",
"\n",
"acc_value = accuracy(labels, preds)\n",
"with sess.as_default():\n",
"    print(acc_value.eval(feed_dict={img: mnist_data.test.images,\n",
"                                    labels: mnist_data.test.labels}))"
]
},
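{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note that `categorical_accuracy` returns one value per test sample rather than a single number (the printed output of the Dropout example further down shows exactly such a vector). To reduce it to a scalar accuracy, the per-sample values can be averaged; a minimal sketch:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Average the per-sample accuracy values into one scalar\n",
"mean_acc = tf.reduce_mean(acc_value)\n",
"with sess.as_default():\n",
"    print(mean_acc.eval(feed_dict={img: mnist_data.test.images,\n",
"                                   labels: mnist_data.test.labels}))"
]
},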
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Different behaviors during training and testing\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Some Keras layers (e.g. Dropout, BatchNormalization) behave differently at training time and testing time. You can tell whether a layer uses the \"learning phase\" (train/test) by printing `layer.uses_learning_phase`, a boolean:\n",
"* True if the layer has a different behavior in training mode and test mode\n",
"* False otherwise.\n",
"\n",
"If your model includes such layers, then you need to specify the value of the learning phase as part of feed_dict, so that your model knows whether to apply dropout etc. or not.\n",
"\n",
"The Keras learning phase (a scalar TensorFlow tensor) is accessible via the Keras backend:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from keras import backend as K\n",
"print(K.learning_phase())"
]
},
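{
"cell_type": "markdown",
"metadata": {},
"source": [
"As noted above, individual layers expose a `uses_learning_phase` flag. A quick check on a `Dropout` layer versus a `Dense` layer (a minimal sketch; `getattr` falls back to None on Keras versions where the flag is only set once the layer is used in a model):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from keras.layers import Dense, Dropout\n",
"\n",
"# Dropout behaves differently between train and test mode, Dense does not\n",
"for layer in [Dropout(0.5), Dense(10)]:\n",
"    print(type(layer).__name__, getattr(layer, 'uses_learning_phase', None))"
]
},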
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make use of the learning phase, simply pass the value 1 (training mode) or 0 (test mode) to feed_dict:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# train mode\n",
"with sess.as_default():\n",
"    train_step.run(feed_dict={img: batch[0], labels: batch[1], K.learning_phase(): 1})"
]
},
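{
"cell_type": "markdown",
"metadata": {},
"source": [
"The test-mode counterpart feeds 0 instead. For the current model (which has no learning-phase-dependent layers yet) this changes nothing, but it is the same pattern used in the Dropout example below:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# test mode\n",
"with sess.as_default():\n",
"    print(acc_value.eval(feed_dict={img: mnist_data.test.images,\n",
"                                    labels: mnist_data.test.labels,\n",
"                                    K.learning_phase(): 0}))"
]
},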
{
"cell_type": "markdown",
"metadata": {},
"source": [
"For instance, here's how to add Dropout layers to our previous MNIST example:\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Extracting MNIST_data\\train-images-idx3-ubyte.gz\n",
"Extracting MNIST_data\\train-labels-idx1-ubyte.gz\n",
"Extracting MNIST_data\\t10k-images-idx3-ubyte.gz\n",
"Extracting MNIST_data\\t10k-labels-idx1-ubyte.gz\n",
"[1. 1. 1. ... 0. 1. 1.]\n"
]
}
],
"source": [
"from keras.layers import Dropout, Dense\n",
"from keras import backend as K\n",
"from keras.objectives import categorical_crossentropy\n",
"import tensorflow as tf\n",
"from tensorflow.examples.tutorials.mnist import input_data\n",
"from keras.metrics import categorical_accuracy as accuracy\n",
"\n",
"sess = tf.Session()\n",
"K.set_session(sess)\n",
"\n",
"img = tf.placeholder(tf.float32, shape=(None, 784))\n",
"labels = tf.placeholder(tf.float32, shape=(None, 10))\n",
"\n",
"x = Dense(128, activation='relu')(img)\n",
"x = Dropout(0.5)(x)\n",
"x = Dense(128, activation='relu')(x)\n",
"x = Dropout(0.5)(x)\n",
"preds = Dense(10, activation='softmax')(x)\n",
"\n",
"loss = tf.reduce_mean(categorical_crossentropy(labels, preds))\n",
"\n",
"mnist_data = input_data.read_data_sets('MNIST_data', one_hot=True)\n",
"\n",
"train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)\n",
"\n",
"# Initialize all variables\n",
"init_op = tf.global_variables_initializer()\n",
"sess.run(init_op)\n",
"\n",
"# Train with dropout enabled (learning phase = 1)\n",
"with sess.as_default():\n",
"    for i in range(100):\n",
"        batch = mnist_data.train.next_batch(50)\n",
"        train_step.run(feed_dict={img: batch[0],\n",
"                                  labels: batch[1],\n",
"                                  K.learning_phase(): 1})\n",
"\n",
"# Evaluate with dropout disabled (learning phase = 0)\n",
"acc_value = accuracy(labels, preds)\n",
"with sess.as_default():\n",
"    print(acc_value.eval(feed_dict={img: mnist_data.test.images,\n",
"                                    labels: mnist_data.test.labels,\n",
"                                    K.learning_phase(): 0}))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.3"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
