Output:
(base) raschka@lambda-quad:~/code/stat453-ss21-exp$ python simple_cnn.py
PyTorch version: 1.7.0
Using cuda:0
[W Context.cpp:69] Warning: torch.set_deterministic is in beta, and its design and functionality may change in the future. (function operator())
Epoch: 001/010 | Batch 0000/0422 | Loss: 2.2935
Epoch: 001/010 | Batch 0050/0422 | Loss: 0.5462
Epoch: 001/010 | Batch 0100/0422 | Loss: 0.3154
Epoch: 001/010 | Batch 0150/0422 | Loss: 0.2551
Epoch: 001/010 | Batch 0200/0422 | Loss: 0.1792
Epoch: 001/010 | Batch 0250/0422 | Loss: 0.2210
Epoch: 001/010 | Batch 0300/0422 | Loss: 0.1551
Epoch: 001/010 | Batch 0350/0422 | Loss: 0.2155
Epoch: 001/010 | Batch 0400/0422 | Loss: 0.2306
Epoch: 001/010 | Train: 96.72% | Validation: 97.15%
Time elapsed: 0.09 min
Epoch: 002/010 | Batch 0000/0422 | Loss: 0.1028
Epoch: 002/010 | Batch 0050/0422 | Loss: 0.1167
Epoch: 002/010 | Batch 0100/0422 | Loss: 0.0660
Epoch: 002/010 | Batch 0150/0422 | Loss: 0.1024
Epoch: 002/010 | Batch 0200/0422 | Loss: 0.0847
Epoch: 002/010 | Batch 0250/0422 | Loss: 0.0905
Epoch: 002/010 | Batch 0300/0422 | Loss: 0.1024
Epoch: 002/010 | Batch 0350/0422 | Loss: 0.0719
Epoch: 002/010 | Batch 0400/0422 | Loss: 0.1302
Epoch: 002/010 | Train: 98.12% | Validation: 98.03%
Time elapsed: 0.18 min
Epoch: 003/010 | Batch 0000/0422 | Loss: 0.0720
Epoch: 003/010 | Batch 0050/0422 | Loss: 0.0984
Epoch: 003/010 | Batch 0100/0422 | Loss: 0.0373
Epoch: 003/010 | Batch 0150/0422 | Loss: 0.0685
Epoch: 003/010 | Batch 0200/0422 | Loss: 0.0511
Epoch: 003/010 | Batch 0250/0422 | Loss: 0.0617
Epoch: 003/010 | Batch 0300/0422 | Loss: 0.0748
Epoch: 003/010 | Batch 0350/0422 | Loss: 0.0425
Epoch: 003/010 | Batch 0400/0422 | Loss: 0.1077
Epoch: 003/010 | Train: 98.71% | Validation: 98.37%
Time elapsed: 0.27 min
Epoch: 004/010 | Batch 0000/0422 | Loss: 0.0547
Epoch: 004/010 | Batch 0050/0422 | Loss: 0.0839
Epoch: 004/010 | Batch 0100/0422 | Loss: 0.0249
Epoch: 004/010 | Batch 0150/0422 | Loss: 0.0443
Epoch: 004/010 | Batch 0200/0422 | Loss: 0.0452
Epoch: 004/010 | Batch 0250/0422 | Loss: 0.0413
Epoch: 004/010 | Batch 0300/0422 | Loss: 0.0563
Epoch: 004/010 | Batch 0350/0422 | Loss: 0.0298
Epoch: 004/010 | Batch 0400/0422 | Loss: 0.0999
Epoch: 004/010 | Train: 98.97% | Validation: 98.37%
Time elapsed: 0.36 min
Epoch: 005/010 | Batch 0000/0422 | Loss: 0.0443
Epoch: 005/010 | Batch 0050/0422 | Loss: 0.0711
Epoch: 005/010 | Batch 0100/0422 | Loss: 0.0258
Epoch: 005/010 | Batch 0150/0422 | Loss: 0.0362
Epoch: 005/010 | Batch 0200/0422 | Loss: 0.0346
Epoch: 005/010 | Batch 0250/0422 | Loss: 0.0330
Epoch: 005/010 | Batch 0300/0422 | Loss: 0.0366
Epoch: 005/010 | Batch 0350/0422 | Loss: 0.0238
Epoch: 005/010 | Batch 0400/0422 | Loss: 0.0839
Epoch: 005/010 | Train: 99.15% | Validation: 98.42%
Time elapsed: 0.44 min
Epoch: 006/010 | Batch 0000/0422 | Loss: 0.0283
Epoch: 006/010 | Batch 0050/0422 | Loss: 0.0494
Epoch: 006/010 | Batch 0100/0422 | Loss: 0.0235
Epoch: 006/010 | Batch 0150/0422 | Loss: 0.0250
Epoch: 006/010 | Batch 0200/0422 | Loss: 0.0249
Epoch: 006/010 | Batch 0250/0422 | Loss: 0.0245
Epoch: 006/010 | Batch 0300/0422 | Loss: 0.0287
Epoch: 006/010 | Batch 0350/0422 | Loss: 0.0173
Epoch: 006/010 | Batch 0400/0422 | Loss: 0.0473
Epoch: 006/010 | Train: 99.33% | Validation: 98.65%
Time elapsed: 0.53 min
Epoch: 007/010 | Batch 0000/0422 | Loss: 0.0227
Epoch: 007/010 | Batch 0050/0422 | Loss: 0.0279
Epoch: 007/010 | Batch 0100/0422 | Loss: 0.0267
Epoch: 007/010 | Batch 0150/0422 | Loss: 0.0123
Epoch: 007/010 | Batch 0200/0422 | Loss: 0.0185
Epoch: 007/010 | Batch 0250/0422 | Loss: 0.0124
Epoch: 007/010 | Batch 0300/0422 | Loss: 0.0234
Epoch: 007/010 | Batch 0350/0422 | Loss: 0.0142
Epoch: 007/010 | Batch 0400/0422 | Loss: 0.0219
Epoch: 007/010 | Train: 99.48% | Validation: 98.80%
Time elapsed: 0.62 min
Epoch: 008/010 | Batch 0000/0422 | Loss: 0.0131
Epoch: 008/010 | Batch 0050/0422 | Loss: 0.0101
Epoch: 008/010 | Batch 0100/0422 | Loss: 0.0275
Epoch: 008/010 | Batch 0150/0422 | Loss: 0.0058
Epoch: 008/010 | Batch 0200/0422 | Loss: 0.0120
Epoch: 008/010 | Batch 0250/0422 | Loss: 0.0077
Epoch: 008/010 | Batch 0300/0422 | Loss: 0.0172
Epoch: 008/010 | Batch 0350/0422 | Loss: 0.0089
Epoch: 008/010 | Batch 0400/0422 | Loss: 0.0161
Epoch: 008/010 | Train: 99.53% | Validation: 98.87%
Time elapsed: 0.71 min
Epoch: 009/010 | Batch 0000/0422 | Loss: 0.0110
Epoch: 009/010 | Batch 0050/0422 | Loss: 0.0083
Epoch: 009/010 | Batch 0100/0422 | Loss: 0.0286
Epoch: 009/010 | Batch 0150/0422 | Loss: 0.0072
Epoch: 009/010 | Batch 0200/0422 | Loss: 0.0118
Epoch: 009/010 | Batch 0250/0422 | Loss: 0.0147
Epoch: 009/010 | Batch 0300/0422 | Loss: 0.0149
Epoch: 009/010 | Batch 0350/0422 | Loss: 0.0021
Epoch: 009/010 | Batch 0400/0422 | Loss: 0.0428
Epoch: 009/010 | Train: 99.43% | Validation: 98.70%
Time elapsed: 0.80 min
Epoch: 010/010 | Batch 0000/0422 | Loss: 0.0095
Epoch: 010/010 | Batch 0050/0422 | Loss: 0.0113
Epoch: 010/010 | Batch 0100/0422 | Loss: 0.0135
Epoch: 010/010 | Batch 0150/0422 | Loss: 0.0028
Epoch: 010/010 | Batch 0200/0422 | Loss: 0.0019
Epoch: 010/010 | Batch 0250/0422 | Loss: 0.0049
Epoch: 010/010 | Batch 0300/0422 | Loss: 0.0132
Epoch: 010/010 | Batch 0350/0422 | Loss: 0.0114
Epoch: 010/010 | Batch 0400/0422 | Loss: 0.0270
Epoch: 010/010 | Train: 99.55% | Validation: 98.68%
Time elapsed: 0.88 min
Total Training Time: 0.88 min
Test accuracy 98.66%
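
For reference, here is a minimal sketch of a training script that would produce logging in the format shown above. It is an illustration, not the actual simple_cnn.py: the network architecture, optimizer (Adam, lr=0.001), batch size (128), and the 54,000/6,000 MNIST train/validation split are assumptions rather than details taken from the log. The split and batch size are chosen so that one epoch covers the 422 training batches seen above (54,000 / 128 = 421.9, rounded up to 422). The torch.set_deterministic call reproduces the beta warning printed at the top of the log; the function was renamed to torch.use_deterministic_algorithms in PyTorch 1.8, so the sketch tries both.

import os
import time

import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, transforms

# Reproducibility setup. torch.set_deterministic() was a beta API in
# PyTorch 1.7 (hence the warning in the log above) and was renamed to
# torch.use_deterministic_algorithms() in 1.8. On CUDA, deterministic
# matrix multiplies additionally require CUBLAS_WORKSPACE_CONFIG.
os.environ.setdefault('CUBLAS_WORKSPACE_CONFIG', ':4096:8')
torch.manual_seed(1)
try:
    torch.set_deterministic(True)              # PyTorch 1.7
except AttributeError:
    torch.use_deterministic_algorithms(True)   # PyTorch 1.8+

DEVICE = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')
print(f'PyTorch version: {torch.__version__}')
print(f'Using {DEVICE}')


class SimpleCNN(torch.nn.Module):
    # Hypothetical stand-in for the CNN defined in simple_cnn.py;
    # the actual architecture may differ.
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = torch.nn.Sequential(
            torch.nn.Conv2d(1, 8, kernel_size=3, padding=1),
            torch.nn.ReLU(),
            torch.nn.MaxPool2d(2),
            torch.nn.Conv2d(8, 16, kernel_size=3, padding=1),
            torch.nn.ReLU(),
            torch.nn.MaxPool2d(2),
        )
        self.classifier = torch.nn.Linear(16 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, start_dim=1))


def compute_accuracy(model, loader):
    # Percentage of correctly classified examples in the loader.
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for features, targets in loader:
            features, targets = features.to(DEVICE), targets.to(DEVICE)
            predictions = model(features).argmax(dim=1)
            correct += (predictions == targets).sum().item()
            total += targets.size(0)
    return 100.0 * correct / total


# Assumed data setup: 54,000/6,000 train/validation split at batch
# size 128 yields the 422 training batches per epoch seen in the log.
train_full = datasets.MNIST('data', train=True, download=True,
                            transform=transforms.ToTensor())
test_set = datasets.MNIST('data', train=False,
                          transform=transforms.ToTensor())
train_set, valid_set = random_split(train_full, [54000, 6000])
train_loader = DataLoader(train_set, batch_size=128, shuffle=True)
valid_loader = DataLoader(valid_set, batch_size=128, shuffle=False)
test_loader = DataLoader(test_set, batch_size=128, shuffle=False)

model = SimpleCNN().to(DEVICE)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

NUM_EPOCHS = 10
start = time.time()
for epoch in range(NUM_EPOCHS):
    model.train()
    for batch_idx, (features, targets) in enumerate(train_loader):
        features, targets = features.to(DEVICE), targets.to(DEVICE)
        loss = F.cross_entropy(model(features), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        if batch_idx % 50 == 0:  # log every 50th batch, as in the log above
            print(f'Epoch: {epoch+1:03d}/{NUM_EPOCHS:03d} '
                  f'| Batch {batch_idx:04d}/{len(train_loader):04d} '
                  f'| Loss: {loss.item():.4f}')
    print(f'Epoch: {epoch+1:03d}/{NUM_EPOCHS:03d} '
          f'| Train: {compute_accuracy(model, train_loader):.2f}% '
          f'| Validation: {compute_accuracy(model, valid_loader):.2f}%')
    print(f'Time elapsed: {(time.time() - start)/60:.2f} min')

print(f'Total Training Time: {(time.time() - start)/60:.2f} min')
print(f'Test accuracy {compute_accuracy(model, test_loader):.2f}%')

Note that deterministic mode only makes reruns reproducible on the same hardware and PyTorch version; it does not change the results themselves. In the run above, validation accuracy plateaus around 98.7%-98.9% from epoch 7 onward while training accuracy climbs past 99.5%, and the final test accuracy (98.66%) closely matches the last validation estimates.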