Implementations of various deep learning architectures: MLPs, CNNs, RNNs, Seq2Seq, and GANs.
Neural networks implemented from scratch in NumPy
- Multi-Layer Perceptrons:
  - Fully-Connected 1-Layer - classification on College Admissions dataset
  - Fully-Connected 2-Layer for Classification - classification on Fashion MNIST
  - Fully-Connected 2-Layer for Regression - regression on Bike Sharing dataset
- Recurrent Models:
  - Vanilla RNN Char-Level Many-2-One - synthetic counting task
  - Vanilla RNN Char-Level Many-2-Many - generate dinosaur names
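All of the from-scratch notebooks build on the same NumPy ingredients: a forward pass, a loss, and hand-derived gradients. A minimal sketch of a fully-connected 2-layer classifier trained with backprop (the toy data here is invented and merely stands in for the real datasets):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)
# Toy binary-classification data (illustrative stand-in for a real dataset)
X = rng.normal(size=(32, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# 2-layer MLP: 4 inputs -> 8 hidden units -> 1 output
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr, losses = 0.5, []
for _ in range(200):
    # Forward pass
    h = sigmoid(X @ W1 + b1)                             # hidden activations
    p = np.clip(sigmoid(h @ W2 + b2), 1e-7, 1 - 1e-7)    # output probabilities
    losses.append(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))
    # Backward pass (sigmoid output + binary cross-entropy)
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * h * (1 - h)
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    # Gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Each from-scratch notebook extends this pattern with its own architecture and loss.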
Neural networks implemented with the keras.layers API
- Multi-Layer Perceptrons:
  - Multi-Layer Perceptron - classification on College Admissions
  - Multi-Layer Perceptron - classification on MNIST
  - Multi-Layer Perceptron - sentiment analysis on IMDB
- Convolutional Models:
  - Convolutional Neural Network - classification on CIFAR-10
  - CNN with Batch Normalization - classification on CIFAR-10
  - CNN with Data Augmentation - classification on CIFAR-10
  - ResNet-50 in Keras Layers API - classification on Oxford VGG Flowers 17 dataset
  - ResNet-50 Transfer Learning - classification on Oxford VGG Flowers 17 dataset
- Recurrent Models:
  - Seq2Seq LSTM with Embeddings - English-to-French translation on a small corpus
- Generative Models:
  - Autoencoder - fully-connected autoencoder applied to MNIST
  - Vanilla GAN - fully-connected GAN on MNIST dataset
  - DCGAN - deep convolutional GAN on CelebA dataset (incomplete)
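As a rough sketch of the keras.layers style these notebooks use (a hypothetical small MLP classifier, not taken from any specific notebook):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical MLP for 28x28 grayscale images flattened to 784 features
model = keras.Sequential([
    layers.Input(shape=(784,)),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.2),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Forward pass on random data just to show the shapes
x = np.random.rand(32, 784).astype("float32")
preds = model.predict(x, verbose=0)  # shape (32, 10), each row a softmax distribution
```

The convolutional and recurrent notebooks swap in `Conv2D`, `LSTM`, etc., but keep the same build/compile/fit workflow.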
This section includes dataset preprocessing notebooks, which must be run before the corresponding neural network notebooks.
- Image datasets:
  - Tiny ImageNet - download, explore, and convert to .npz
  - Oxford VGG Flowers 17 - download, explore, and convert to .npz
  - Stanford Dogs - download, explore, and convert to .npz
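The .npz conversion step boils down to `np.savez_compressed`; a sketch of the pattern (the array shapes and file name here are invented for illustration):

```python
import numpy as np

# Stand-in for images/labels produced by a preprocessing notebook
images = np.random.randint(0, 256, size=(17, 64, 64, 3), dtype=np.uint8)
labels = np.arange(17)

# Pack everything into one compressed archive...
np.savez_compressed("dataset_sample.npz", images=images, labels=labels)

# ...so a training notebook can load arrays back by key
data = np.load("dataset_sample.npz")
restored_images, restored_labels = data["images"], data["labels"]
```

Storing a single .npz per dataset keeps the training notebooks free of download and decoding logic.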
Debugging techniques: tracking input/output distributions, individual neuron weights, gradients, and preactivation histograms.
Note: this notebook has useful diagnostic plots but no accompanying description.
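The tracking described above reduces to collecting per-layer statistics during training; a minimal sketch for one layer's preactivations (the layer shapes here are illustrative, not from the notebook):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(0.0, 1.0, size=(100, 100))   # weights of one dense layer
x = rng.normal(0.0, 1.0, size=(64, 100))    # a batch of inputs

z = x @ W  # preactivations (values before the nonlinearity)

# Summary statistics worth logging every few steps
stats = {"z_mean": z.mean(), "z_std": z.std(),
         "w_mean": W.mean(), "w_std": W.std()}

# Histogram of preactivations (the quantity plotted over training time)
hist, edges = np.histogram(z, bins=20)
```

Plotting these histograms across training steps makes saturation and vanishing/exploding gradients visible early.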