- How to use Google Colab for Python programming?
- Python3 / notebook
- Numpy / notebook
- Matplotlib & Plotly / notebook
Optional:
- Practice Python at codewars.org
- From Python to Numpy
- 100 numpy exercises by Nicolas P. Rougier
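A tiny NumPy warm-up in the spirit of the practice resources above; the shapes and values below are arbitrary illustrations, not taken from the exercises.

```python
import numpy as np

# Create a 1D array of 12 numbers and reshape it into a 3x4 matrix
a = np.arange(12)            # shape (12,)
m = a.reshape(3, 4)          # shape (3, 4)

# Basic reductions along an axis
print(m.sum(axis=0))         # column sums, shape (4,)
print(m.mean(axis=1))        # row means, shape (3,)

# Broadcasting: subtract the column means from every row
centered = m - m.mean(axis=0)
print(centered.shape)        # (3, 4)

# Boolean masking: keep only the even entries
print(a[a % 2 == 0])         # [ 0  2  4  6  8 10]
```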
- Difference between AI, ML, and DL
- Introduction to deep learning
- The power of a hidden layer in neural networks (see the sketch below)
- How does machine learning (or deep learning) work? The intuition
- The four branches of machine learning
- Learning 'bleeding-edge' deep learning
Optional: Deep Learning In 5 Minutes | What Is Deep Learning?
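As a companion to "The power of a hidden layer in neural networks" above, a minimal sketch of the classic XOR example, assuming TensorFlow 2 / tf.keras (which the course uses later); the layer width, optimizer, and epoch count are arbitrary choices.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# The XOR problem: not linearly separable, so a single neuron cannot solve it
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype="float32")
y = np.array([0, 1, 1, 0], dtype="float32")

# One hidden layer with a non-linear activation is enough
model = keras.Sequential([
    keras.Input(shape=(2,)),
    layers.Dense(8, activation="relu"),      # the hidden layer
    layers.Dense(1, activation="sigmoid"),   # output: probability of class 1
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=500, verbose=0)       # tiny dataset, so many epochs are cheap
print(model.predict(X).round().ravel())      # should approach [0. 1. 1. 0.]
```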
- What are tensors? Matrix vs Tensor
- Tensors reshape automatically
- Examples of 3D, 4D, and 5D tensors (see the sketch below)
- The gears of neural networks: Tensor operations
- Geometric interpretation of deep learning
Optional: Lecture on TF2 by Josh Gordon @ Google
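A small NumPy sketch to accompany the tensor items above; the 3D/4D/5D shapes are illustrative assumptions (timeseries, image batches, video batches), not fixed conventions.

```python
import numpy as np

# Rank (number of axes) is what separates scalars, vectors, matrices, and higher tensors
scalar = np.array(7)                          # 0D tensor, shape ()
vector = np.array([1., 2., 3.])               # 1D tensor, shape (3,)
matrix = np.ones((3, 4))                      # 2D tensor, shape (3, 4)

# Typical higher-rank examples (shapes are illustrative):
timeseries = np.zeros((128, 60, 6))           # 3D: (samples, timesteps, features)
images     = np.zeros((128, 28, 28, 1))       # 4D: (samples, height, width, channels)
videos     = np.zeros((8, 30, 64, 64, 3))     # 5D: (samples, frames, height, width, channels)

# Reshaping only rearranges elements; the total count must stay the same
flat = images.reshape(128, 28 * 28)           # (128, 784) -- ready for a Dense layer

# Element-wise ("gears") tensor operations and a dot product
x = np.random.random((3, 4))
y = np.maximum(x, 0.)                         # relu, applied element-wise
z = np.dot(matrix, np.ones((4, 2)))           # (3, 4) . (4, 2) -> (3, 2)
print(flat.shape, y.shape, z.shape)
```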
- Introduction to Keras
- Keras is also the high-level API of TensorFlow 2 (tf.keras)
- Keras Sequential vs. Functional API (see the sketch below)
- Diversity of thought is holding back AI & deep learning research
- AlphaFold2: Example of the power of diversity
- Splitting data into a development set (training & validation) and a test set; Keras callbacks
- Binary classification using feed-forward neural networks / notebook
Optional: Francois Chollet interview
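To accompany "Keras Sequential vs. Functional API" and the binary-classification notebook above, a minimal sketch of the same two-layer binary classifier written both ways (tf.keras); the 20-feature input and layer sizes are placeholders.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Sequential API: a plain stack of layers
seq_model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# Functional API: the same model, but inputs/outputs are wired explicitly,
# which also allows multiple inputs, multiple outputs, and shared layers
inputs = keras.Input(shape=(20,))
x = layers.Dense(16, activation="relu")(inputs)
outputs = layers.Dense(1, activation="sigmoid")(x)
fun_model = keras.Model(inputs=inputs, outputs=outputs)

for model in (seq_model, fun_model):
    model.compile(optimizer="rmsprop", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()

# When fitting, a validation split and callbacks (see the items above) would look like:
# model.fit(x_train, y_train, validation_split=0.2, epochs=20,
#           callbacks=[keras.callbacks.EarlyStopping(patience=3)])
```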
- An image is all numbers (watch the first five minutes only)
- Data generators and image augmentation
- Image preprocessing / notebook
- Our eyes and the human visual system: Biological inspiration for convolutional neural networks
- Our eyes have blind spots / article
- Feed-forward (dense) vs. convolutional neural networks
- Hulk vs. Ant-Man
- The convolution operation
- A convolutional neuron (filter): An example
- The two main parameters of a convolutional layer
- How to calculate the number of parameters in a convolutional neural network? Some examples (see the sketch below)
- Border effects, padding, and max pooling
- Separable convolutions and dilated convolutions
- A practical example: What can one convolutional neuron do? Detect a square. / notebook
- Classify MNIST digits using a CNN / notebook
Reading: Intuitively Understanding Convolutions for Deep Learning / alternative article
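A sketch tying together the parameter-counting and MNIST items above (tf.keras; the filter counts are arbitrary choices). Each Conv2D layer has (kernel_height * kernel_width * input_channels + 1) * filters parameters, and model.summary() should confirm the numbers in the comments.

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),                    # MNIST digits: 28x28 grayscale
    layers.Conv2D(32, (3, 3), activation="relu"),      # (3*3*1 + 1) * 32  = 320 params
    layers.MaxPooling2D((2, 2)),                       # pooling has no parameters
    layers.Conv2D(64, (3, 3), activation="relu"),      # (3*3*32 + 1) * 64 = 18,496 params
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),                                  # 5*5*64 = 1,600 features
    layers.Dense(10, activation="softmax"),            # (1600 + 1) * 10 = 16,010 params
])
model.summary()  # the "Param #" column should match the formulas in the comments
```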
- How to choose the last layer’s activation and loss in a neural network? (see the sketch below)
- Softmax activation & other activations for deep neural networks
- How to choose a loss function for a regression problem?
- Cross entropy loss (log loss) for binary classification
- Categorical cross-entropy loss (softmax loss) for multi-class classification
- How to choose an optimizer for a TensorFlow Keras model?
- The Blind Men and the Elephant
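To tie together the items above on choosing the last layer's activation, the loss, and the optimizer, a rule-of-thumb sketch (tf.keras); the helper function, layer sizes, and input shape are purely illustrative.

```python
from tensorflow import keras
from tensorflow.keras import layers

def output_head(problem, num_classes=None):
    """Return (last layer, loss) for a given problem type -- a rule-of-thumb summary."""
    if problem == "binary":            # two classes, one probability
        return layers.Dense(1, activation="sigmoid"), "binary_crossentropy"
    if problem == "multiclass":        # one label out of num_classes (one-hot targets)
        return layers.Dense(num_classes, activation="softmax"), "categorical_crossentropy"
    if problem == "regression":        # unbounded real-valued target
        return layers.Dense(1, activation=None), "mse"
    raise ValueError(problem)

last_layer, loss = output_head("multiclass", num_classes=10)
model = keras.Sequential([
    keras.Input(shape=(16,)),
    layers.Dense(32, activation="relu"),
    last_layer,
])
# "rmsprop" and "adam" are sensible default optimizers to start with
model.compile(optimizer="rmsprop", loss=loss, metrics=["accuracy"])
```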
- Evaluating machine learning models: Measuring generalization
- Overfitting (variance) and underfitting (bias)
- How to prevent overfitting? Regularization techniques in deep learning (see the sketch below)
- L1 and L2 regularization
- Regularization using Dropout
- Regularization using Batch Normalization
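A combined sketch of the three regularization techniques above (L2 weight penalty, dropout, batch normalization) in tf.keras; the 0.001 penalty and 0.5 dropout rate are common defaults, not prescriptions.

```python
from tensorflow import keras
from tensorflow.keras import layers, regularizers

model = keras.Sequential([
    keras.Input(shape=(20,)),
    # L2 regularization penalizes large weights (use regularizers.l1 for L1)
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l2(0.001)),
    # Batch normalization re-centers and re-scales activations, stabilizing training
    layers.BatchNormalization(),
    # Dropout randomly zeroes 50% of activations at training time
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```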
- How to train deeper convolutional neural networks? / notebook
- Deep learning workflow/Recipe: From data to deep learning model
- How to debug a deep learning development pipeline? - slides
- Feature engineering - slides
- Multi-input and multi-output models
- Layer weight sharing (the Siamese LSTM)
- GPUs for deep learning - slides
- Transfer learning - slides / notebook (see the sketch below)
- What is Explainable AI (XAI)?
- Techniques for interpreting a deep learning model
Reading: Neural Network Follies
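For the transfer-learning item above, a minimal tf.keras sketch: reuse a VGG16 base pretrained on ImageNet, freeze it, and train only a new classifier head. The 180x180 input size and the single sigmoid output are arbitrary choices, not the course's setup.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Pretrained convolutional base (ImageNet weights), without its original classifier
conv_base = keras.applications.VGG16(weights="imagenet", include_top=False,
                                     input_shape=(180, 180, 3))
conv_base.trainable = False   # freeze the base: only the new head will be trained

model = keras.Sequential([
    conv_base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # e.g. a binary image classifier
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# For fine-tuning, one would later unfreeze the top few layers of conv_base
# and continue training with a much smaller learning rate.
```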
Optional: Resources for implementing the backpropagation algorithm (a from-scratch sketch follows this list):
- Yes you should understand backprop
- CSE 599G1: Deep Learning System, University of Washington
- Backpropagation, NNets
Optional: Andrew Ng: Advice on Getting Started in Deep Learning
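To accompany the backpropagation resources above, a from-scratch sketch in plain NumPy (one hidden layer, mean-squared-error loss); the toy data, layer sizes, and learning rate are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((64, 3))                     # 64 samples, 3 features (toy data)
y = X.sum(axis=1, keepdims=True)            # a simple target to regress

# One hidden layer: 3 -> 8 -> 1
W1, b1 = rng.normal(0, 0.5, (3, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros((1, 1))
lr = 0.1

for step in range(2000):
    # Forward pass
    h = np.maximum(X @ W1 + b1, 0.0)        # ReLU hidden layer
    y_hat = h @ W2 + b2                     # linear output
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: chain rule, layer by layer
    d_yhat = 2 * (y_hat - y) / len(X)       # dL/dy_hat
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0, keepdims=True)
    d_h = d_yhat @ W2.T
    d_h[h <= 0] = 0.0                       # ReLU gradient
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0, keepdims=True)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
```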