Course topics

1. Introduction to Python and Numpy

  1. How to use Google Colab for Python programming?
  2. Python3 / notebook
  3. Numpy / notebook
  4. Matplotlib & Plotly / notebook

Optional:

  1. Practice Python at codewars.org
  2. From Python to Numpy
  3. 100 numpy exercises by Nicolas P. Rougier
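The NumPy material above centers on array creation, broadcasting, and vectorized reductions. A minimal sketch of those three ideas (the array values here are illustrative, not from any lecture):

```python
import numpy as np

# Create a 2-D array and inspect its shape.
a = np.arange(6).reshape(2, 3)   # [[0, 1, 2], [3, 4, 5]]

# Broadcasting: the 1-D row [10, 20, 30] is added to every row of `a`.
b = a + np.array([10, 20, 30])

# Vectorized reductions replace explicit Python loops.
col_means = b.mean(axis=0)       # mean over rows -> one value per column
```

Exercises in the style above are exactly what "100 numpy exercises" drills.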

2. Introduction to deep learning (Sections 1.1, 1.2, 1.3, and 4.1)

  1. Difference between AI, ML, and DL
  2. Introduction to deep learning
  3. The power of a hidden layer in neural networks
  4. How does machine learning (or deep learning) work? The intuition
  5. The four branches of machine learning
  6. Learning 'bleeding-edge' deep learning

Optional: Deep Learning In 5 Minutes | What Is Deep Learning?
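Item 3 above ("the power of a hidden layer") is classically illustrated with XOR: no single linear layer can compute it, but one hidden layer can. A sketch with hand-picked weights (the weights are chosen by hand for illustration, not learned):

```python
import numpy as np

def step(x):
    # Heaviside step activation: 1 if the input is positive, else 0.
    return (x > 0).astype(float)

# All four XOR input pairs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

# Hidden layer of two neurons: h1 computes OR, h2 computes AND.
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])        # OR and AND thresholds
# Output neuron: OR minus 2*AND, i.e. "OR and not AND" = XOR.
W2 = np.array([1.0, -2.0])
b2 = -0.5

h = step(X @ W1 + b1)              # hidden activations
y = step(h @ W2 + b2)              # network output: XOR of the two inputs
```

The hidden layer bends the input space so the output neuron can separate the classes with a single line, which a one-layer network cannot do.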

3. Data representations & tensor operations (Sections 2.2, 2.3, and 2.4)

  1. What are tensors? Matrix vs Tensor
  2. Tensors reshape automatically
  3. Examples of 3D, 4D, and 5D tensors
  4. The gears of neural networks: Tensor operations
  5. Geometric interpretation of deep learning

Optional: Lecture on TF2 by Josh Gordon @ Google
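The tensor ranks and the "gears" (tensor operations) listed above can be sketched in NumPy; the shapes below are illustrative (a small MNIST-style batch), not tied to any specific lecture:

```python
import numpy as np

# Rank-3 tensor: a batch of grayscale images, (samples, height, width).
images = np.zeros((4, 28, 28))
# Rank-4 tensor: a batch of colour images, (samples, height, width, channels).
colour = np.zeros((4, 28, 28, 3))

# A core tensor operation of neural networks: the dense-layer product x W + b.
x = np.ones((4, 28 * 28))            # batch of flattened images
W = np.full((28 * 28, 10), 0.01)     # weights of a 10-unit dense layer
b = np.zeros(10)
out = x @ W + b                      # shape (4, 10): one score vector per sample
```

A rank-5 tensor adds one more axis, e.g. (samples, frames, height, width, channels) for batches of video.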

4. Introduction to Keras (Sections 3.2 and 3.3)

  1. Introduction to Keras
  2. Keras is also an API in Tensorflow2
  3. Keras sequential vs functional API
  4. Diversity of thought is holding back AI & deep learning research
  5. AlphaFold2: Example of the power of diversity
  6. Splitting data into a development set (training & validation) and a test set + Callbacks
  7. Binary classification using feed-forward neural networks / notebook

Optional: Francois Chollet interview
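Item 6 above, the development/test split, can be sketched in plain NumPy. This is a minimal illustration (the fractions and the helper name `split_data` are assumptions, not from the course materials):

```python
import numpy as np

def split_data(X, y, val_frac=0.2, test_frac=0.2, seed=0):
    """Shuffle, then split into (training, validation, test) sets."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))              # shuffle before splitting
    n_test = int(len(X) * test_frac)
    n_val = int(len(X) * val_frac)
    test = idx[:n_test]
    val = idx[n_test:n_test + n_val]
    train = idx[n_test + n_val:]
    return (X[train], y[train]), (X[val], y[val]), (X[test], y[test])

X = np.arange(100).reshape(100, 1)
y = np.arange(100)
train, val, test = split_data(X, y)            # 60 / 20 / 20 split
```

The validation set tunes hyperparameters; the test set is touched only once, at the very end, to measure generalization.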

5. Preparing images for deep learning (Sections 3.6.2, 5.2.4, and 5.2.5)

  1. An image is all numbers (watch the first five minutes only)
  2. Data generators and image augmentation
  3. Image preprocessing / notebook
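The three items above can be sketched in a few lines of NumPy: an image is a numeric tensor, preprocessing rescales it, and augmentation produces cheap variants. The random image here is a stand-in for real data:

```python
import numpy as np

# An image is just a tensor of numbers: pixel intensities in [0, 255].
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(28, 28, 3)).astype("float32")

# Standard preprocessing: rescale to [0, 1] before feeding a network.
img_scaled = img / 255.0

# A simple augmentation: horizontal flip (mirror the width axis).
img_flipped = img_scaled[:, ::-1, :]
```

Data generators apply transformations like the flip above on the fly, so the model sees a slightly different version of each image every epoch.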

6. The convolution operation (Section 5.1.1)

  1. Our eye and human visual system: Biological inspiration for convolutional neural networks
  2. Our eyes have blind spots / article
  3. Feed-forward (dense) vs Convolutional Neural networks
  4. Hulk vs. Ant Man
  5. The convolution operation
  6. A convolutional neuron (filter): An example
  7. The two main parameters of a convolutional layer
  8. How to calculate the number of parameters in a convolutional neural network? Some examples
  9. Border effect, padding, and maxpooling
  10. Separable convolutions and dilated convolutions
  11. A practical example: What can one convolutional neuron do? Detect a square. / notebook
  12. Classify MNIST digits using a CNN / notebook

Reading: Intuitively Understanding Convolutions for Deep Learning / alternative article
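Items 5 and 8 above can be made concrete with a naive convolution and the standard parameter-count formula. A minimal sketch (the edge-detection kernel and the layer sizes are illustrative assumptions):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 2-D convolution (cross-correlation), 'valid' padding, stride 1."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))   # border effect: output shrinks
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def conv_layer_params(kh, kw, in_channels, filters):
    """One kh*kw*in_channels kernel plus one bias per filter."""
    return (kh * kw * in_channels + 1) * filters

# A 3x3 vertical-edge kernel applied to a tiny black/white image.
image = np.zeros((5, 5))
image[:, 2:] = 1.0                             # right half is white
edges = conv2d_valid(image, np.array([[-1, 0, 1]] * 3, dtype=float))

# E.g. a first conv layer: 32 filters of 3x3 on an RGB input.
n_params = conv_layer_params(3, 3, 3, 32)      # (3*3*3 + 1) * 32 = 896
```

The output is strong exactly where the dark-to-light edge sits, which is the "one neuron can detect a square edge" idea from item 11.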

7. Activations & loss functions (Sections 4.5.5, and Table 4.1)

  1. How to choose the last layer’s activation and loss in a neural network?
  2. Softmax activation & other activations for deep neural networks
  3. How to choose a loss function for a regression problem?
  4. Cross-entropy loss (log loss) for binary classification
  5. Categorical cross-entropy loss (softmax loss) for multi-class classification
  6. How to choose an optimizer for a Tensorflow Keras model?
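Softmax and categorical cross-entropy from the list above fit in a few lines of NumPy. A minimal sketch (the logit values are made up for illustration):

```python
import numpy as np

def softmax(z):
    """Softmax with the max-subtraction trick for numerical stability."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def categorical_cross_entropy(probs, true_idx, eps=1e-12):
    """Log loss for one sample: -log(probability assigned to the true class)."""
    return -np.log(probs[true_idx] + eps)

logits = np.array([2.0, 1.0, 0.1])       # raw scores from the last layer
p = softmax(logits)                      # probabilities, sum to 1
loss = categorical_cross_entropy(p, 0)   # small when class 0 gets high probability
```

The loss shrinks toward 0 as the probability of the true class approaches 1, and blows up as it approaches 0, which is what drives learning.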

8. Model evaluation, overfitting, underfitting, & regularization (Sections 4.2, 4.4, and 4.5)

  1. The Blind Men and the Elephant
  2. Evaluating machine learning models: Measuring generalization
  3. Overfitting (variance) and underfitting (bias)
  4. How to prevent overfitting? Regularization techniques in deep learning
  5. L1 and L2 regularization
  6. Regularization using Dropout
  7. Regularization using Batch Normalization
  8. How to train deeper convolutional neural networks? / notebook
  9. Deep learning workflow/Recipe: From data to deep learning model
  10. How to debug a deep learning development pipeline?
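Two of the regularization techniques above, L2 penalties and dropout, can be sketched directly in NumPy (the weight values, `lam`, and the dropout rate are illustrative assumptions):

```python
import numpy as np

def l2_penalty(weights, lam=0.01):
    """L2 regularization: lam * sum of squared weights, added to the loss."""
    return lam * np.sum(weights ** 2)

def dropout(activations, rate=0.5, rng=None, training=True):
    """Inverted dropout: zero a fraction `rate` of units, rescale the rest."""
    if not training:
        return activations                 # dropout is disabled at test time
    rng = rng or np.random.default_rng(0)
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

W = np.array([[1.0, -2.0], [0.5, 0.0]])
penalty = l2_penalty(W)                    # 0.01 * (1 + 4 + 0.25) = 0.0525
h = dropout(np.ones((2, 4)), rate=0.5)     # entries are either 0 or 2
```

Both fight overfitting from the list above: L2 discourages large weights, dropout prevents units from co-adapting.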

9. Classic CNN architectures (Sections 5.1.1, 5.1.2, and 7.1)

10. Deep learning practices (Sections 4.3, 5.3, 5.4, and 7.1)

  1. slides
  2. Feature engineering - slides
  3. Multi-input and multi-output models
  4. Layer weight sharing (The Siamese LSTM)
  5. GPUs for deep learning - slides
  6. Transfer learning - slides / notebook
  7. What is Explainable AI (XAI)?
  8. Techniques for interpreting a deep learning model

Reading: Neural Network Follies
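The weight-sharing idea behind the Siamese architecture (item 4 above) reduces to applying one set of weights to both inputs. A minimal NumPy sketch, with a tanh encoder standing in for the LSTM:

```python
import numpy as np

def encode(x, W):
    """Shared encoder: the SAME weight matrix W embeds every input."""
    return np.tanh(x @ W)

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 3))          # one set of weights, reused by both branches
x1 = rng.normal(size=(5,))
x2 = rng.normal(size=(5,))

e1 = encode(x1, W)                   # both branches call the same encoder,
e2 = encode(x2, W)                   # so their weights are tied by construction
distance = np.linalg.norm(e1 - e2)   # similarity score between the two inputs
```

Because both branches share W, gradients from both inputs update the same parameters, which is what "layer weight sharing" means in Keras's functional API.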

11. Limitations of deep learning (Section 9.2)

  1. Goals of deep learning
  2. Limitations of deep learning

Optional: Resources for implementing the backpropagation algorithm:

Optional: Andrew Ng: Advice on Getting Started in Deep Learning