Lesson 3

Lesson resources

Assignments

  • Read through the following notebooks carefully:
    • lesson1.ipynb
    • lesson2.ipynb
    • sgd-intro.ipynb
    • convolution-intro.ipynb
    • lesson3.ipynb
  • Ask at least one question on the forums. Possible types of questions include:
    • What is the purpose of <something>?
    • Why do we write <something> in Python?
    • Why is the output of <something> equal to <something>, instead of <something else>?
    • How do I fix this error?
    • Could I also use <some technique> for <something>?
    • Why am I not getting a better position on the leaderboard for <competition> using <some process>?
  • Be sure that you can independently replicate the steps shown in each lesson notebook so far
  • Get a result in the top 50% of Dogs vs. Cats, if you haven't already
  • Get a result in the top 50% of State Farm
    • Be sure that you have created a validation set that gives similar results to submitting to Kaggle (a minimal sketch of building such a split follows this list)
    • Think about which layers of the pre-trained model are likely to be the most useful
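
A minimal sketch of one way to build such a validation split from a Kaggle-style train/ directory. The directory layout, the 20% fraction, and the use of shutil.move are illustrative assumptions, not the lesson's exact steps:

    import os
    import random
    import shutil

    random.seed(42)                      # make the split reproducible

    train_dir = 'train'                  # assumed layout: train/<class>/<image>.jpg
    valid_dir = 'valid'

    for cls in os.listdir(train_dir):
        files = os.listdir(os.path.join(train_dir, cls))
        random.shuffle(files)
        n_valid = int(0.2 * len(files))  # hold out ~20% of each class
        os.makedirs(os.path.join(valid_dir, cls), exist_ok=True)
        for fname in files[:n_valid]:
            shutil.move(os.path.join(train_dir, cls, fname),
                        os.path.join(valid_dir, cls, fname))

Holding out a fixed fraction of each class keeps the validation set's class balance close to the training set's, which makes its accuracy a better predictor of the Kaggle score.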

CNN review

Today we reviewed all the key components of a convolutional neural network. Here are some resources to help if you are unclear on any piece:

Matrix Product (dense layers)
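
A dense (fully connected) layer is just a matrix product plus a bias. A minimal numpy sketch (the shapes here are arbitrary, chosen only for illustration):

    import numpy as np

    x = np.random.randn(4, 10)   # a batch of 4 inputs with 10 features each
    W = np.random.randn(10, 3)   # weights mapping 10 inputs to 3 outputs
    b = np.zeros(3)              # one bias per output

    y = x @ W + b                # the dense layer: matrix product plus bias
    print(y.shape)               # (4, 3)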

Convolutions (and Max-Pooling)
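
A convolution slides a small filter over the image, taking a weighted sum of the pixels under it at each position; max-pooling then keeps only the largest value in each small window. A pure-numpy sketch of both (single channel, no padding; the 8x8 input and edge filter are made up for illustration). Note that, like most deep learning libraries, this actually computes cross-correlation: the filter is not flipped.

    import numpy as np

    def conv2d(img, filt):
        # slide filt over img; each output is the sum of elementwise products
        h, w = img.shape
        fh, fw = filt.shape
        out = np.zeros((h - fh + 1, w - fw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = (img[i:i+fh, j:j+fw] * filt).sum()
        return out

    def maxpool2(x):
        # 2x2 max-pooling with stride 2 (assumes even dimensions)
        h, w = x.shape
        return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

    img = np.random.randn(8, 8)
    edge = np.array([[-1., 0., 1.],
                     [-1., 0., 1.],
                     [-1., 0., 1.]])          # a simple vertical-edge filter
    print(maxpool2(conv2d(img, edge)).shape)  # (3, 3)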

Activations

  • These notes from the Stanford course give an overview of commonly used activation functions and their pros and cons.
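
For reference, the common activation functions can each be written as a numpy one-liner; these are sketches of the definitions, not library code:

    import numpy as np

    def relu(x):    return np.maximum(0, x)      # zero out negative values
    def sigmoid(x): return 1 / (1 + np.exp(-x))  # squash to (0, 1)
    def tanh(x):    return np.tanh(x)            # squash to (-1, 1)

    def softmax(x):
        e = np.exp(x - x.max())                  # subtract max for numerical stability
        return e / e.sum()                       # outputs are positive and sum to 1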

Stochastic Gradient Descent (SGD)
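
The idea from sgd-intro.ipynb in a few lines: repeatedly nudge the parameters in the direction that reduces the loss, estimating the gradient from a random minibatch at each step. A sketch fitting y = a*x + b (the data, learning rate, and step count are made up for illustration):

    import numpy as np

    np.random.seed(0)
    x = np.random.rand(50)                       # synthetic inputs
    y = 3 * x + 8 + 0.1 * np.random.randn(50)    # targets from y = 3x + 8, plus noise

    a, b, lr = 0.0, 0.0, 0.1                     # initial guesses and learning rate
    for step in range(5000):
        idx = np.random.randint(0, 50, size=10)  # random minibatch: the "stochastic" part
        err = a * x[idx] + b - y[idx]            # prediction error on the minibatch
        a -= lr * 2 * (err * x[idx]).mean()      # gradient of mean squared error w.r.t. a
        b -= lr * 2 * err.mean()                 # gradient w.r.t. b
    print(a, b)                                  # should end up near 3 and 8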

Backpropagation (Chain Rule)
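
Backpropagation is just the chain rule applied step by step, from the loss back to each parameter. A sketch for a two-stage computation, with a finite-difference check of the hand-derived gradient (the values are arbitrary):

    w, x = 2.0, 3.0

    # forward pass: u = w * x, then loss = u ** 2
    u = w * x
    loss = u ** 2

    # backward pass, via the chain rule: dloss/dw = dloss/du * du/dw
    dloss_du = 2 * u            # derivative of u ** 2
    du_dw = x                   # derivative of w * x with respect to w
    dloss_dw = dloss_du * du_dw

    # numerical check with a small finite difference
    eps = 1e-6
    numeric = (((w + eps) * x) ** 2 - loss) / eps
    print(dloss_dw, numeric)    # both come out to about 36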

Putting it all together
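
As a reminder of how the pieces combine, here is a minimal Keras model in the style of the course notebooks. The layer sizes and input shape are placeholders, not the lesson's exact architecture, and this uses the Keras 2 layer names (the original notebooks may use the older Convolution2D spelling):

    from keras.models import Sequential
    from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

    model = Sequential([
        Conv2D(32, (3, 3), activation='relu',
               input_shape=(224, 224, 3)),   # convolution + activation
        MaxPooling2D((2, 2)),                # max-pooling
        Flatten(),                           # image grid -> vector
        Dense(2, activation='softmax'),      # dense layer: a matrix product
    ])
    model.compile(optimizer='sgd',           # trained with SGD via backpropagation
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    model.summary()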

Additional Resources

  • This week in Episode 006 of the Startup Data Science podcast, Apurva, Alex, and Edderic discuss the difference between convolution and correlation and pin down the meaning of max pooling. Episode 007 covers dropout in detail; Apurva offers some tips for staying motivated to learn deep learning, and Edderic announces that he is revamping his PC workstation for deep learning (bye-bye, Amazon!).