Lesson 6 Timeline

From Deep Learning Course Wiki
Revision as of 14:10, 26 December 2016 by Lbarrera (talk | contribs)

Lesson 6 Video

  • 0:01 Pseudo-labeling
    • 1:15 MixIterator introduction
  • 6:57 Review: Embeddings
    • 8:10 Embeddings example: MovieLens Data Set
    • 13:30 Word embeddings example: Green Eggs and Ham
  • 15:33 RNNs
    • 20:00 Visual vocabulary for representing neural nets
    • 22:56 3 kinds of layer operations
  • 25:30 Building first char-RNN in Keras
    • 27:28 Predict 4th character from previous 3
    • 38:45 Generalize first char-RNN formulation: Predict char n from chars 1 to n-1
    • 42:20 RNN from standard Keras dense layers
    • 48:25 Initialization for hidden to hidden dense layer (identity matrix)
  • 51:36 Alternative char-RNN formulation: Predict chars 2 to n using chars 1 to n-1 (sequence to sequence)
    • 1:02:08 Stateful model with Keras (long-term dependencies)
  • 1:23:01 Build RNN in Theano
    • 1:25:46 Aside: "loss=sparse_categorical_crossentropy" alternative to one-hot encoding of output
    • 1:27:30 Aside: One-hot sequence model with Keras
    • 1:28:50 Theano overview
    • 1:29:50 Theano concepts: Variable
    • 1:35:50 "theano.scan" operation (RNN steps)
    • 1:43:20 Theano error/loss
    • 1:43:48 "theano.grad" calculate derivatives
    • 1:44:43 "theano.function"
  • 1:49:06 Lesson goals, plans
  • 1:50:15 In-class questions
    • 1:56:59 Tip: Exploring layer definitions in Keras
    • 2:01:05 Tip: shift-tab
    • 2:01:40 Tip: Python debugger in Jupyter notebook
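
The segment from 42:20 to 48:25 builds an RNN out of standard dense layers, initializing the hidden-to-hidden weight matrix to the identity. A minimal NumPy sketch of that recurrence (all names and sizes here are illustrative, not taken from the lesson notebooks):

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, n_hidden = 5, 8

# Input-to-hidden weights, as in an ordinary dense layer.
W_x = rng.normal(scale=0.1, size=(vocab_size, n_hidden))
# Identity init for the hidden-to-hidden matrix keeps the hidden state
# stable early in training (the trick mentioned at 48:25).
W_h = np.eye(n_hidden)

def rnn_forward(char_ids):
    """Unroll the recurrence h_t = tanh(x_t @ W_x + h_{t-1} @ W_h)."""
    h = np.zeros(n_hidden)
    for c in char_ids:
        x = np.zeros(vocab_size)
        x[c] = 1.0                      # one-hot encode the character
        h = np.tanh(x @ W_x + h @ W_h)  # same dense ops, weights shared across steps
    return h

# The "predict char 4 from chars 1-3" setup: feed three characters,
# then a final dense layer (omitted here) would map h to the next char.
h_final = rnn_forward([0, 3, 1])
print(h_final.shape)  # (8,)
```

The point of the dense-layer formulation is that nothing in the recurrence is RNN-specific: it is the same matrix multiplies as a fully connected net, applied repeatedly with shared weights.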
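
The aside at 1:25:46–1:27:30 contrasts `sparse_categorical_crossentropy` (integer class targets) with ordinary categorical cross-entropy on one-hot targets. A small NumPy check, purely illustrative rather than the lesson's code, showing the two formulations compute the same loss:

```python
import numpy as np

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.1, 0.8]])   # softmax outputs for 2 samples
sparse_targets = np.array([0, 2])      # integer class labels
one_hot = np.eye(3)[sparse_targets]    # the same labels, one-hot encoded

# Sparse form: index the predicted probability of the true class directly.
sparse_loss = -np.mean(np.log(probs[np.arange(2), sparse_targets]))
# One-hot form: dot each probability row with its one-hot target row.
one_hot_loss = -np.mean(np.sum(one_hot * np.log(probs), axis=1))

print(np.isclose(sparse_loss, one_hot_loss))  # True
```

The sparse variant avoids materializing a one-hot target array, which matters for char-RNN outputs where the target at every timestep is one of `vocab_size` classes.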