From Deep Learning Course Wiki
- The notebooks:
- Lesson 5 shows the IMDB sentiment analysis model
- Word vectors contains the visualization of GloVe vectors
- char-rnn is the RNN "Nietzsche generator" - we only looked at this briefly; we'll discuss this notebook further next week
- Imagenet batchnorm shows the method we used to add batchnorm to the Imagenet-trained VGG16 model. This is optional - now that we've done this for you, it's included in vgg16bn.py; we're providing the notebook for those of you who are interested in learning how we did it
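To see what a batchnorm layer actually computes once its statistics and parameters are fixed (as they are when retrofitting batchnorm onto a pre-trained network), here is a minimal numpy sketch. All names and the toy data are illustrative, not taken from vgg16bn.py:

```python
import numpy as np

def batchnorm_inference(x, mean, var, gamma, beta, eps=1e-5):
    """Batchnorm at inference time: normalize activations with stored
    statistics, then apply the learned scale (gamma) and shift (beta)."""
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Toy activations: 4 samples, 3 features
x = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [3., 6., 9.],
              [4., 8., 12.]])
mean, var = x.mean(axis=0), x.var(axis=0)
gamma, beta = np.ones(3), np.zeros(3)  # identity scale/shift

out = batchnorm_inference(x, mean, var, gamma, beta)
print(out.mean(axis=0))  # per-feature mean is ~0 after normalization
print(out.std(axis=0))   # per-feature std is ~1 after normalization
```

With gamma and beta left at their identity values, the layer simply whitens each feature; training (or, in the retrofitting case, initialization from the pre-trained layer statistics) then adjusts gamma and beta.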
- The python scripts:
- The datasets:
- The IMDB dataset is part of keras, and the download code is in the lesson 5 notebook.
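Keras supplies each IMDB review as a variable-length list of word indices, which must be padded or truncated to a fixed length before feeding a network. A minimal pure-Python sketch of that preprocessing step (the function name and toy data are my own, not from the notebook):

```python
def pad_sequences(seqs, maxlen, value=0):
    """Pad (on the left) or truncate each sequence of word indices
    to exactly maxlen entries."""
    out = []
    for s in seqs:
        s = s[-maxlen:]  # keep only the last maxlen tokens
        out.append([value] * (maxlen - len(s)) + s)
    return out

# Hypothetical word-index sequences of varying length
reviews = [[12, 7, 44], [3, 9], [5, 1, 8, 2, 6]]
padded = pad_sequences(reviews, maxlen=4)
print(padded)  # [[0, 12, 7, 44], [0, 0, 3, 9], [1, 8, 2, 6]]
```

Keras ships an equivalent utility, so in practice you would use that rather than rolling your own; the sketch just shows what the fixed-length representation looks like.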
- Pre-trained networks:
Information about this week's topics
- Ben's excellent blog post describing how he used deep learning for NLP at his startup, Quid
- Learning Word Vectors for Sentiment Analysis - the Stanford paper introducing the IMDB sentiment dataset
- An introduction to Principal Component Analysis (PCA)
- Understanding Convolutions - Chris Olah's blog
- Exploring the Limits of Language Modeling
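PCA is the technique the word-vectors notebook uses to project high-dimensional GloVe vectors down to 2-D for plotting. A minimal SVD-based sketch in numpy, using random vectors as a stand-in for real GloVe embeddings:

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project the rows of X onto their top principal components via SVD."""
    Xc = X - X.mean(axis=0)                          # center each dimension
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T                  # component-space coords

rng = np.random.default_rng(0)
vectors = rng.normal(size=(10, 50))  # stand-in for ten 50-d word vectors
coords = pca_project(vectors)
print(coords.shape)  # (10, 2) - ready to scatter-plot
```

The first component captures the direction of greatest variance, the second the greatest remaining variance, which is why a 2-D PCA plot preserves as much of the vectors' spread as any linear projection can.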
- Try to make sure you've completed the key goals from previous weeks - a top-50% Kaggle result on each of:
- dogs and cats redux
- state farm
- if you've already done that, try to either:
- beat my IMDB sentiment result, or (better still)...
- use your own text classification dataset, like @bowles did in https://quid.com/feed/how-quid-uses-deep-learning-with-small-data