# Lesson 5

From Deep Learning Course Wiki

## This week's links

- The lesson video
- The lesson 5 notes (includes timelines)

- The notebooks:
  - Lesson 5 shows the IMDB sentiment analysis
  - Word vectors contains the visualization of GloVe vectors
  - char-rnn is the RNN "Nietzsche generator" - we only briefly looked at this; we'll discuss this notebook more next week
  - Imagenet batchnorm shows the method we used to add batchnorm to the pretrained ImageNet VGG network. This is optional - now that we've done this for you, it's included in vgg16bn.py; we're providing the notebook for those of you who are interested in learning how we did it
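The key idea behind inserting batchnorm into an already-trained network is to initialize the new layers so they start out as (near) identity functions, leaving the pretrained activations unchanged: set the scale to the activations' standard deviation and the shift to their mean. A minimal numpy sketch of that initialization trick (not the notebook's actual code):

```python
import numpy as np

def batchnorm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch, then scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

# Toy stand-in for the activations of a pretrained dense layer.
rng = np.random.default_rng(0)
acts = rng.normal(loc=3.0, scale=2.0, size=(1000, 4))

# Initialize gamma/beta from the activation statistics so the
# inserted batchnorm layer is initially a (near) identity.
gamma = acts.std(axis=0)
beta = acts.mean(axis=0)

out = batchnorm(acts, gamma, beta)
print(np.abs(out - acts).max())  # close to zero
```

With this initialization the network's outputs are essentially unchanged at first, so fine-tuning can proceed without the inserted layers destroying the pretrained weights.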

- The python scripts:
  - vgg16bn.py - the VGG network with batchnorm. We will use this now instead of vgg16.py; it automatically downloads the new weights when first used
  - utils.py - for finetuning, we will start using vgg_ft_bn (which uses VGG with batch norm) instead of vgg_ft

- The datasets:
  - The IMDB dataset is part of Keras, and the download code is part of the lesson 5 notebook.
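Since the dataset ships with Keras, loading it is a one-liner; a sketch (the keyword argument is `num_words` in current Keras, though older versions spelled it `nb_words`):

```python
from keras.datasets import imdb

# Load the IMDB sentiment dataset bundled with Keras, keeping only
# the 5,000 most frequent words (downloads the data on first use).
# Each review is a list of word indices; labels are 0/1 sentiment.
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=5000)

print(len(x_train), len(x_test))  # 25000 25000
```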

- Pre-trained networks:

## Information about this week's topics

- Ben's excellent blog post describing how he used deep learning for NLP at his startup, Quid
- Learning Word Vectors for Sentiment Analysis - the Stanford paper introducing the IMDB sentiment dataset
- An introduction to Principal Component Analysis (PCA)
- Understanding Convolutions - Chris Olah's blog
- Exploring the Limits of Language Modeling
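The PCA reading above connects directly to the word vectors notebook: to plot high-dimensional GloVe vectors, you project them onto their top two principal components. A minimal sketch of PCA via SVD on toy vectors (not the notebook's code):

```python
import numpy as np

# Toy "word vectors": 5 words in 10 dimensions.
rng = np.random.default_rng(42)
vectors = rng.normal(size=(5, 10))

# PCA via SVD: center the data, decompose, then project
# onto the top 2 principal components (rows of Vt).
centered = vectors - vectors.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
coords_2d = centered @ Vt[:2].T  # each word becomes an (x, y) point

print(coords_2d.shape)  # (5, 2)
```

The singular values in `S` are sorted in decreasing order, so the first two components capture the largest share of the variance, which is why the 2D scatter plot preserves as much of the vectors' structure as a linear projection can.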

## Assignments

- Try to make sure you've completed the key goals from previous weeks - a top-50% Kaggle result on each of:
  - Dogs vs Cats Redux
  - State Farm
- If you've already done that, try to either:
  - beat my IMDB sentiment result, or (better still)...
  - use your own text classification dataset - like @bowles did in https://quid.com/feed/how-quid-uses-deep-learning-with-small-data