This wiki is here to help you develop your capabilities in using deep learning to solve real-world problems. Please help us develop it by adding, editing, and organizing any information that you think might be helpful towards this goal. You can keep up to date with all changes to the wiki by watching the Recent Changes page.
There's some good content here already, and we hope that by the end of the course this will be a fantastic information resource, thanks to your contributions! Don't be shy: click 'edit' on any page and make it better. (Note to MOOC students: we haven't yet developed a system for giving you accounts, so you're not yet able to edit the wiki.) There is information available about how to format pages.
Go here to view the Frequently Asked Questions about the course.
How to ask for Help
First and foremost, please read the How to ask for Help page, which explains how to ask for help in a way that allows others to help you as quickly and effectively as possible.
Also, please read this guide on How to use the Provided Notebooks.
Resources, FAQs, links, further discussion, videos, etc. about each week's lecture. Here is the Google Drive folder of lesson transcripts (with thanks to Lin Crampton), and a page of all of the course notes (with thanks to Brad Kenstler).
- Lesson 0 - The "surprise lecture" during the Data Institute launch: an introduction to convolutions and machine learning for computer vision
- Lesson 1 - Getting started with deep learning tools for computer vision
- Lesson 2 - The basic algorithms underlying deep learning
- Lesson 3 - Review of the components of a CNN; avoiding overfitting and underfitting
- Lesson 4 - Convolution and SGD tutorials; State Farm: learning rate selection, data augmentation tuning, pseudo-labeling and knowledge distillation; intro to collaborative filtering
- Lesson 5 - Adding batchnorm to VGG; visualizing latent factors; the functional API; NLP and word embeddings; multi-size CNNs; RNN introduction
- Lesson 6 - MixIterator for pseudo-labeling and combining validation/training sets; Excel examples of embeddings; RNNs (creating layers by hand in Keras; sequence and stateful RNNs; simple RNN in Theano)
- Lesson 7 - CNN architectures: ResNet, Inception, fully convolutional nets, multi-input and multi-output nets; localization with bounding-box models and heatmaps; using larger inputs to CNNs; building a simple RNN in pure Python; gated recurrent units (GRUs), and how to build a GRU RNN in Theano
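The "simple RNN in pure Python" mentioned in Lesson 7 can be sketched in a few lines of NumPy. Everything below (the sizes, weight names, and random input sequence) is illustrative only, not the course's actual notebook code:

```python
import numpy as np

# Illustrative sizes, not taken from the course notebooks
n_input, n_hidden = 3, 4

rng = np.random.RandomState(0)
Wx = rng.normal(scale=0.1, size=(n_input, n_hidden))   # input-to-hidden weights
Wh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden-to-hidden weights
b = np.zeros(n_hidden)                                 # hidden bias

def rnn_forward(xs):
    """Run a vanilla RNN over a sequence of input vectors; return all hidden states."""
    h = np.zeros(n_hidden)
    hs = []
    for x in xs:
        # Each new state mixes the current input with the previous state
        h = np.tanh(x @ Wx + h @ Wh + b)
        hs.append(h)
    return hs

sequence = [rng.normal(size=n_input) for _ in range(5)]
states = rnn_forward(sequence)
print(len(states), states[-1].shape)
```

The key point the lesson builds on is that the same `Wx`, `Wh`, and `b` are reused at every timestep; frameworks like Keras and Theano just add training machinery around this loop.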
- Wiki - this wiki is an important resource. Information about how to get the most out of it (and what to put in it!) is available here
- Forums - the fast.ai forums are where you can ask questions and get involved in discussions. Use this section to find out how best to participate in the forums
- Slack - we use Slack for real time communication, including during class
- Directories on Github - setup scripts, course materials
- Class emails
Useful hardware and software tools for deep learning
- Installation - you need an Nvidia GPU to do most things with deep learning; this section shows how to set up a GPU server using a number of common platforms. Most people are likely to want to use Amazon Web Services (AWS): AWS install
- Jupyter notebook - the tool we use for most of our analysis, so be sure to take the time to learn to use it effectively!
- Python libraries - you will be making a lot of use of numpy, keras, and matplotlib, and various other libraries. We will collect helpful information about these libraries here
- Bash - Resources for learning about bash
- SQL - Resources for practicing SQL
- Tmux - How to use tmux to manage terminal windows
- aws-alias - How to use the aws-alias.sh file, with suggestions for adding to, deleting from, and modifying it
- Kaggle CLI - How to get started with the Kaggle command line tool
- Your Deep Learning Setup - How your entire setup ties together
- Github - We store a central version of the Jupyter notebooks and Excel spreadsheets used in the videos in our GitHub repository
Important concepts in machine learning, deep learning, and math
- Deep Learning Glossary - glossary of concepts related to Deep Learning
- Gradient Descent - Gradient descent, stochastic gradient descent (SGD), and optimizing cost functions
- Log Loss - review of log loss and cross-entropy
- Linear Algebra for Deep Learning - review of basic linear algebra concepts used in deep learning
- Calculus for Deep Learning - review of basic calculus concepts used in deep learning
- Mathematical Notation - primer/cheatsheet on math symbols
- Linear Regression - Intro to linear regression with code examples
- Logistic Regression - Intro to logistic regression with code examples
- Neural Networks - Intro to neural networks and backpropagation with code examples
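The gradient descent, log loss, and logistic regression pages above fit together naturally. As a rough sketch (the toy data and hyperparameters are made up for illustration, not taken from the course materials), training a logistic regression by SGD on the cross-entropy loss looks like:

```python
import numpy as np

rng = np.random.RandomState(42)

# Toy data: two Gaussian blobs with labels 0 and 1 (made up for illustration)
X = np.vstack([rng.normal(-1, 1, size=(50, 2)), rng.normal(1, 1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def log_loss(p, y):
    # Cross-entropy between predicted probabilities p and labels y
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

for epoch in range(100):
    for i in rng.permutation(len(X)):  # "stochastic": update on one example at a time
        p = sigmoid(X[i] @ w + b)
        grad = p - y[i]                # derivative of log loss w.r.t. the logit
        w -= lr * grad * X[i]
        b -= lr * grad

print(log_loss(sigmoid(X @ w + b), y))
```

The pleasant surprise that the pages above walk through is the `p - y` gradient: the sigmoid and the cross-entropy loss combine so that the derivative with respect to the logit is just the prediction error.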
Links to relevant supporting material
- Image Datasets - Descriptions, difficulty ratings, and links to datasets you may want to use for your class project, including links to public lists of datasets
- Tutorials - helpful tutorials and MOOCs to complement coursework
- Papers - important papers in deep learning
- Articles - articles and blog posts
- Books - Textbooks and other non-fiction related to Deep Learning
- Listen to the Podcast based on the video lessons
- ELI5 - Explain Like I'm 5 [years old]
- Applications - Application areas and resources
- Build Your own DL Box
- Docs for Keras 1.1.1, which is the version used in the course