The libraries you will be using the most frequently are:
- numpy - Provides the basic linear algebra (i.e. array/vector/matrix) functionality for python. If you've used modern linear algebra capabilities in another language you should find it very familiar. If not, don't worry, we'll help you get started quickly!
- keras - The main deep learning library we will be using. It is a wrapper on top of theano or tensorflow that makes those lower-level libraries much easier to use.
- matplotlib - Provides plotting and scientific visualization capabilities.
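If numpy is new to you, here's a minimal sketch of the kind of array/matrix operations you'll use constantly (the data here is made up purely for illustration):

```python
import numpy as np

# Build a small matrix and a vector, then combine them with
# standard linear algebra operations.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([10.0, 20.0])

product = A @ v              # matrix-vector product
elementwise = A * 2          # elementwise scaling of every entry
col_means = A.mean(axis=0)   # mean of each column

print(product)    # [ 50. 110.]
print(col_means)  # [2. 3.]

# With matplotlib you would then visualize results, e.g.:
# import matplotlib.pyplot as plt
# plt.plot(np.linspace(0, 1, 100) ** 2)
```

The `@` operator and `axis=` arguments come up everywhere in numpy code, so they're worth getting comfortable with early.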
You'll also occasionally use:
- scikit-learn - Lots of helpful functionality for general machine learning, such as cross-validation, confusion matrices, and many ML algorithms.
- scipy - A wide variety of scientific computing capabilities that are more specialized than what's in numpy.
- theano - A numerical computation library for python that compiles mathematical expressions to run efficiently, including on a GPU. When using keras, we'll actually be using theano behind the scenes, and we can extend keras with our own theano code. We'll only need to use theano directly when we cover more advanced topics.
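As a taste of the scikit-learn functionality mentioned above, here's a short sketch of cross-validation and a confusion matrix on a made-up two-class dataset (the data, model, and parameter choices are purely illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.metrics import confusion_matrix

# Tiny synthetic dataset: two Gaussian blobs, one per class.
rng = np.random.RandomState(0)
X = np.vstack([rng.normal(0, 1, (50, 2)),   # class 0 samples
               rng.normal(3, 1, (50, 2))])  # class 1 samples
y = np.array([0] * 50 + [1] * 50)

model = LogisticRegression()

# 5-fold cross-validation: fit on 4/5 of the data, score on the held-out 1/5.
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())

# Confusion matrix on the training data, just to show the API:
# rows are true classes, columns are predicted classes.
model.fit(X, y)
cm = confusion_matrix(y, model.predict(X))
print(cm)
```

Cross-validation like this is the standard way to estimate how well a model generalizes before you touch a test set.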
I recommend the book Python for Data Analysis for excellent coverage of numpy, jupyter/ipython, and matplotlib. It also covers Pandas, a library for processing structured data in python - we won't be using that much, if at all, so feel free to skip those sections.
There are also copious free tutorials on all of these tools online. If you find some you particularly like, please add them here! The best way to learn is by experimenting in a Jupyter notebook with the libraries.