Fast.ai: Deep Learning from the Foundations
In this study group, we will be working through the Fast.ai deep learning course "Deep Learning from the Foundations":
This meetup is free and open to all. We recommend it to people who are already involved in deep learning, understand its practical applications, and want a deeper understanding. Completing Part 1 of the Fast.ai course is sufficient preparation.
We meet every two weeks, on Wednesdays, 18:00-20:00.
The whole course will take at least 7 meetings, depending on our pace.
More info and schedule on wiki:
https://wiki.hs3.pl/wydarzenia/datascience#kurs_deep_learning_from_the_foundations_-_fastai
Welcome to Fast.ai Part 2: Deep Learning from the Foundations, which shows how to build a state of the art deep learning model from scratch.
It takes you all the way from the foundations of implementing matrix multiplication and back-propagation, through to high performance mixed-precision training, to the latest neural network architectures and learning techniques, and everything in between.
It covers many of the most important academic papers that form the foundations of modern deep learning, using "code-first" teaching, where each method is implemented from scratch in Python and explained in detail (in the process, we'll discuss many important software engineering techniques too).
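To give a flavour of the "from scratch" approach, here is a minimal sketch (not the course's actual code) of the very first building block the course implements: matrix multiplication written as plain Python loops, before it gets progressively vectorized and sped up.

```python
def matmul(a, b):
    """Naive matrix multiplication of nested lists: c = a @ b."""
    n, m = len(a), len(a[0])      # a is n x m
    m2, p = len(b), len(b[0])     # b is m x p
    assert m == m2, "inner dimensions must match"
    c = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            for k in range(m):    # dot product of row i of a with column j of b
                c[i][j] += a[i][k] * b[k][j]
    return c

# Example: 2x2 matrices
print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]
```

The course then shows how to replace these loops step by step with broadcasting and GPU-backed PyTorch operations, measuring the speedup at each stage.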
Before starting this part, it is helpful to have completed Fast.ai Part 1: Practical Deep Learning for Coders.
The first five lessons use Python, PyTorch, and the fastai library; the last two lessons use Swift for TensorFlow, and are co-taught with Chris Lattner, the original creator of Swift, clang, and LLVM.