Foundations of Deep Learning
This notebook is created from lectures 13 & 14 of fastai part 2 as a reference. I have decided to organize it in a slightly different way, added more explanations, and left out implementations that might not be 'foundational' in nature. The lectures also use the fastcore library, which I have avoided for the sake of anyone new. We will cover all the basic building blocks of deep learning:

1. Basic neural network architecture
2. Multi-Layer Perceptron (MLP) implementation
3. Gradients and derivatives
4. The chain rule and backpropagation
5. PyTorch for calculating derivatives
6. ReLU and linear function classes
7. The log-sum-exp trick
8. The log_softmax() function and cross-entropy loss
9. A training loop for a simple neural network
10. Refactoring code for efficiency and flexibility
11. PyTorch's nn.Module and nn.Sequential
12. Creating custom PyTorch modules
13. Implementing optimizers, DataLoaders, and Datasets
14. Visualizing our data
15. Building a proper training loop using PyTorch's DataLoader

It's gonna be a lot, but let's buckle in and take these important backbones one at a time :)
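As a small taste of item (7) before we get there, here is a minimal sketch of the log-sum-exp trick in plain Python (the function name `stable_logsumexp` is my own for illustration; PyTorch ships this as `torch.logsumexp`). The naive computation overflows as soon as any input exceeds roughly 709, because `exp` of that overflows a float64; subtracting the maximum first keeps every exponent at or below zero.

```python
import math

def stable_logsumexp(xs):
    # Identity: log(sum(exp(x))) = a + log(sum(exp(x - a))) for any constant a.
    # Choosing a = max(xs) makes every x - a <= 0, so exp() never overflows.
    a = max(xs)
    return a + math.log(sum(math.exp(x - a) for x in xs))

# Naive version would overflow here, since math.exp(1000.0) raises OverflowError.
print(stable_logsumexp([1000.0, 1000.0]))  # 1000 + log(2) ≈ 1000.693
```

This identity is exactly what makes `log_softmax()` numerically safe, which is why the course covers it right before cross-entropy loss.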
Stable Diffusion Pipelines and Deep Dive
Generative AI has been a big topic of discussion, and one of the biggest Gen AI tools last year was Stable Diffusion. While I was just getting started on my ML journey, these tools were all around me. Now that I have finally gained enough experience to understand the concepts, I have decided to write this blog on Stable Diffusion. My interest is not really in generating cool artwork (possibly a result of the controversial training data) but in applying these generative principles to more useful work, such as in medicine or the sciences. So rather than just using the tools, I have decided to dive a bit deeper and gain a little more control over each step of the generation process. There is still much more to dig into, even within the individual steps of the diffusion process, and that is something I hope to do in the near future. Finally, I must say that everything in this notebook can be found in course.fast.ai lesson 9 :). I have just tried to add more explanations, for others and for my own future reference.