Introduction to Diffusion Models for Generative AI
Chapter 1: Generative Modeling Fundamentals
Overview of Generative Models
Motivation for Diffusion Models
The Core Idea: Noise and Denoise
Probabilistic Framework Introduction
Chapter 2: The Forward Diffusion Process
Defining the Markov Chain
Mathematical Formulation per Step
Sampling from Intermediate Steps
Properties of the Forward Process
Practice: Simulating Forward Diffusion
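The forward-process chapter above (closed-form sampling of intermediate steps included) can be sketched in a few lines; this is a minimal NumPy sketch, assuming the standard linear beta schedule from the DDPM paper — the helper names and schedule values are illustrative, not from the course:

```python
import numpy as np

def make_alpha_bar(T=1000, beta_start=1e-4, beta_end=0.02):
    """Linear beta schedule; alpha_bar[t] = prod_{s<=t} (1 - beta_s)."""
    betas = np.linspace(beta_start, beta_end, T)
    return np.cumprod(1.0 - betas)

def q_sample(x0, t, alpha_bar, rng):
    """Closed-form sample from q(x_t | x_0):
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

rng = np.random.default_rng(0)
alpha_bar = make_alpha_bar()
x0 = rng.standard_normal(10_000)            # toy "data": unit-variance samples
x_late = q_sample(x0, 999, alpha_bar, rng)  # near the final step, x_t is almost pure noise
```

Because `alpha_bar` decays toward zero, `x_late` is statistically indistinguishable from a standard Gaussian — the property the reverse process will exploit.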
Chapter 3: The Reverse Diffusion Process
The Goal: Reversing the Markov Chain
Approximating the Reverse Transition
Parameterizing the Reverse Process with Neural Networks
Predicting the Noise Component
Mathematical Formulation of the Denoising Step
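The denoising step this chapter builds toward has a standard closed form in DDPM; as a reference (standard notation, not quoted from the course material):

```latex
p_\theta(x_{t-1} \mid x_t) = \mathcal{N}\!\left(x_{t-1};\; \mu_\theta(x_t, t),\; \sigma_t^2 I\right),
\qquad
\mu_\theta(x_t, t) = \frac{1}{\sqrt{\alpha_t}}\left(x_t - \frac{\beta_t}{\sqrt{1 - \bar{\alpha}_t}}\, \epsilon_\theta(x_t, t)\right),
```

where $\alpha_t = 1 - \beta_t$, $\bar{\alpha}_t = \prod_{s \le t} \alpha_s$, and $\epsilon_\theta$ is the network's noise prediction from the preceding section.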
Chapter 4: Model Architecture and Training
The U-Net Architecture for Noise Prediction
Integrating Timestep Information
Defining the Training Objective
Simplified Training Loss Derivation
Hands-on Practical: Setting up the U-Net
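The training objective covered above reduces to a simple recipe: draw a timestep, inject noise, and regress the network's noise prediction against the injected noise. A minimal NumPy sketch, with a deliberately trivial stand-in for the U-Net (the real model would consume timestep embeddings as described in this chapter):

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear beta schedule and its cumulative product (illustrative values).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bar = np.cumprod(1.0 - betas)

def dummy_eps_model(x_t, t):
    """Placeholder for the U-Net eps_theta(x_t, t); returns a
    well-shaped (but untrained) noise prediction."""
    return x_t

def simplified_loss(x0):
    """One draw of the simplified DDPM objective:
    L_simple = E_{t, eps} || eps - eps_theta(x_t, t) ||^2."""
    t = int(rng.integers(0, T))             # uniform random timestep
    eps = rng.standard_normal(x0.shape)     # injected noise
    x_t = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return float(np.mean((eps - dummy_eps_model(x_t, t)) ** 2))

loss = simplified_loss(rng.standard_normal((4, 8)))
```

In practice the loss is minimized by gradient descent on the U-Net's parameters; the sketch only shows how one loss value is assembled.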
Chapter 5: Sampling and Generation Process
Generating Data from Noise
The DDPM Sampling Algorithm
Understanding Sampling Variance
Introduction to Faster Sampling: DDIM
The DDIM Sampling Algorithm
Trade-offs Between DDPM and DDIM
Practice: Implementing Sampling Loops
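The DDPM sampling algorithm from this chapter can be sketched as an ancestral loop from pure noise; a minimal NumPy version, using a placeholder noise predictor and a short illustrative schedule (both are assumptions, not course code):

```python
import numpy as np

rng = np.random.default_rng(0)

T = 50                                  # few steps for a toy demo
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)

def dummy_eps_model(x_t, t):
    """Placeholder for the trained noise predictor eps_theta(x_t, t)."""
    return np.zeros_like(x_t)

def ddpm_sample(shape):
    """Ancestral DDPM sampling: start from x_T ~ N(0, I) and apply the
    learned reverse transition T times."""
    x = rng.standard_normal(shape)
    for t in reversed(range(T)):
        eps = dummy_eps_model(x, t)
        # Posterior mean from the predicted noise.
        mean = (x - betas[t] / np.sqrt(1.0 - alpha_bar[t]) * eps) / np.sqrt(alphas[t])
        if t > 0:
            # Sampling variance term; no noise is added at the final step.
            x = mean + np.sqrt(betas[t]) * rng.standard_normal(shape)
        else:
            x = mean
    return x

sample = ddpm_sample((2, 3))
```

DDIM replaces the stochastic transition with a (typically deterministic) update over a subsequence of timesteps, which is what makes it faster; the loop structure stays the same.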
Chapter 6: Conditional Generation with Diffusion Models
Motivation for Conditional Generation
Classifier-Free Guidance (CFG)
Implementing Classifier-Free Guidance
Architecture Modifications for Conditioning
Hands-on Practical: Applying Guidance
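At inference time, classifier-free guidance as covered in this chapter amounts to one line: extrapolate from the unconditional noise prediction toward the conditional one. A minimal sketch (the function name and example values are illustrative):

```python
import numpy as np

def cfg_combine(eps_uncond, eps_cond, guidance_scale):
    """Classifier-free guidance on noise predictions:
    eps_tilde = eps_uncond + w * (eps_cond - eps_uncond).
    w = 1 recovers the conditional model; w > 1 strengthens conditioning."""
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)

# Toy predictions from the (hypothetical) model's two forward passes:
eps_u = np.array([0.0, 0.0])    # conditioning dropped (null token)
eps_c = np.array([1.0, -1.0])   # conditioning applied
guided = cfg_combine(eps_u, eps_c, guidance_scale=7.5)
```

Training-time support (randomly dropping the conditioning signal so one network learns both predictions) is what makes the two forward passes above possible.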