In the previous chapter, we examined the forward diffusion process, in which we systematically added Gaussian noise to the data over a series of timesteps until only noise remained. Now we turn to the inverse operation: the reverse diffusion process. This is the generative part, where we start from pure noise and progressively denoise it to produce a sample that looks as if it were drawn from the original data distribution.
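To make the two directions concrete, here is a minimal sketch in PyTorch. It is illustrative only: forward_noise, reverse_sample, and denoise_step are hypothetical names, the linear beta schedule is an assumption, and denoise_step stands in for the learned update rule developed later in this chapter.

```python
import torch

# Hypothetical setup: T timesteps with a simple linear beta (noise) schedule.
T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

def forward_noise(x0, t):
    """Forward process: jump straight to timestep t by mixing the data with Gaussian noise."""
    noise = torch.randn_like(x0)
    xt = alpha_bars[t].sqrt() * x0 + (1 - alpha_bars[t]).sqrt() * noise
    return xt, noise

def reverse_sample(denoise_step, shape):
    """Reverse process: start from pure noise and denoise one timestep at a time.

    `denoise_step` is a placeholder for the learned update derived later in this chapter.
    """
    x = torch.randn(shape)            # pure Gaussian noise at t = T
    for t in reversed(range(T)):      # walk the chain backwards: T-1, ..., 0
        x = denoise_step(x, t)        # produce a slightly less noisy sample
    return x                          # approximate sample from the data distribution
```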
This chapter explains how this reversal is achieved. The true reverse transition probability cannot be computed directly, because it depends on the entire data distribution, so we will learn how to approximate it with a model we can train.
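As a preview of the form this approximation commonly takes, the learned reverse transition is modeled as a Gaussian whose mean (and, in some variants, variance) is produced by a neural network with parameters \theta:

p_\theta(x_{t-1} \mid x_t) = \mathcal{N}\big(x_{t-1};\, \mu_\theta(x_t, t),\, \Sigma_\theta(x_t, t)\big)

Training then amounts to making this Gaussian match the true, unknown reverse transition as closely as possible, which is the subject of the sections that follow.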
You will learn about:
- The goal: reversing the Markov chain
- Approximating the reverse transition
- Parameterizing the reverse process with neural networks
- Predicting the noise component
- The mathematical formulation of the denoising step
By the end of this chapter, you will understand the core mechanics of how a diffusion model learns to generate data by systematically reversing the noising process.
3.1 The Goal: Reversing the Markov Chain
3.2 Approximating the Reverse Transition
3.3 Parameterizing the Reverse Process with Neural Networks
3.4 Predicting the Noise Component
3.5 Mathematical Formulation of the Denoising Step