This chapter focuses on the first component of diffusion models: the forward process. This process systematically adds noise to an initial data point, like an image, over many steps until it approximates pure Gaussian noise. Understanding this controlled degradation is essential because the generative capability of diffusion models comes from learning to reverse this process.
We will begin by defining the forward process as a Markov chain, where each step adds a small amount of Gaussian noise according to a predetermined variance schedule $\beta_t$. You'll learn the mathematical equations governing a single noising step, often written as $q(x_t \mid x_{t-1})$, and a useful closed-form expression that allows us to directly sample a noisy state $x_t$ at any timestep $t$ from the initial data $x_0$ without iterating through all intermediate steps. We will analyze properties of this forward mechanism and conclude with a practical exercise simulating the noise addition using code examples.
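As a preview of that closed-form sampling idea, here is a minimal NumPy sketch. The schedule length `T`, the linear range of `betas`, and the helper name `q_sample` are illustrative assumptions, not definitions fixed by this chapter; the practice section develops the full version.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear variance schedule (T and the beta range are assumed values).
T = 1000
betas = np.linspace(1e-4, 0.02, T)      # beta_t for t = 0, ..., T-1
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)         # \bar{alpha}_t = prod of alpha_s up to t

def q_sample(x0, t):
    """Draw x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps, eps ~ N(0, I)."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps

# Toy "image": a flattened 8x8 patch with values in [-1, 1].
x0 = rng.uniform(-1.0, 1.0, size=(64,))
x_mid = q_sample(x0, t=250)    # partially noised
x_late = q_sample(x0, t=999)   # close to pure Gaussian noise
print(x_mid.std(), x_late.std())
```

Note that no loop over intermediate timesteps is needed: the cumulative product $\bar{\alpha}_t$ lets us jump straight from $x_0$ to any $x_t$.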
2.1 Defining the Markov Chain
2.2 Gaussian Noise Schedule
2.3 Mathematical Formulation per Step
2.4 Sampling from Intermediate Steps
2.5 Properties of the Forward Process
2.6 Practice: Simulating Forward Diffusion