Normalizing flows operate on a specific mathematical premise. They map a simple, known probability distribution to a highly complex target distribution using a sequence of invertible functions. To build and train these models in PyTorch, we must first establish the rules governing how probability densities update when transformed.
Let's take, for example, a random variable z drawn from a simple base distribution p_Z. If we apply an invertible function f to it, we get a new random variable x = f(z). The relationship between their probability densities is defined by the change of variables theorem:

p_X(x) = p_Z(f^{-1}(x)) |det J_{f^{-1}}(x)|,

where J_{f^{-1}}(x) is the Jacobian matrix of the inverse transformation evaluated at x.
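To make this concrete, here is a minimal sketch of the change of variables rule in one dimension, assuming an illustrative affine map f(z) = a*z + b with a standard normal base distribution (the names `log_prob_x`, `a`, and `b` are ours, not from a library):

```python
import torch

# Base density p_Z: a standard normal. f(z) = a*z + b is invertible for a != 0.
base = torch.distributions.Normal(loc=0.0, scale=1.0)
a, b = 2.0, 1.0

def log_prob_x(x):
    # Change of variables: log p_X(x) = log p_Z(f^{-1}(x)) + log |d f^{-1}/dx|.
    z = (x - b) / a                              # inverse transform f^{-1}(x)
    log_det = -torch.log(torch.tensor(abs(a)))   # d f^{-1}/dx = 1/a
    return base.log_prob(z) + log_det

print(log_prob_x(torch.tensor(3.0)))
```

Because an affine map of a normal is again normal, this hand-computed density agrees with the density of Normal(b, a) evaluated at the same point, which is a convenient sanity check.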
This chapter focuses on the mathematical mechanics that make this equation work. You will learn how to apply the change of variables theorem to track density changes across different functions. We will study the Jacobian matrix and its determinant, which measures how the volume of space expands or contracts during a mathematical transformation.
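As a preview of the volume-change idea, the following sketch computes |det J| for a small, hypothetical 2-D transformation using PyTorch's automatic differentiation (the function `f` here is an arbitrary example we chose, not one from the text):

```python
import torch
from torch.autograd.functional import jacobian

# An illustrative invertible 2-D map: f(z) = (exp(z1), z1 + z2^3).
def f(z):
    return torch.stack([torch.exp(z[0]), z[0] + z[1] ** 3])

z = torch.tensor([0.5, -1.0])
J = jacobian(f, z)               # 2x2 Jacobian matrix with entries df_i/dz_j
vol_change = torch.det(J).abs()  # |det J|: local volume expansion factor
print(vol_change)
```

A value above 1 means f locally expands volume around z; below 1 means it contracts volume, which is exactly what the determinant in the change of variables theorem accounts for.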
You will also learn to differentiate between the forward and inverse operations of a flow model. The forward pass evaluates the exact probability density of data, while the inverse pass allows us to sample new data points from the learned distribution. The chapter concludes with a programming exercise where you will write Python code to calculate Jacobian determinants for simple mathematical functions, preparing you to implement full flow layers in the subsequent chapters.
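The two directions can be sketched with a single affine layer, assuming the same convention as above: the forward pass maps data to base space and returns an exact log-density, while the inverse pass maps base samples to new data points (the functions `forward` and `inverse` and the parameter values are illustrative, not a full flow implementation):

```python
import torch

# Base distribution and the parameters of one affine "flow layer".
base = torch.distributions.Normal(0.0, 1.0)
scale, shift = torch.tensor(0.5), torch.tensor(2.0)

def forward(x):
    """Density evaluation: map x -> z and apply the change of variables."""
    z = (x - shift) / scale            # x -> z
    log_det = -torch.log(scale.abs())  # log |dz/dx| = -log |scale|
    return base.log_prob(z) + log_det  # exact log p_X(x)

def inverse(z):
    """Sampling: map base samples z -> x in data space."""
    return scale * z + shift

samples = inverse(base.sample((5,)))   # draw 5 new data points
log_p = forward(samples)               # evaluate their exact log-density
```

Note that each direction uses the other's parameters unchanged; invertibility is what lets one set of weights serve both density evaluation and sampling.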
1.1 Random Variables and Distributions
1.2 The Change of Variables Theorem
1.3 Computing Jacobian Determinants
1.4 Forward and Inverse Passes
1.5 Practice: Calculating Jacobians