This course details the mechanics of normalizing flows for generative modeling. You will learn how to construct complex probability distributions from simple base distributions using invertible transformations. The material covers the mathematics of the change of variables theorem, Jacobian determinants, and the architectures of standard flow models such as RealNVP and Masked Autoregressive Flows. Through code implementations in PyTorch, you will build and train flow models to generate data and perform exact density estimation.
Prerequisites: Calculus, Probability, and Python
Level:
Mathematical Foundations
Calculate exact probability densities by applying the change of variables theorem and computing Jacobian determinants.
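For orientation, this module works from the standard change of variables identity; with an invertible flow $f$ mapping base samples $z \sim p_Z$ to data $x = f(z)$ (notation here is illustrative), the exact density is

```latex
\log p_X(x) = \log p_Z\!\big(f^{-1}(x)\big) + \log \left| \det \frac{\partial f^{-1}(x)}{\partial x} \right|
```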
Flow Architectures
Implement coupling layers, masked autoregressive flows, and planar flows from scratch.
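As a preview of this module, here is a minimal affine (RealNVP-style) coupling layer in PyTorch. It is a sketch, not the course's reference implementation; the class name and layer sizes are placeholders. Half of the input passes through unchanged, while the other half is scaled and shifted using parameters predicted from the unchanged half, which keeps both the inverse and the Jacobian determinant cheap to compute.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Minimal affine coupling layer sketch (illustrative)."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        # Small MLP predicting log-scale and shift for the transformed half.
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        # Maps data toward the base distribution and returns log|det J|
        # needed for exact likelihood computation.
        x1, x2 = x[:, :self.d], x[:, self.d:]
        log_s, t = self.net(x1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)            # bound scales for numerical stability
        z2 = x2 * torch.exp(log_s) + t
        log_det = log_s.sum(dim=1)           # triangular Jacobian -> sum of log-scales
        return torch.cat([x1, z2], dim=1), log_det

    def inverse(self, z):
        # Exact inverse pass, used for sampling.
        z1, z2 = z[:, :self.d], z[:, self.d:]
        log_s, t = self.net(z1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)
        x2 = (z2 - t) * torch.exp(-log_s)
        return torch.cat([z1, x2], dim=1)
```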
Density Estimation
Train flow models using exact maximum likelihood estimation.
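The sketch below shows what one maximum likelihood training step might look like; it assumes the `AffineCoupling` class from the previous sketch, and the dimensions, batch, and optimizer settings are placeholders rather than course-specified values.

```python
import torch
import torch.distributions as D

def nll_loss(model, x, base_dist):
    # Exact negative log-likelihood via the change of variables formula:
    # log p(x) = log p_Z(z) + log|det dz/dx|
    z, log_det = model(x)
    log_prob = base_dist.log_prob(z).sum(dim=1) + log_det
    return -log_prob.mean()

dim = 2
model = AffineCoupling(dim)                                 # from the sketch above
base_dist = D.Normal(torch.zeros(dim), torch.ones(dim))     # simple base distribution
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(128, dim)                                   # stand-in for a real data batch
loss = nll_loss(model, x, base_dist)
opt.zero_grad()
loss.backward()
opt.step()
```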
Generative Modeling
Sample efficiently from complex learned probability distributions using the flow's inverse pass.
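Sampling reduces to drawing from the base distribution and running the inverse pass, as in this short sketch (again assuming the `model` and `base_dist` from the earlier illustrative code):

```python
with torch.no_grad():
    z = base_dist.sample((16,))   # 16 draws from the simple base distribution
    x = model.inverse(z)          # one exact inverse pass gives data-space samples
```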