Before constructing more sophisticated diffusion models, it's necessary to ensure a solid grasp of the foundational elements. This chapter begins with a concise review of Denoising Diffusion Probabilistic Models (DDPM) and Denoising Diffusion Implicit Models (DDIM), focusing on the core mechanics, mathematical formulations such as the forward process $q(x_t \mid x_0)$ and the reverse process $p_\theta(x_{t-1} \mid x_t)$, and the connection between score matching and ODEs.
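As a quick reminder of the formulas referenced above, the standard DDPM expressions are written below in the usual notation, where $\beta_t$ is the per-step noise variance and $\bar{\alpha}_t = \prod_{s=1}^{t}(1-\beta_s)$. This is a recap of the common convention from the DDPM literature, not a derivation specific to this chapter.

```latex
% Forward process: each step adds Gaussian noise with variance \beta_t,
% which yields a closed-form marginal for any timestep t.
q(x_t \mid x_{t-1}) = \mathcal{N}\!\left(x_t;\ \sqrt{1-\beta_t}\,x_{t-1},\ \beta_t \mathbf{I}\right)

q(x_t \mid x_0) = \mathcal{N}\!\left(x_t;\ \sqrt{\bar{\alpha}_t}\,x_0,\ (1-\bar{\alpha}_t)\mathbf{I}\right)

% Reverse process: a learned Gaussian that removes noise one step at a time.
p_\theta(x_{t-1} \mid x_t) = \mathcal{N}\!\left(x_{t-1};\ \mu_\theta(x_t, t),\ \Sigma_\theta(x_t, t)\right)

% Link to score matching: the score of the Gaussian forward marginal is
\nabla_{x_t} \log q(x_t \mid x_0) = -\frac{x_t - \sqrt{\bar{\alpha}_t}\,x_0}{1-\bar{\alpha}_t}
= -\frac{\epsilon}{\sqrt{1-\bar{\alpha}_t}}
```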
We will then examine standard noise scheduling techniques, such as linear and cosine schedules, evaluating their strengths and identifying where they fall short, for instance when a fixed schedule adds noise more aggressively than the data requires. Building on this analysis, you will learn how to design custom noise schedules tailored to specific needs and explore methods for implementing learned variance schedules, where the model itself determines appropriate noise levels during training.
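As a concrete point of reference before the dedicated sections below, the following minimal sketch computes the two standard schedules. The function names and default values (a linear ramp from 1e-4 to 0.02, the 0.008 offset and 0.999 clip in the cosine schedule) follow common convention and are assumptions for illustration, not values fixed by this chapter.

```python
import numpy as np

def linear_beta_schedule(timesteps, beta_start=1e-4, beta_end=0.02):
    """Linearly spaced per-step noise variances, as in the original DDPM setup."""
    return np.linspace(beta_start, beta_end, timesteps)

def cosine_beta_schedule(timesteps, s=0.008, max_beta=0.999):
    """Cosine schedule: define alpha_bar(t) directly, then recover the betas."""
    steps = np.arange(timesteps + 1)
    f = np.cos(((steps / timesteps) + s) / (1 + s) * np.pi / 2) ** 2
    alpha_bar = f / f[0]                      # normalize so alpha_bar(0) = 1
    betas = 1 - alpha_bar[1:] / alpha_bar[:-1]
    return np.clip(betas, 0, max_beta)        # clip to avoid singular steps near t = T

# Usage: compare how quickly each schedule destroys signal.
T = 1000
for name, betas in [("linear", linear_beta_schedule(T)), ("cosine", cosine_beta_schedule(T))]:
    alpha_bar = np.cumprod(1 - betas)
    print(f"{name:>6}: alpha_bar at t=T/2 is {alpha_bar[T // 2]:.4f}")
```

Note that the cosine schedule is specified through $\bar{\alpha}_t$ rather than through the per-step $\beta_t$ values directly, which keeps the signal from being destroyed too abruptly at either end of the forward process.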
By the end of this chapter, you will have refreshed your understanding of core diffusion principles and gained practical knowledge for implementing and evaluating advanced noise scheduling strategies, setting the stage for the more complex architectures and training methods ahead.
1.1 Recap: Denoising Diffusion Probabilistic Models (DDPM)
1.2 Recap: Denoising Diffusion Implicit Models (DDIM)
1.3 Mathematical Underpinnings: Score Matching and ODEs
1.4 Limitations of Standard Noise Schedules
1.5 Designing Custom Noise Schedules
1.6 Learned Variance Schedules
1.7 Hands-on Practical: Implementing Noise Schedule Variants