Consistency Models. Yang Song, Prafulla Dhariwal, Mark Chen, Ilya Sutskever. 2023. International Conference on Machine Learning (ICML). DOI: 10.48550/arXiv.2303.01469 - Introduces the consistency model framework, the core idea of one-step generation, and the consistency property itself. This is the foundational paper for the section's topic.
Score-Based Generative Modeling through Stochastic Differential Equations. Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole. 2021. International Conference on Learning Representations (ICLR). DOI: 10.48550/arXiv.2011.13456 - Establishes the continuous-time formulation of diffusion models using stochastic differential equations (SDEs) and probability flow ODEs, providing the theoretical basis for the ODE trajectories discussed.
Denoising Diffusion Probabilistic Models. Jonathan Ho, Ajay Jain, Pieter Abbeel. 2020. Advances in Neural Information Processing Systems (NeurIPS), Vol. 33. DOI: 10.48550/arXiv.2006.11239 - A seminal paper popularizing modern diffusion models, illustrating the iterative sampling process that consistency models aim to circumvent. Provides context for the 'standard diffusion samplers' mentioned.
Elucidating the Design Space of Diffusion-Based Generative Models. Tero Karras, Miika Aittala, Timo Aila, Samuli Laine. 2022. Advances in Neural Information Processing Systems (NeurIPS), Vol. 35. DOI: 10.48550/arXiv.2206.00364 - Provides a unified theoretical and practical framework for diffusion models, including insights into the ODE perspective and sampling strategies that enable fast inference.