Consistency Models. Yang Song, Prafulla Dhariwal, Mark Chen, Ilya Sutskever. ICML 2023. DOI: 10.48550/arXiv.2303.01469 - Introduces consistency models and the consistency distillation method for training them, allowing fast one-step generation.
Denoising Diffusion Probabilistic Models. Jonathan Ho, Ajay Jain, Pieter Abbeel. Advances in Neural Information Processing Systems (NeurIPS 2020). DOI: 10.48550/arXiv.2006.11239 - Presents the core architecture and training approach for denoising diffusion probabilistic models, often used as teacher models in consistency distillation.
Diffusion Models Beat GANs on Image Synthesis. Prafulla Dhariwal, Alex Nichol. Advances in Neural Information Processing Systems (NeurIPS 2021). DOI: 10.48550/arXiv.2105.05233 - Details how to train high-quality diffusion models with improved sampling efficiency, making them effective teacher models for distillation techniques.