Denoising Diffusion Implicit Models, Jiaming Song, Chenlin Meng, Stefano Ermon, 2021. International Conference on Learning Representations (ICLR). DOI: 10.48550/arXiv.2010.02502 - Introduces DDIM, a non-Markovian generative process for diffusion models that enables significantly faster and deterministic sampling without retraining (a sketch of its sampling update follows this list).
Denoising Diffusion Probabilistic Models, Jonathan Ho, Ajay Jain, Pieter Abbeel, 2020. Advances in Neural Information Processing Systems (NeurIPS). DOI: 10.48550/arXiv.2006.11239 - The foundational paper that introduced Denoising Diffusion Probabilistic Models (DDPMs), defining the forward and reverse processes.
Improved Denoising Diffusion Probabilistic Models, Alexander Quinn Nichol, Prafulla Dhariwal, 2021. Proceedings of the 38th International Conference on Machine Learning, Vol. 139 (PMLR). DOI: 10.48550/arXiv.2102.09672 - Proposes the cosine variance schedule and other improvements to DDPMs, enhancing sample quality and training stability.
Score-Based Generative Modeling through Stochastic Differential Equations, Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole, 2021. International Conference on Learning Representations (ICLR). DOI: 10.48550/arXiv.2011.13456 - Presents a unified framework for score-based generative models and diffusion models, highlighting the connection between deterministic sampling (like DDIM with η=0) and probability flow ODEs.
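As a rough orientation to how these references fit together, the following is a sketch in standard notation (not reproduced verbatim from any of the papers): the DDPM forward process defines the noising marginal, and DDIM generalizes the reverse step with a noise-scale parameter η.

\[
q(x_t \mid x_0) = \mathcal{N}\!\left(x_t;\ \sqrt{\bar\alpha_t}\,x_0,\ (1-\bar\alpha_t)\,I\right),
\qquad
\bar\alpha_t = \prod_{s=1}^{t} (1-\beta_s)
\]
\[
x_{t-1} = \sqrt{\bar\alpha_{t-1}}\,
\frac{x_t - \sqrt{1-\bar\alpha_t}\,\epsilon_\theta(x_t, t)}{\sqrt{\bar\alpha_t}}
\;+\; \sqrt{1-\bar\alpha_{t-1}-\sigma_t^2}\;\epsilon_\theta(x_t, t)
\;+\; \sigma_t z,
\qquad
\sigma_t = \eta\,\sqrt{\frac{1-\bar\alpha_{t-1}}{1-\bar\alpha_t}}\,\sqrt{1-\frac{\bar\alpha_t}{\bar\alpha_{t-1}}}
\]

Here \(\epsilon_\theta\) is the trained noise predictor and \(z \sim \mathcal{N}(0, I)\). Setting η = 1 recovers the stochastic DDPM sampler of Ho et al., while η = 0 removes the noise term and gives the deterministic DDIM sampler that Song et al. (2021) connect to the probability flow ODE.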