Black Box Variational Inference, Rajesh Ranganath, Sean Gerrish, David Blei, 2014, Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics (AISTATS), Vol. 33 - This foundational paper introduces Black Box Variational Inference (BBVI), a general framework that avoids model-specific analytical gradient derivations by estimating the gradient of the ELBO with Monte Carlo samples drawn from the variational distribution. It is the direct source for the section's core topic; a hedged sketch of its score-function estimator appears after this list.
Auto-Encoding Variational Bayes, Diederik P. Kingma, Max Welling, 2013, International Conference on Learning Representations (ICLR) Workshop Track. DOI: 10.48550/arXiv.1312.6114 - This seminal work introduced Variational Autoencoders and, in doing so, popularized the reparameterization trick (also known as the pathwise derivative estimator) as a low-variance method for estimating gradients in variational inference, making it directly relevant to the section's detailed treatment of that technique; the second sketch after this list illustrates the trick.
Automatic Differentiation Variational Inference, Alp Kucukelbir, Dustin Tran, Rajesh Ranganath, Andrew Gelman, David M. Blei, 2017, Journal of Machine Learning Research (JMLR), Vol. 18. DOI: 10.5555/3122009.3122010 - This paper presents Automatic Differentiation Variational Inference (ADVI), a practical realization of the black-box approach that transforms constrained parameters to an unconstrained space and leverages modern automatic differentiation tools within probabilistic programming languages such as Stan. It demonstrates how the general BBVI framework is applied in practice; the third sketch after this list imitates the recipe on a toy model.
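To make the BBVI idea concrete, here is a minimal JAX sketch of the score-function (REINFORCE) gradient estimator, not the paper's code: the toy Gaussian model, the one-dimensional Gaussian variational family, and the step sizes and sample counts are all illustrative assumptions.

```python
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm

def log_q(z, lam):
    # Gaussian variational family q(z; lambda), lambda = (mu, log_sigma). Illustrative choice.
    mu, log_sigma = lam
    return norm.logpdf(z, mu, jnp.exp(log_sigma))

def log_joint(z, x=1.0):
    # Toy model (an assumption for this sketch): z ~ N(0, 1) prior, one observation x ~ N(z, 1).
    return norm.logpdf(z, 0.0, 1.0) + norm.logpdf(x, z, 1.0)

def bbvi_grad(lam, key, num_samples=1000):
    # Score-function estimator of the ELBO gradient:
    # E_q[ grad_lambda log q(z; lambda) * (log p(x, z) - log q(z; lambda)) ].
    mu, log_sigma = lam
    z = mu + jnp.exp(log_sigma) * jax.random.normal(key, (num_samples,))
    score = jax.vmap(jax.grad(log_q, argnums=1), in_axes=(0, None))(z, lam)
    weight = log_joint(z) - log_q(z, lam)
    return jnp.mean(score * weight[:, None], axis=0)

lam = jnp.array([0.0, 0.0])                      # (mu, log_sigma)
for step in range(500):
    lam = lam + 0.02 * bbvi_grad(lam, jax.random.PRNGKey(step))
print(lam[0], jnp.exp(lam[1]))                   # exact posterior here is N(0.5, 1/sqrt(2))
```

Note that the only model-specific ingredient is `log_joint`; this is what makes the estimator "black box", at the cost of higher variance than the pathwise estimator sketched next.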
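The reparameterization trick can be sketched with the same toy model (again an assumption, not the paper's code): write z = mu + sigma * eps with eps ~ N(0, 1), then differentiate a Monte Carlo ELBO estimate directly through the sampling path with automatic differentiation.

```python
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm

def log_joint(z, x=1.0):
    # Same illustrative toy model: N(0, 1) prior, one observation x ~ N(z, 1).
    return norm.logpdf(z, 0.0, 1.0) + norm.logpdf(x, z, 1.0)

def neg_elbo(lam, eps):
    # Reparameterized Monte Carlo estimate of the negative ELBO.
    mu, log_sigma = lam
    sigma = jnp.exp(log_sigma)
    z = mu + sigma * eps                          # pathwise (reparameterized) samples
    log_q = norm.logpdf(z, mu, sigma)
    return -jnp.mean(log_joint(z) - log_q)

grad_fn = jax.jit(jax.grad(neg_elbo))             # pathwise gradient via autodiff

lam = jnp.array([0.0, 0.0])                       # (mu, log_sigma)
for step in range(500):
    eps = jax.random.normal(jax.random.PRNGKey(step), (200,))
    lam = lam - 0.05 * grad_fn(lam, eps)
print(lam[0], jnp.exp(lam[1]))                    # again approaches N(0.5, 1/sqrt(2))
```

Because the gradient flows through z itself, this estimator typically needs far fewer samples per step than the score-function version above, which is the variance advantage the annotation refers to.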
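Finally, a hedged sketch of the ADVI recipe rather than a call into Stan itself: transform a constrained parameter to the real line, add the log-Jacobian of the transform, place a Gaussian over the unconstrained variable, and let automatic differentiation supply the reparameterization gradients. The Poisson-Gamma toy model and every name below are assumptions chosen so the exact posterior is known.

```python
import jax
import jax.numpy as jnp
from jax.scipy.special import gammaln
from jax.scipy.stats import norm

y = jnp.array([3.0, 5.0, 4.0, 6.0, 2.0])          # toy Poisson counts (illustrative data)
a, b = 2.0, 1.0                                   # Gamma(shape a, rate b) prior on theta > 0

def log_joint_unconstrained(zeta):
    # theta = exp(zeta) maps the real line onto the constrained space theta > 0;
    # the trailing "+ zeta" is the log-Jacobian of that transform.
    theta = jnp.exp(zeta)
    log_prior = a * jnp.log(b) - gammaln(a) + (a - 1) * jnp.log(theta) - b * theta
    log_lik = jnp.sum(y * jnp.log(theta) - theta - gammaln(y + 1.0))
    return log_prior + log_lik + zeta

def neg_elbo(lam, eps):
    # Gaussian q over the unconstrained zeta, reparameterized draws, Monte Carlo ELBO.
    mu, log_sigma = lam
    zeta = mu + jnp.exp(log_sigma) * eps
    log_q = norm.logpdf(zeta, mu, jnp.exp(log_sigma))
    return -jnp.mean(jax.vmap(log_joint_unconstrained)(zeta) - log_q)

grad_fn = jax.jit(jax.grad(neg_elbo))              # autodiff supplies the gradients
lam = jnp.array([0.0, 0.0])
for step in range(1000):
    eps = jax.random.normal(jax.random.PRNGKey(step), (50,))
    lam = lam - 0.02 * grad_fn(lam, eps)

# The conjugate posterior is Gamma(a + sum(y), b + len(y)), mean 22/6 ~ 3.67;
# exp(mu) should land near its bulk.
print(jnp.exp(lam[0]), (a + y.sum()) / (b + len(y)))
```

In a probabilistic programming language the transform, the Jacobian, and the gradients are all generated automatically from the model description; the sketch only spells out what that machinery does on one parameter.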