Mixed Precision Training, Paulius Micikevicius, Sharan Narang, Jonah Alben, Gregory Diamos, Erich Elsen, David Garcia, Boris Ginsburg, Michael Houston, Oleksii Kuchaiev, Ganesh Venkatesh, Hao Wu, 2018, International Conference on Learning Representations (ICLR), DOI: 10.48550/arXiv.1710.03740 - Introduced the fundamental techniques for mixed-precision training, including FP32 master weights and loss scaling, which form the basis of modern implementations.
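The loss-scaling idea from this paper can be illustrated with a small, framework-free sketch. The scale factor 1024 and the gradient value are illustrative assumptions, not numbers from the paper:

```python
# Sketch of loss scaling: multiply the loss by a constant so that tiny
# FP16 gradients do not underflow to zero, then divide the gradients by
# the same constant (in FP32) before the master-weight update.
import numpy as np

SCALE = np.float32(1024.0)       # illustrative static loss scale
true_grad = 1e-8                 # a gradient below FP16's representable range

# Without scaling: casting the gradient to FP16 underflows to zero.
naive = np.float16(true_grad)    # becomes 0.0

# With loss scaling: gradients of (loss * SCALE) are SCALE times larger,
# so they survive the cast to FP16 ...
scaled = np.float16(true_grad * SCALE)

# ... and are unscaled in FP32 before updating the FP32 master weights,
# recovering (approximately) the true gradient.
restored = np.float32(scaled) / SCALE
```

The unscaling step is why the technique changes nothing mathematically: the optimizer still sees gradients of the original loss, just computed along a numerically safer path.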
Automatic Mixed Precision (AMP) Examples, PyTorch Contributors, 2024 - Official guide providing practical examples and details for implementing mixed-precision training using PyTorch's torch.cuda.amp module.
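The pattern from this guide can be sketched as a minimal training step; the toy model, data, and hyperparameters below are illustrative assumptions, and the sketch falls back to plain FP32 when no CUDA GPU is present:

```python
# Minimal mixed-precision training step with torch.cuda.amp, following the
# autocast + GradScaler pattern from the PyTorch AMP examples.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
model = torch.nn.Linear(16, 4).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# GradScaler handles dynamic loss scaling; disabled (no-op) on CPU.
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for _ in range(3):
    x = torch.randn(8, 16, device=device)
    y = torch.randn(8, 4, device=device)
    optimizer.zero_grad()
    # autocast runs eligible ops in FP16 on GPU; no-op here on CPU.
    with torch.cuda.amp.autocast(enabled=(device == "cuda")):
        loss = torch.nn.functional.mse_loss(model(x), y)
    scaler.scale(loss).backward()  # scale loss so FP16 grads don't underflow
    scaler.step(optimizer)         # unscales grads; skips step on inf/NaN
    scaler.update()                # adjusts the scale factor dynamically
```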
Mixed precision, TensorFlow Team, 2023 - Official guide for enabling mixed-precision training within TensorFlow Keras, explaining policy configuration and automatic loss scaling.
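The policy configuration this guide describes can be sketched as follows; the tiny model is an illustrative assumption:

```python
# Sketch of enabling the Keras mixed-precision policy, as described in the
# TensorFlow mixed precision guide.
import tensorflow as tf

# One global policy: compute in float16, keep variables in float32.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    # Keep the final layer's outputs in float32 for numeric stability.
    tf.keras.layers.Dense(1, dtype="float32"),
])
# Under this policy, compile wraps the optimizer so loss scaling is
# applied automatically, with no manual scaling code.
model.compile(optimizer="adam", loss="mse")
```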