Mixed-Precision Training, Paulius Micikevicius, Sharan Narang, Jonah Alben, Gregory Diamos, Erich Elsen, David Garcia, Boris Ginsburg, Michael Houston, Oleksii Kuchaiev, Ganesh Venkatesh, Hao Wu, ICLR 2018, DOI: 10.48550/arXiv.1710.03740 - Introduces the foundational techniques and benefits of mixed-precision training, including loss scaling, which are implemented in modern deep learning frameworks.
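The paper's core recipe keeps an FP32 master copy of the weights and multiplies the loss by a constant before backpropagation so that small FP16 gradients do not flush to zero. The following is a minimal sketch of static loss scaling only; the model, data, and scale value are illustrative placeholders, the FP32 master-weight copy is omitted for brevity, and a CUDA device is assumed:

```python
import torch

# Illustrative static loss scaling (requires a CUDA device).
# The loss is scaled up before backward() and the gradients are
# divided by the same factor before the weight update.
model = torch.nn.Linear(512, 512).half().cuda()   # placeholder FP16 model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_scale = 1024.0  # power of two, chosen empirically in the paper

inputs = torch.randn(8, 512, device="cuda", dtype=torch.float16)
targets = torch.randn(8, 512, device="cuda", dtype=torch.float16)

optimizer.zero_grad()
loss = torch.nn.functional.mse_loss(model(inputs), targets)
(loss * loss_scale).backward()           # scale up before backprop
for p in model.parameters():
    if p.grad is not None:
        p.grad.div_(loss_scale)          # unscale before the optimizer step
optimizer.step()
```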
Automatic Mixed Precision (AMP), PyTorch Authors, 2025 - Official PyTorch documentation providing a detailed API reference and practical usage guidance for torch.cuda.amp, autocast, and GradScaler.
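For reference, the training-step pattern documented there pairs autocast (per-op dtype selection in the forward pass) with GradScaler (dynamic loss scaling); the model, optimizer, and data below are placeholders:

```python
import torch

# Standard torch.cuda.amp training step (requires a CUDA device).
model = torch.nn.Linear(512, 512).cuda()          # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()

inputs = torch.randn(8, 512, device="cuda")
targets = torch.randn(8, 512, device="cuda")

optimizer.zero_grad()
with torch.cuda.amp.autocast():          # forward pass runs in FP16/FP32 as appropriate
    loss = torch.nn.functional.mse_loss(model(inputs), targets)
scaler.scale(loss).backward()            # scaled loss produces scaled gradients
scaler.step(optimizer)                   # unscales grads, skips the step on inf/NaN
scaler.update()                          # adjusts the scale factor dynamically
```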
Mixed precision, TensorFlow Authors, 2024 - Official TensorFlow guide on implementing mixed precision training, offering a direct comparison for developers transitioning from TensorFlow to PyTorch.
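As a point of comparison, the TensorFlow guide centers on a global dtype policy rather than PyTorch's per-region context manager. A minimal Keras sketch under that guide's approach (layer sizes are placeholders) might look like:

```python
import tensorflow as tf

# Enable mixed precision globally: compute in float16, keep variables in float32.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(512,)),
    tf.keras.layers.Dense(512),
])
# With compile()/fit(), Keras applies loss scaling automatically; custom
# training loops wrap the optimizer explicitly via
# tf.keras.mixed_precision.LossScaleOptimizer.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
```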