Mixed Precision Training, Paulius Micikevicius, Sharan Narang, Jonah Alben, Gregory Diamos, Erich Elsen, David Garcia, Boris Ginsburg, Michael Houston, Oleksii Kuchaiev, Ganesh Venkatesh, Hao Wu, 2018 (ICLR), DOI: 10.48550/arXiv.1710.03740 - Introduces the fundamental concepts and benefits of mixed-precision training, including loss scaling, which are core to the section content.
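As a quick orientation to the paper's core idea, here is a minimal PyTorch-style sketch of static loss scaling with FP32 master weights, roughly as the paper describes it; the toy model, synthetic data, learning rate, and the scale value of 1024 are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

# Toy FP16 model plus FP32 "master" copies of the weights (illustrative only).
model = nn.Linear(16, 4).cuda().half()
master_params = [p.detach().clone().float() for p in model.parameters()]
loss_fn = nn.MSELoss()

loss_scale = 1024.0  # static scale S: large enough that small FP16 gradients don't flush to zero
lr = 1e-3

for _ in range(10):
    x = torch.randn(8, 16, device='cuda', dtype=torch.float16)
    y = torch.randn(8, 4, device='cuda', dtype=torch.float16)

    model.zero_grad()
    loss = loss_fn(model(x), y)
    (loss * loss_scale).backward()                   # scale the loss before backprop

    with torch.no_grad():
        for p, mp in zip(model.parameters(), master_params):
            grad = p.grad.float() / loss_scale       # unscale the gradient in FP32
            mp -= lr * grad                          # update the FP32 master weights
            p.copy_(mp.half())                       # copy back into the FP16 model
```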
Automatic Mixed Precision (AMP), PyTorch Contributors, 2025 (PyTorch) - Official documentation for PyTorch's torch.cuda.amp, detailing its usage and implementation for mixed-precision training.
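To make the API this reference documents concrete, here is a minimal training-loop sketch using torch.cuda.amp's autocast and GradScaler; the toy model, optimizer choice, and synthetic data are assumptions for illustration, not part of the documentation.

```python
import torch
import torch.nn as nn
from torch.cuda.amp import autocast, GradScaler

model = nn.Linear(16, 4).cuda()                      # toy model, illustrative only
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
scaler = GradScaler()                                # handles dynamic loss scaling

for _ in range(10):
    x = torch.randn(8, 16, device='cuda')
    y = torch.randn(8, 4, device='cuda')

    optimizer.zero_grad()
    with autocast():                                 # ops run in FP16 or FP32 as appropriate
        loss = loss_fn(model(x), y)

    scaler.scale(loss).backward()                    # backward on the scaled loss
    scaler.step(optimizer)                           # unscales grads; skips the step on inf/NaN
    scaler.update()                                  # adjusts the scale factor for the next step
```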
Mixed precision, TensorFlow Team, 2023 (TensorFlow) - Official guide for implementing mixed precision training in TensorFlow and Keras, covering tf.keras.mixed_precision.
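Similarly, a minimal Keras sketch of the global-policy approach that guide describes; the toy model, data, and layer sizes are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, mixed_precision

mixed_precision.set_global_policy('mixed_float16')   # compute in float16, keep variables in float32

model = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),
    layers.Dense(64, activation='relu'),
    layers.Dense(4),
    # The guide recommends keeping the model outputs in float32 for numerical stability.
    layers.Activation('linear', dtype='float32'),
])

# With the 'mixed_float16' policy in effect, Keras applies loss scaling
# automatically when training through model.fit.
model.compile(optimizer='adam', loss='mse')

x = np.random.randn(128, 16).astype('float32')       # toy data, illustrative only
y = np.random.randn(128, 4).astype('float32')
model.fit(x, y, batch_size=32, epochs=1)
```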
Training with Mixed Precision, NVIDIA, 2020 (NVIDIA) - An accessible overview and practical guide from NVIDIA explaining mixed precision training, its advantages, and how it works with Tensor Cores.