Mixed Precision Training, Paulius Micikevicius, Sharan Narang, Jonah Alben, Gregory F. Diamos, Erich Elsen, David Garcia, Boris Ginsburg, Michael Houston, Oleksii Kuchaiev, Ganesh Venkatesh, Hao Wu, 2018 (International Conference on Learning Representations, ICLR). DOI: 10.48550/arXiv.1710.03740 - The original research paper introducing mixed-precision training with dynamic loss scaling, detailing its principles and benefits.
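To make the paper's dynamic loss scaling concrete, here is a minimal, framework-agnostic sketch in plain Python. The class name is hypothetical, and the constants (initial scale 2^15, growth factor 2.0, backoff factor 0.5, growth interval 2000) are illustrative assumptions mirroring common defaults rather than values mandated by the paper.

```python
class DynamicLossScaler:
    """Sketch of dynamic loss scaling: grow the scale after a run of
    overflow-free steps, back off (and skip the update) on overflow."""

    def __init__(self, init_scale=2.0**15, growth_factor=2.0,
                 backoff_factor=0.5, growth_interval=2000):
        self.scale = init_scale
        self.growth_factor = growth_factor
        self.backoff_factor = backoff_factor
        self.growth_interval = growth_interval
        self.steps_since_overflow = 0

    def scale_loss(self, loss):
        # Multiplying the loss by S multiplies every gradient by S,
        # shifting small FP16 gradients back into representable range.
        # Gradients must be divided by S again before the weight update.
        return loss * self.scale

    def update(self, found_overflow):
        if found_overflow:
            # Inf/NaN gradients: discard this step and shrink the scale.
            self.scale *= self.backoff_factor
            self.steps_since_overflow = 0
        else:
            self.steps_since_overflow += 1
            if self.steps_since_overflow >= self.growth_interval:
                # A long run of clean steps suggests headroom to grow.
                self.scale *= self.growth_factor
                self.steps_since_overflow = 0
```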
Automatic Mixed Precision (AMP) for Deep Learning Training, NVIDIA Developer Documentation, 2023 (NVIDIA) - A comprehensive guide from NVIDIA explaining the advantages, implementation details, and best practices for mixed-precision training.
Automatic Mixed Precision (AMP), PyTorch Authors, 2025 (PyTorch Documentation) - Official PyTorch documentation providing detailed usage instructions and examples for torch.cuda.amp, including autocast and GradScaler.
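As a quick orientation to the documented API, a minimal sketch of one AMP training step using torch.cuda.amp.autocast and GradScaler; the model, optimizer, and loss function here are placeholder choices, and inputs/targets are assumed to be CUDA tensors.

```python
import torch

model = torch.nn.Linear(512, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()

def train_step(inputs, targets):
    optimizer.zero_grad()
    # Forward pass under autocast: eligible ops run in FP16.
    with torch.cuda.amp.autocast():
        outputs = model(inputs)
        loss = torch.nn.functional.cross_entropy(outputs, targets)
    # Scale the loss so small FP16 gradients don't underflow, then backprop.
    scaler.scale(loss).backward()
    # step() unscales gradients and skips the update if inf/NaN is found;
    # update() adjusts the loss scale dynamically for the next iteration.
    scaler.step(optimizer)
    scaler.update()
    return loss.item()
```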
Mixed precision, TensorFlow Authors, 2023 (TensorFlow Documentation) - Official TensorFlow guide on enabling and configuring mixed-precision training using tf.keras.mixed_precision.set_global_policy within the Keras API.
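For comparison, a minimal sketch of enabling the documented Keras policy; the toy model and compile settings are placeholder assumptions, while the set_global_policy('mixed_float16') call and the float32 output activation follow the guide's recommendations.

```python
import tensorflow as tf

# Under 'mixed_float16', layers compute in FP16 but keep variables in FP32.
tf.keras.mixed_precision.set_global_policy('mixed_float16')

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu', input_shape=(784,)),
    tf.keras.layers.Dense(10),
    # Keep the final softmax in FP32 for numeric stability, as the guide
    # recommends for model outputs.
    tf.keras.layers.Activation('softmax', dtype='float32'),
])

# Model.fit applies dynamic loss scaling automatically under this policy;
# custom training loops would wrap the optimizer in
# tf.keras.mixed_precision.LossScaleOptimizer instead.
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```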