Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, 2016 (MIT Press) - Comprehensive coverage of optimization algorithms, including gradient descent variants, momentum, and adaptive methods like Adam and RMSprop.
Keras Optimizers, Keras team, 2024 - Official Keras API documentation for optimization algorithms, with practical usage examples and parameter configurations.
Optimizers and Learning Rate Scheduling, Stanford University CS231n team, 2023 - Detailed explanations of various optimization algorithms, including RMSprop and momentum, along with different learning rate scheduling strategies.