Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, 2016 (MIT Press) - This book offers a comprehensive theoretical and practical discussion of L1, L2 (weight decay), and Dropout regularization within the context of deep neural networks.
Dropout: A Simple Way to Prevent Overfitting in Neural Networks, Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov, 2014, Journal of Machine Learning Research, Vol. 15(56) (JMLR) - The seminal paper introducing Dropout as a regularization technique for neural networks, explaining its mechanism and effectiveness in preventing co-adaptation.
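The mechanism the paper describes, randomly zeroing units during training, can be sketched in a few lines of NumPy. This is an illustrative "inverted dropout" variant (rescaling at training time rather than at test time, as most modern frameworks do), not the paper's reference implementation:

```python
import numpy as np

def dropout(activations, rate, training=True, rng=None):
    """Inverted dropout: zero each unit with probability `rate` and
    rescale survivors by 1/(1 - rate) so the expected activation
    is unchanged between training and inference."""
    if not training or rate == 0.0:
        # At inference time the layer is the identity function.
        return activations
    rng = np.random.default_rng() if rng is None else rng
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob
```

Because each unit's expected output is preserved, no weight rescaling is needed when the network is used for prediction.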
Keras Regularizers API, Keras Team, 2024 - Official Keras documentation for applying L1, L2, and L1_L2 regularization to layers, including code examples and parameter details.
Keras Dropout Layer, Keras Team, 2024 - Official Keras documentation explaining the Dropout layer, its usage, and parameters for implementing dropout regularization.
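Taken together, the two Keras references cover everything needed to apply these techniques in practice. A minimal sketch combining both (layer sizes, regularization coefficients, and the dropout rate are illustrative choices, not values from the documentation):

```python
from tensorflow import keras

# A small classifier using both weight regularization and dropout.
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(
        64,
        activation="relu",
        # L1 and L2 penalties on this layer's weights are added
        # to the training loss (L2 acts as weight decay).
        kernel_regularizer=keras.regularizers.L1L2(l1=1e-5, l2=1e-4),
    ),
    # Randomly zeroes 50% of the previous layer's outputs during
    # training; inactive at inference time.
    keras.layers.Dropout(0.5),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

The regularizers apply only to the layers they are attached to, while `Dropout` is a standalone layer acting on the outputs of whatever precedes it.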