Random Forests, Leo Breiman, 2001, Machine Learning, Vol. 45, DOI: 10.1023/A:1010933404324 - A foundational paper introducing the Random Forest algorithm, detailing how bootstrapping and random feature selection enhance predictive accuracy and prevent overfitting.
Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov, 2014, Journal of Machine Learning Research, Vol. 15 - The seminal paper introducing dropout as a regularization technique for neural networks, explaining its mechanism and effectiveness in reducing overfitting.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Trevor Hastie, Robert Tibshirani, Jerome Friedman, 2009, 2nd edition (Springer) - A comprehensive textbook covering statistical learning methods, including ensemble methods (bagging, random forests) and regularization, providing theoretical foundations.
Deep Learning, Ian Goodfellow, Yoshua Bengio, Aaron Courville, 2016 (MIT Press) - An authoritative textbook on deep learning, featuring detailed explanations of dropout, stochastic gradient descent, and other neural network optimization and regularization techniques.