Keras layers API, TensorFlow Developers, 2024 - Official documentation providing detailed information and examples for all Keras layers.
Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, 2016 (MIT Press) - A comprehensive textbook covering the theoretical foundations and mathematical principles of various neural network layer types, including Dense, Conv2D, pooling, Dropout, and LSTM layers (several of which appear in the sketch following this list).
Gradient-Based Learning Applied to Document Recognition, Yann LeCun, Léon Bottou, Yoshua Bengio, and Patrick Haffner, 1998, Proceedings of the IEEE, Vol. 86 (IEEE), DOI: 10.1109/5.726791 - Foundational paper introducing Convolutional Neural Networks (LeNet-5) for image recognition, demonstrating the original application of convolutional (Conv2D) and pooling layers.
Long Short-Term Memory, Sepp Hochreiter and Jürgen Schmidhuber, 1997, Neural Computation, Vol. 9 (MIT Press), DOI: 10.1162/neco.1997.9.8.1735 - The original paper proposing Long Short-Term Memory (LSTM) networks, which address the vanishing gradient problem in recurrent neural networks for sequence processing.
Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov, 2014, Journal of Machine Learning Research, Vol. 15 (JMLR), DOI: 10.5555/2627435.2670313 - Introduces Dropout as a regularization technique that reduces overfitting in neural networks by randomly omitting units during training.
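For orientation, the following is a minimal Keras sketch, not taken from any of the references above, that combines several of the layer types they cover: Conv2D and pooling layers, Dropout, and a Dense output layer. The 28x28 single-channel input shape and 10-class output are illustrative assumptions, not values from the cited works.

```python
# Minimal illustrative model combining layer types covered by the references above:
# Conv2D and pooling (LeCun et al., 1998), Dropout (Srivastava et al., 2014), and Dense.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),                        # assumed 28x28 grayscale input
    layers.Conv2D(32, kernel_size=3, activation="relu"),   # learn local spatial features
    layers.MaxPooling2D(pool_size=2),                      # downsample feature maps
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dropout(0.5),                                   # randomly drop units during training
    layers.Dense(10, activation="softmax"),                # assumed 10-class output
])

model.summary()
```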