Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, 2016 (MIT Press) - Provides a comprehensive theoretical and mathematical foundation for neural network layers, including fully connected layers, weights, biases, and activation functions.
Layers and Models, The Flux.jl Contributors, 2025 (Flux.jl Documentation) - Official documentation for Flux.jl, detailing the Dense layer's constructor, arguments, and behavior, serving as an authoritative guide for its implementation.
Understanding the difficulty of training deep feedforward neural networks, Xavier Glorot and Yoshua Bengio, 2010, Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics (AISTATS 2010), Vol. 9 (JMLR.org) - Introduces the Glorot (Xavier) initialization strategy for neural network weights, which is the default for Dense layers in Flux.jl.
Neural Networks: The Basics, Andrej Karpathy, Justin Johnson, and Serena Yeung, 2023 - Provides a clear, accessible introduction to the foundational concepts of neural networks, including the structure and operation of fully connected layers, weights, and biases.
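The Glorot (Xavier) initialization referenced above draws each weight uniformly from the interval [-limit, limit], where limit = sqrt(6 / (fan_in + fan_out)). A minimal sketch in plain Python follows; the function name `glorot_uniform` is our own, and this is an illustration of the scheme, not Flux.jl's actual implementation:

```python
import math
import random


def glorot_uniform(fan_in, fan_out, rng=random):
    """Sample a (fan_out x fan_in) weight matrix from U(-limit, limit),
    with limit = sqrt(6 / (fan_in + fan_out)), following Glorot & Bengio (2010).
    Illustrative sketch; not the Flux.jl implementation."""
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [
        [rng.uniform(-limit, limit) for _ in range(fan_in)]
        for _ in range(fan_out)
    ]


# Weights for a dense layer mapping 4 inputs to 3 outputs.
W = glorot_uniform(fan_in=4, fan_out=3)
```

The fan-based bound keeps activation and gradient variances roughly constant across layers, which is what motivates its use as a default for fully connected layers.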