Deep Learning, Ian Goodfellow, Yoshua Bengio, Aaron Courville, 2016 (MIT Press) - A comprehensive textbook covering the theoretical foundations and practical aspects of deep learning, with dedicated sections explaining activation functions, their characteristics, and their impact on network training.
Activation Functions, The Flux.jl Community, 2024 - The official documentation for activation functions available in the Flux.jl library, providing usage examples and implementation details directly relevant to building models with Flux.
Rectifier Nonlinearities Improve Neural Network Acoustic Models, Andrew L. Maas, Awni Y. Hannun, Andrew Y. Ng, 2013 (Proceedings of the 30th International Conference on Machine Learning, PMLR Vol. 28), DOI: 10.5591/978-1-57735-611-3-A109 - Introduces and evaluates Leaky Rectified Linear Units (Leaky ReLUs), demonstrating their effectiveness in mitigating the "dying ReLU" problem by allowing a small, non-zero gradient for negative inputs (a short Flux.jl sketch follows this list).
Dive into Deep Learning, Aston Zhang, Zachary C. Lipton, Mu Li, Alexander J. Smola, 2024 (Cambridge University Press) - An interactive, accessible online textbook that offers both theoretical explanations and practical implementations of deep learning models, including detailed discussions of various activation functions and their applications.
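To make the Leaky ReLU behaviour described in Maas et al. concrete, here is a minimal sketch using the activation functions exported by Flux.jl (as covered in the documentation entry above). It assumes a recent Flux version; the layer sizes are arbitrary, and the 0.01 negative slope is Flux's default rather than anything prescribed by the paper.

```julia
# Minimal sketch: Leaky ReLU keeps a small non-zero slope for negative
# inputs, which is what lets gradients flow where a plain ReLU is flat.
using Flux

x = -3.0f0
relu(x)        # 0.0f0   -> zero gradient for x < 0 (the "dying ReLU" issue)
leakyrelu(x)   # -0.03f0 -> slope of 0.01 for x < 0, so the gradient is non-zero

# Activations are passed directly when constructing layers:
layer = Dense(4 => 2, leakyrelu)  # applied elementwise to the layer's output
y = layer(randn(Float32, 4))
```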