Building on our understanding of single-variable functions, we now turn our attention to functions that depend on multiple inputs. This is essential because most machine learning models, from linear regression with several features to complex neural networks, involve optimizing functions of many variables or parameters. Analyzing how these functions change as we adjust their inputs requires extending the concepts of single-variable calculus.
In this chapter, you will learn the fundamentals of multivariable calculus relevant to machine learning, organized into the sections listed below.
These concepts form the basis for navigating and optimizing high-dimensional surfaces, such as the cost functions minimized during model training. Mastering gradients is key to understanding the optimization algorithms discussed later. We'll conclude with a practical exercise using NumPy to compute gradients numerically; a brief preview of that computation appears after the section list.
3.1 Functions of Multiple Variables
3.2 Partial Derivatives
3.3 The Gradient Vector
3.4 Directional Derivatives
3.5 The Hessian Matrix
3.6 Multivariable Optimization Concepts
3.7 Hands-on Practical: Computing Gradients with NumPy
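As a preview of the hands-on practical in Section 3.7, here is a minimal sketch of estimating a gradient numerically with NumPy via central differences. The function name `numerical_gradient`, the step size, and the test function are illustrative assumptions, not code taken from the chapter itself.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-5):
    """Estimate the gradient of f at x using central differences.

    f : callable taking a 1-D NumPy array and returning a scalar.
    x : 1-D NumPy array, the point at which to estimate the gradient.
    h : step size for the finite-difference approximation (illustrative choice).
    """
    grad = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        step = np.zeros_like(x, dtype=float)
        step[i] = h
        # Central difference along coordinate i: (f(x + h*e_i) - f(x - h*e_i)) / (2h)
        grad[i] = (f(x + step) - f(x - step)) / (2 * h)
    return grad

# Example: f(x, y) = x^2 + 3y has analytic gradient (2x, 3).
f = lambda v: v[0]**2 + 3 * v[1]
print(numerical_gradient(f, np.array([1.0, 2.0])))  # approximately [2. 3.]
```

Central differences are used here because their error shrinks quadratically in `h`, whereas the one-sided (forward) difference is only first-order accurate.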