Okay, you've learned how to find the derivative of functions like f(x) = x^n using the Power Rule. That's a great start! But what about functions that are even simpler, like f(x) = 5? Or functions that are combinations of terms, like f(x) = x^2 + x?
We need a couple more rules to handle these situations. Fortunately, these rules are quite intuitive and follow directly from the idea of the derivative as a rate of change or the slope of a tangent line. In this section, we'll cover how to find the derivatives of constants and sums (or differences) of functions.
Let's start with the simplest type of function: a constant function. Consider the function:
f(x)=5
This function always outputs the value 5, no matter what input value x you give it. What does its graph look like? It's a horizontal line passing through 5 on the y-axis.
The graph of a constant function f(x)=c is a horizontal line.
Remember that the derivative represents the slope of the tangent line to the function's graph. What's the slope of a horizontal line? It's zero. The line isn't rising or falling; its rate of change is zero everywhere.
This gives us our first rule:
The Constant Rule: If c is a constant number, then the derivative of the function f(x)=c is zero.
Using derivative notation:
If f(x)=c, then f′(x)=0.
Alternatively, using Leibniz notation:
d/dx (c) = 0
This makes sense intuitively: if a quantity never changes (it's constant), its rate of change is zero.
Examples:
- If f(x) = 7, then f′(x) = 0.
- If g(x) = −3, then g′(x) = 0.
- If h(x) = π, then h′(x) = 0.
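To see this numerically, here is a minimal Python sketch that estimates the slope of f(x) = 5 with a central-difference approximation; the helper function, step size, and test points are illustrative choices, not part of the rule itself.

```python
# Numerical sanity check of the Constant Rule.
# The function name f, the test points, and the step size h are illustrative choices.
def f(x):
    return 5.0  # the constant function f(x) = 5

def numerical_derivative(func, x, h=1e-6):
    # Central-difference approximation of the derivative at x
    return (func(x + h) - func(x - h)) / (2 * h)

for x in [-3.0, 0.0, 2.5]:
    print(x, numerical_derivative(f, x))  # each estimate is 0 (or extremely close to it)
```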
Now, what if we have a function formed by adding or subtracting simpler functions? For instance, consider:
h(x) = x^2 + x
We know how to find the derivative of x^2 (it's 2x by the Power Rule) and the derivative of x (which is x^1, so its derivative is 1 · x^0 = 1, also by the Power Rule). How does the addition affect the derivative?
Calculus provides a straightforward rule: the derivative of a sum of functions is simply the sum of their individual derivatives. The same applies to differences.
The Sum Rule: If h(x)=f(x)+g(x), then h′(x)=f′(x)+g′(x).
In Leibniz notation:
d/dx [f(x) + g(x)] = d/dx f(x) + d/dx g(x)
The Difference Rule: If h(x)=f(x)−g(x), then h′(x)=f′(x)−g′(x).
In Leibniz notation:
d/dx [f(x) − g(x)] = d/dx f(x) − d/dx g(x)
Essentially, you can differentiate a function term by term.
Let's apply this to our example h(x) = x^2 + x. The derivative of x^2 is 2x and the derivative of x is 1, so:
h′(x) = 2x + 1
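If you'd like to verify this numerically, the short Python sketch below compares a central-difference estimate of h′(x) with the term-by-term result 2x + 1; the helper names and test points are just illustrative.

```python
# Numerical check of the Sum Rule for h(x) = x^2 + x.
def h(x):
    return x**2 + x

def h_prime(x):
    return 2 * x + 1  # result from differentiating term by term

def numerical_derivative(func, x, step=1e-6):
    # Central-difference approximation of the derivative at x
    return (func(x + step) - func(x - step)) / (2 * step)

for x in [-1.0, 0.0, 3.0]:
    print(x, numerical_derivative(h, x), h_prime(x))  # the two derivative columns agree closely
```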
What about a function like f(x) = 3x^2? Here, x^2 is multiplied by a constant, 3. There's a simple rule for this too. Constant factors just "tag along" when you differentiate.
The Constant Multiple Rule: If h(x)=c⋅f(x), where c is a constant, then h′(x)=c⋅f′(x).
In Leibniz notation:
d/dx [c · f(x)] = c · d/dx f(x)
So, for f(x) = 3x^2:
f′(x) = 3 · d/dx (x^2) = 3 · 2x = 6x
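As a quick symbolic check, assuming the SymPy library is available, you can confirm that differentiating 3x^2 directly gives the same result as pulling the constant out first:

```python
# Symbolic check of the Constant Multiple Rule, assuming SymPy is installed.
import sympy as sp

x = sp.symbols('x')
lhs = sp.diff(3 * x**2, x)   # differentiate 3*x^2 directly
rhs = 3 * sp.diff(x**2, x)   # pull the constant out, then differentiate x^2
print(lhs, rhs, sp.simplify(lhs - rhs) == 0)  # 6*x 6*x True
```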
Now we can combine the Constant Rule, Sum/Difference Rule, Constant Multiple Rule, and Power Rule to differentiate any polynomial function. A polynomial is just a sum of terms where each term is a constant multiplied by x raised to a non-negative integer power (like 5x^3 − 2x + 7).
Let's find the derivative of p(x) = 4x^3 − 5x^2 + x − 2.
We can differentiate this term by term:
- d/dx (4x^3) = 4 · 3x^2 = 12x^2 (Constant Multiple and Power Rules)
- d/dx (−5x^2) = −5 · 2x = −10x
- d/dx (x) = 1
- d/dx (−2) = 0 (Constant Rule)
Now, combine the derivatives of each term using the Sum/Difference Rule:
p′(x) = 12x^2 − 10x + 1 − 0
p′(x) = 12x^2 − 10x + 1
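As a final check, assuming SymPy is available, you can differentiate the whole polynomial symbolically and compare it with the term-by-term result:

```python
# Symbolic differentiation of the full polynomial, assuming SymPy is installed,
# to confirm the term-by-term result p'(x) = 12x^2 - 10x + 1.
import sympy as sp

x = sp.symbols('x')
p = 4 * x**3 - 5 * x**2 + x - 2
print(sp.diff(p, x))  # 12*x**2 - 10*x + 1
```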
With these rules (Constant, Constant Multiple, Sum/Difference, and Power), you now have the tools to find the derivative of any polynomial function. These form the basic toolkit for differentiation, which we'll use extensively when we start looking at how to optimize functions, a core task in training machine learning models.