In the previous chapters, we saw how vectors represent data points and how matrices transform them. Now, let's look at how we can combine vectors to create new ones. This process, fundamental to understanding vector spaces, is called forming linear combinations.
A linear combination of a set of vectors v1,v2,...,vk is any vector w that can be expressed in the form:
w = c1*v1 + c2*v2 + ... + ck*vk

where c1, c2, ..., ck are scalar constants (real numbers). Essentially, you take your set of vectors, scale each one by some amount (including potentially zero or negative amounts), and then add the scaled vectors together. The result is a linear combination.
Think about it geometrically in 2D. If you have a single vector v1, its linear combinations are just scalar multiples of v1, forming a line passing through the origin in the direction of v1.
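We can make this concrete with a small NumPy sketch (the vector v1 = [2, 1] and the particular scalars here are just illustrative choices): every scalar multiple of a single vector is collinear with it, which we can check by stacking the two vectors as columns and confirming the determinant is zero.

```python
import numpy as np

v1 = np.array([2, 1])
scalars = (-1.5, 0.0, 0.5, 2.0)
multiples = [c * v1 for c in scalars]

for c, m in zip(scalars, multiples):
    print(f"{c:>5} * v1 = {m}")

# Each multiple lies on the same line through the origin as v1:
# the 2x2 matrix with columns v1 and c*v1 has zero determinant.
for m in multiples:
    assert np.isclose(np.linalg.det(np.column_stack([v1, m])), 0.0)
```

Geometrically, no matter which scalar you pick, you never leave the line through the origin in the direction of v1.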
If you have two vectors, say v1 = [1, 0] and v2 = [0, 1] in R2, what vectors can you create? A linear combination looks like w = c1*[1, 0] + c2*[0, 1] = [c1, c2]. By choosing different scalars c1 and c2, you can reach any point (vector) in the 2D plane.
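This special case is easy to verify numerically. In the sketch below (the target point is an arbitrary choice for illustration), the scalars for the standard basis vectors are exactly the coordinates of the target:

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

target = np.array([3.5, -2.0])  # an arbitrary point in the plane
c1, c2 = target                 # for the standard basis, scalars = coordinates
w = c1 * e1 + c2 * e2

print(w)
assert np.array_equal(w, target)
```

Because e1 and e2 each control one coordinate independently, reading off the scalars is trivial here; for other vector pairs it takes more work, as we will see shortly.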
This leads us to the concept of span. The span of a set of vectors {v1,v2,...,vk} is the set of all possible linear combinations of these vectors. It represents the entire region or space that can be "reached" or generated by combining these vectors.
Let's visualize the span of two non-collinear vectors in 2D. Consider v1 = [2, 1] and v2 = [-1, 3]. Their span is the entire R2. Any vector in R2, like w = [4, 5], can be written as a linear combination c1*v1 + c2*v2.
Two non-collinear vectors v1 and v2. Any vector w in the plane (like the dotted green vector) can be formed by scaling and adding v1 and v2. The dashed lines represent the infinite lines passing through the origin along v1 and v2. The span fills the entire 2D plane.
In machine learning, features often form vectors. A dataset can be seen as a collection of these feature vectors. The concept of span helps us understand the "reach" of our features.
NumPy makes calculating linear combinations straightforward.
import numpy as np
# Define vectors
v1 = np.array([2, 1])
v2 = np.array([-1, 3])
# Define scalars
c1 = 17 / 7  # Approximately 2.4286
c2 = 6 / 7   # Approximately 0.8571
# Calculate the linear combination w = c1*v1 + c2*v2
w = c1 * v1 + c2 * v2
print(f"Vector v1: {v1}")
print(f"Vector v2: {v2}")
print(f"Scalars c1={c1:.4f}, c2={c2:.4f}")
print(f"Linear Combination w: {w}")
# The result should be close to [4, 5] due to floating point representation
# Let's check if w is indeed [4, 5]
target_w = np.array([4, 5])
print(f"Is w approximately equal to [4, 5]? {np.allclose(w, target_w)}")
This code snippet demonstrates creating the vector w = [4, 5] as a linear combination of v1 = [2, 1] and v2 = [-1, 3]. We found the specific scalars c1 = 17/7 ≈ 2.43 and c2 = 6/7 ≈ 0.86 that achieve this. (Finding these scalars involves solving a system of linear equations, a topic covered in Chapter 3.)
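As a brief preview of that topic, NumPy can find such scalars for us. The sketch below stacks v1 and v2 as the columns of a matrix A, so the problem becomes the linear system A @ c = w, which np.linalg.solve handles directly:

```python
import numpy as np

v1 = np.array([2, 1])
v2 = np.array([-1, 3])
w = np.array([4, 5])

# With v1 and v2 as columns of A, the product A @ [c1, c2] equals c1*v1 + c2*v2
A = np.column_stack([v1, v2])
c = np.linalg.solve(A, w)

print(c)  # approximately [2.4286, 0.8571], i.e. 17/7 and 6/7
assert np.allclose(c[0] * v1 + c[1] * v2, w)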
Understanding linear combinations and span provides a geometric and algebraic framework for thinking about how vectors relate to each other and the spaces they generate. This foundation is important as we move towards analyzing the independence of vectors and the structure of feature spaces.
© 2025 ApX Machine Learning