In the previous section, we saw that the dot product gives us a way to understand the angle between two vectors. A special and very useful case arises when this angle is exactly 90 degrees. This property is fundamental to many areas of machine learning and data analysis.
Two non-zero vectors are considered orthogonal if they are perpendicular to each other. In linear algebra, the test for orthogonality is simple and direct: their dot product is zero.
v ⋅ w = 0

If you calculate the dot product of two vectors and the result is 0, you know they are orthogonal. Geometrically, this means they point in directions that are at a right angle to one another. You can think of them as forming a perfect 'L' shape when placed tail-to-tail.
For example, the vectors v = (2, 3) and w = (−3, 2) are orthogonal because their dot product is (2 × −3) + (3 × 2) = −6 + 6 = 0.
This relationship is a direct consequence of the geometric definition of the dot product: v⋅w=∥v∥∥w∥cos(θ). If both vectors have a non-zero length (∥v∥>0 and ∥w∥>0), the only way for their dot product to be zero is if cos(θ)=0. The cosine function is zero when the angle θ is 90°, confirming that the vectors are perpendicular.
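We can confirm this numerically by recovering the angle itself. Rearranging the geometric formula gives θ = arccos(v⋅w / (∥v∥∥w∥)), which is easy to evaluate with NumPy:

```python
import numpy as np

v = np.array([2.0, 3.0])
w = np.array([-3.0, 2.0])

# theta = arccos( (v . w) / (|v| |w|) )
cos_theta = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
theta_degrees = np.degrees(np.arccos(cos_theta))
print(theta_degrees)  # 90.0
```

Since the dot product in the numerator is 0, the cosine is 0 and the recovered angle is exactly 90 degrees.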
In machine learning, orthogonality is a highly desirable property. When feature vectors are orthogonal, it implies that they are linearly independent. In simple terms, they represent distinct, non-redundant information. For example, if we were analyzing housing data, a vector representing "square footage" and a vector representing "number of bedrooms" are related. They are not orthogonal. However, a technique like Principal Component Analysis (PCA) works by finding a new set of orthogonal vectors that represent the most significant information in the data, which can simplify a model and improve its performance.
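As a rough illustration of that idea (a sketch, not a full PCA implementation), the principal directions can be computed as eigenvectors of the data's covariance matrix, and for a symmetric matrix NumPy's eigendecomposition returns mutually orthogonal eigenvectors. The synthetic housing data below is invented purely for this example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated features: bedroom count loosely tracks square footage
sqft = rng.normal(1500, 300, size=200)
beds = 0.002 * sqft + rng.normal(0, 0.5, size=200)
X = np.column_stack([sqft, beds])
X = X - X.mean(axis=0)  # center the data

# Eigenvectors of the covariance matrix give the principal directions
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

pc1, pc2 = eigvecs[:, 1], eigvecs[:, 0]
print(np.dot(pc1, pc2))  # ~0: the principal directions are orthogonal
```

Even though the original features are correlated, the new directions found this way carry non-redundant information, which is exactly what makes orthogonality valuable.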
We can easily check for orthogonality in Python using NumPy's dot() function. Let's define the two vectors from the example above and calculate their dot product.
import numpy as np
# Define two orthogonal vectors
v = np.array([2, 3])
w = np.array([-3, 2])
# Calculate the dot product
dot_product = np.dot(v, w)
print(f"Vector v: {v}")
print(f"Vector w: {w}")
print(f"Dot product of v and w: {dot_product}")
Output:
Vector v: [2 3]
Vector w: [-3 2]
Dot product of v and w: 0
As expected, the result is 0, confirming that v and w are orthogonal.
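NumPy offers several equivalent ways to write this computation. For 1-D arrays, the @ operator and the array's dot() method compute the same inner product as np.dot():

```python
import numpy as np

v = np.array([2, 3])
w = np.array([-3, 2])

print(np.dot(v, w))  # 0
print(v @ w)         # 0: for 1-D arrays, @ is the inner product
print(v.dot(w))      # 0
```

All three forms are interchangeable here; @ is common in modern NumPy code because it reads naturally in longer expressions.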
Now let's try with two vectors that are not orthogonal.
# Define two non-orthogonal vectors
p = np.array([1, 5])
q = np.array([2, 4])
# Calculate the dot product
dot_product_pq = np.dot(p, q)
print(f"Vector p: {p}")
print(f"Vector q: {q}")
print(f"Dot product of p and q: {dot_product_pq}")
Output:
Vector p: [1 5]
Vector q: [2 4]
Dot product of p and q: 22
The result is 22, a non-zero value, which tells us these vectors are not perpendicular.
When working with computers, calculations involving floating-point numbers (numbers with decimals) can sometimes have tiny precision errors. You might encounter a situation where two vectors should be orthogonal, but their dot product results in a very small number like 1.4e-16 instead of exactly 0.
This is normal: most decimal fractions cannot be represented exactly in binary floating point, so small rounding errors accumulate. For this reason, it is bad practice to check for orthogonality by testing whether the result is exactly zero.
# Two vectors that are mathematically orthogonal:
# (0.1 * 3) + (0.3 * -1) = 0.3 - 0.3 = 0
a = np.array([0.1, 0.3])
b = np.array([3.0, -1.0])
dot_product_ab = np.dot(a, b)
print(f"Dot product: {dot_product_ab}")
# This check might fail!
is_orthogonal = (dot_product_ab == 0)
print(f"Are they orthogonal? (Direct check): {is_orthogonal}")
Output (the exact tiny value may vary slightly by platform):
Dot product: 5.551115123125783e-17
Are they orthogonal? (Direct check): False
The correct approach is to check if the dot product is close to zero within a small tolerance. NumPy provides the np.isclose() function for this purpose.
# The correct way to check for orthogonality
is_orthogonal_close = np.isclose(dot_product_ab, 0)
print(f"Are they orthogonal? (Using np.isclose): {is_orthogonal_close}")
Output:
Are they orthogonal? (Using np.isclose): True
Using np.isclose() makes your code more reliable when dealing with the limitations of floating-point arithmetic. This habit will save you from unexpected behavior in your machine learning projects.
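One way to make this habit routine is to wrap the check in a small helper function (the name and tolerance below are our own choices, not a NumPy API):

```python
import numpy as np

def is_orthogonal(u, v, atol=1e-8):
    """Return True if u and v are orthogonal within a small absolute tolerance."""
    return bool(np.isclose(np.dot(u, v), 0.0, atol=atol))

print(is_orthogonal(np.array([2, 3]), np.array([-3, 2])))      # True
print(is_orthogonal(np.array([1, 5]), np.array([2, 4])))       # False
print(is_orthogonal(np.array([0.1, 0.3]), np.array([3, -1])))  # True, despite rounding
```

The last call returns True even though the floating-point dot product of (0.1, 0.3) and (3, −1) is a tiny nonzero number rather than exactly 0.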