Having successfully installed PyTorch and set up your environment, let's turn our attention to the core data structure you'll be working with: the Tensor. If you have experience with NumPy, you'll find PyTorch Tensors quite familiar. At their heart, tensors are multi-dimensional arrays, very similar to NumPy's ndarray.
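To make the NumPy comparison concrete, here is a quick sketch of moving data between the two libraries. The exact values are illustrative; the calls (`torch.from_numpy`, `Tensor.numpy`) are standard PyTorch API.

```python
import numpy as np
import torch

# A NumPy array and the tensor created from it
arr = np.array([[1.0, 2.0], [3.0, 4.0]])
t = torch.from_numpy(arr)   # shares memory with arr, no copy
back = t.numpy()            # converts back to a NumPy array

print(t.dtype)              # torch.float64, inherited from the array
```

Note that `torch.from_numpy` shares the underlying memory: modifying the array in place also changes the tensor.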
Think of tensors as generalizations of familiar mathematical objects:

- A scalar is a 0-dimensional tensor (or rank-0 tensor).
- A vector (e.g., [1, 2, 3]) is a 1-dimensional tensor (or rank-1 tensor).
- A matrix (e.g., [[1, 2], [3, 4]]) is a 2-dimensional tensor (or rank-2 tensor).

A conceptual view of tensors as generalizations of scalars, vectors, and matrices, increasing in dimensionality.
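The ranks above can be checked directly in code. The specific values are illustrative; `dim()` and `shape` are standard tensor attributes.

```python
import torch

# A scalar (rank-0), a vector (rank-1), and a matrix (rank-2)
scalar = torch.tensor(3.14)
vector = torch.tensor([1, 2, 3])
matrix = torch.tensor([[1, 2], [3, 4]])

print(scalar.dim(), scalar.shape)  # 0 torch.Size([])
print(vector.dim(), vector.shape)  # 1 torch.Size([3])
print(matrix.dim(), matrix.shape)  # 2 torch.Size([2, 2])
```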
In the context of deep learning, tensors are used to represent virtually everything: input data (such as images, text, and audio), the model's parameters (weights and biases), and the outputs and gradients computed during training.
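As a sketch of what this looks like in practice, consider the shapes tensors commonly take. The specific sizes here (a batch of 32 images, a 784-to-128 layer) are illustrative assumptions, not anything prescribed by PyTorch.

```python
import torch

# A mini-batch of 32 RGB images, each 224x224 pixels
images = torch.zeros(32, 3, 224, 224)

# The weight matrix of a linear layer mapping 784 inputs to 128 outputs
weights = torch.randn(128, 784)

print(images.shape)   # torch.Size([32, 3, 224, 224])
print(weights.shape)  # torch.Size([128, 784])
```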
What makes PyTorch Tensors particularly suited for deep learning, compared to standard Python lists or even NumPy arrays? Two features stand out. First, tensors can be moved onto a GPU, which greatly accelerates the large numerical computations involved in training. Second, tensors integrate with PyTorch's Autograd system (which we'll cover in Chapter 3). This mechanism automatically calculates gradients, which are fundamental for training neural networks via backpropagation. While the concept is similar to NumPy arrays, these two features are what make PyTorch Tensors the foundation for building and training models efficiently. In the following sections, we will explore how to create and manipulate these essential data structures.
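Both features can be previewed in a few lines. The GPU line falls back to the CPU when no CUDA device is available; the Autograd details are covered properly in Chapter 3, so treat this as a teaser.

```python
import torch

# Feature 1: tensors can live on a GPU when one is available
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.ones(3, device=device)

# Feature 2: Autograd tracks operations to compute gradients
w = torch.tensor(2.0, requires_grad=True)
y = w ** 2 + 3 * w   # y = w^2 + 3w
y.backward()         # computes dy/dw = 2w + 3
print(w.grad)        # tensor(7.)
```

At w = 2, the derivative 2w + 3 evaluates to 7, which is exactly what `w.grad` holds after `backward()`.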
© 2025 ApX Machine Learning