The Tensor is the core data structure for working with PyTorch. If you have experience with NumPy, you will find PyTorch Tensors quite familiar. Tensors are multi-dimensional arrays, very similar to NumPy's ndarray.
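To make the comparison with NumPy concrete, here is a minimal sketch of creating tensors from a Python list and from an existing NumPy array (the variable names are illustrative):

```python
import numpy as np
import torch

# Build a tensor directly from a nested Python list.
t = torch.tensor([[1.0, 2.0], [3.0, 4.0]])

# Build a tensor from a NumPy array; torch.from_numpy shares the
# underlying memory with the array rather than copying it.
arr = np.array([1.0, 2.0, 3.0])
t2 = torch.from_numpy(arr)

print(t.shape)   # torch.Size([2, 2])
print(t2.dtype)  # torch.float64, inherited from the NumPy array
```

Note that `torch.tensor` copies its input, while `torch.from_numpy` shares memory: modifying `arr` in place would also change `t2`.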
Think of tensors as generalizations of familiar mathematical objects:
- A scalar (e.g., 5) is a 0-dimensional tensor (or rank-0 tensor).
- A vector (e.g., [1, 2, 3]) is a 1-dimensional tensor (or rank-1 tensor).
- A matrix (e.g., [[1, 2], [3, 4]]) is a 2-dimensional tensor (or rank-2 tensor).

Figure: a view of tensors as generalizations of scalars, vectors, and matrices, increasing in dimensionality.
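The ranks above can be checked directly in code; a tensor's `ndim` attribute gives its rank and `shape` gives the size along each dimension:

```python
import torch

scalar = torch.tensor(5)                  # rank-0 tensor
vector = torch.tensor([1, 2, 3])          # rank-1 tensor
matrix = torch.tensor([[1, 2], [3, 4]])   # rank-2 tensor

print(scalar.ndim, vector.ndim, matrix.ndim)  # 0 1 2
print(matrix.shape)                           # torch.Size([2, 2])
```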
In the context of deep learning, tensors are used to represent virtually everything: input data (such as images, text sequences, and audio signals), the model's learnable parameters (weights and biases), and the gradients computed during training.
What makes PyTorch Tensors particularly suited for deep learning, compared to standard Python lists or even NumPy arrays?
Two capabilities set them apart. First, PyTorch Tensors can be moved to GPUs, accelerating the large matrix operations at the heart of deep learning. Second, they integrate with PyTorch's Autograd system (which we'll cover in Chapter 3). This mechanism automatically calculates gradients, which are fundamental for training neural networks via backpropagation. While the concept is similar to NumPy arrays, these two features are what make PyTorch Tensors the foundation for building and training models efficiently. In the following sections, we will explore how to create and manipulate these essential data structures.
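As a brief preview of both features (Autograd is covered properly in Chapter 3), the sketch below records a computation on a tensor with `requires_grad=True`, calls `backward()` to obtain the gradient, and shows the standard pattern for moving a tensor to a GPU when one is available:

```python
import torch

# Autograd preview: mark a tensor as requiring gradients, run a
# computation, then call backward() to populate .grad.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 2 + 3 * x        # y = x^2 + 3x
y.backward()              # dy/dx = 2x + 3, which is 7 at x = 2
print(x.grad)             # tensor(7.)

# GPU preview: move a tensor to the GPU if one is present,
# otherwise keep it on the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(3, 3).to(device)
```

NumPy offers neither of these: gradients and device placement are what the rest of this book builds on.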
© 2026 ApX Machine Learning