torch.distributed.tensor.parallel, PyTorch Developers, 2025 - Official documentation for PyTorch's native high-level API for tensor model parallelism, offering practical implementation guidance for applying this advanced technique within the framework.
torch.distributed package - Distributed communication, PyTorch Developers, 2024 - Comprehensive guide to the low-level communication primitives (e.g., all-reduce, gather, broadcast) in PyTorch's distributed package, which are the building blocks for implementing tensor model parallelism.
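Those primitives can be exercised directly; the sketch below runs them in a single-process group so it is self-contained (the rendezvous address/port are illustrative assumptions, and with one rank each collective is effectively an identity):

```python
import os
import torch
import torch.distributed as dist

# Single-process "gloo" group (illustrative rendezvous settings).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
dist.init_process_group("gloo", rank=0, world_size=1)

t = torch.tensor([1.0, 2.0, 3.0])

# all-reduce: every rank ends up with the elementwise sum across ranks.
dist.all_reduce(t, op=dist.ReduceOp.SUM)

# all-gather: collect each rank's tensor into a list, one slot per rank.
gathered = [torch.empty(3) for _ in range(dist.get_world_size())]
dist.all_gather(gathered, t)

# broadcast: replicate rank 0's tensor to all other ranks.
dist.broadcast(t, src=0)

print(t.tolist())
dist.destroy_process_group()
```

In a tensor-parallel layer these are exactly the calls that combine partial matmul results (all-reduce) or reassemble sharded activations (all-gather), which is why this reference pairs naturally with the entry above.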