Learn the fundamentals of Large Language Model (LLM) sizes, measured in parameters, and how they relate to hardware components such as GPUs, CPUs, and memory (RAM/VRAM). This course explains the hardware needed to run and interact with LLMs of different sizes.
Prerequisites: No prior knowledge required
Level: Beginner
LLM Fundamentals
Define what a Large Language Model (LLM) is and its basic function.
Model Parameters
Explain what model parameters are and how they relate to LLM size.
Hardware Components
Identify the main hardware components (CPU, GPU, RAM, VRAM) involved in running LLMs.
Size-Hardware Relationship
Describe the connection between the number of parameters in an LLM and the required hardware resources.
Inference vs. Training Needs
Distinguish between the hardware demands for running LLM inference versus training (at a high level).
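As a rough illustration of why training is so much more demanding than inference, the sketch below compares per-parameter memory costs. The 16-bytes-per-parameter figure is a common rule of thumb for mixed-precision training with the Adam optimizer; the exact numbers vary by optimizer, precision, and framework, and activations add further overhead on top.

```python
def inference_mem_gb(params_billions, bytes_per_param=2):
    # Inference needs roughly the model weights in memory:
    # 2 bytes per parameter for fp16/bf16 weights.
    return params_billions * bytes_per_param

def training_mem_gb(params_billions):
    # Rule-of-thumb for mixed-precision Adam training:
    # fp16 weights (2) + fp16 gradients (2) + fp32 master
    # weights, momentum, and variance (4 + 4 + 4)
    # = ~16 bytes per parameter, before activation memory.
    return params_billions * 16

# A 7-billion-parameter model:
print(inference_mem_gb(7))  # 14 GB just to hold the weights
print(training_mem_gb(7))   # 112 GB of optimizer/gradient state
```

The roughly 8x gap is why a model that runs comfortably on a single consumer GPU for inference may need a multi-GPU server to train or fine-tune.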
Basic Requirement Estimation
Perform simple estimations for memory (VRAM) based on model size.
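The kind of estimate this objective covers can be sketched in a few lines. This is an illustrative back-of-the-envelope helper, not an exact formula: the byte counts correspond to common numeric precisions, and the overhead factor is an assumed ~20% allowance for activations and the KV cache.

```python
def estimate_vram_gb(params_billions, bytes_per_param=2.0, overhead=1.2):
    """Rough VRAM estimate (in GB) for running LLM inference.

    bytes_per_param: 4 for fp32, 2 for fp16/bf16,
                     1 for 8-bit, 0.5 for 4-bit quantization.
    overhead: multiplier for activations and KV cache (assumed ~20%).
    """
    return params_billions * bytes_per_param * overhead

# A 7B-parameter model in fp16 needs roughly:
print(round(estimate_vram_gb(7), 1))        # ~16.8 GB
# The same model quantized to 4-bit:
print(round(estimate_vram_gb(7, 0.5), 1))   # ~4.2 GB
```

This is why quantization matters in practice: it can bring a model that exceeds a consumer GPU's VRAM down to a size that fits.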
© 2025 ApX Machine Learning