Understanding LLM Model Sizes and Hardware Requirements
Chapter 1: Introduction to Large Language Models and Size
  What is a Large Language Model (LLM)?
  Understanding Model Parameters
  How Model Size is Measured
  Examples of Different Model Sizes
  Quiz for Chapter 1
Chapter 2: Essential Hardware Components for AI
  The Central Processing Unit (CPU)
  Random Access Memory (RAM)
  The Graphics Processing Unit (GPU)
  Video RAM (VRAM)
  Brief Overview of TPUs
  Quiz for Chapter 2
Chapter 3: Connecting Model Size to Hardware Needs
  Model Parameters and Memory Consumption
  Data Types and Precision (FP16, INT8)
  Introduction to Quantization
  Compute Requirements (FLOPS)
  Memory Bandwidth Importance
  Quiz for Chapter 3
Chapter 4: Running LLMs: Inference vs. Training
  What is Model Inference?
  Hardware Needs for Inference
  What is Model Training?
  Hardware Needs for Training
  Focus on Inference Requirements
  Quiz for Chapter 4
Chapter 5: Estimating Hardware Needs
  Rule of Thumb: Parameters to VRAM
  Accounting for Activation Memory
  Factors Influencing Actual Usage
  Checking Hardware Specifications
  Practice: Simple VRAM Estimations
  Quiz for Chapter 5
Memory Bandwidth Importance
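The body of this section did not survive extraction. As a minimal illustration of the topic, the sketch below estimates why memory bandwidth caps LLM decode throughput: generating each token requires streaming all model weights from VRAM, so tokens per second cannot exceed bandwidth divided by model size in bytes. All numbers are illustrative assumptions, not measurements.

```python
# Back-of-envelope: decode-stage LLM inference is typically
# memory-bandwidth bound, because every generated token needs a full
# pass over the model weights stored in VRAM.

def max_tokens_per_second(params_billion: float,
                          bytes_per_param: float,
                          bandwidth_gb_s: float) -> float:
    """Upper bound on decode throughput: bandwidth / weight bytes per token."""
    model_gb = params_billion * bytes_per_param  # GB of weights streamed per token
    return bandwidth_gb_s / model_gb

# Example: a 7B-parameter model in FP16 (2 bytes/param) on a GPU with
# an assumed ~1000 GB/s of memory bandwidth.
print(round(max_tokens_per_second(7, 2.0, 1000), 1))  # ceiling of ~71.4 tokens/s
```

Note that this is a ceiling, not a prediction: real throughput is lower due to KV-cache traffic, activation memory, and kernel overheads, but the same reasoning explains why quantization (fewer bytes per parameter) raises the bound.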
© 2025 ApX Machine Learning