Prerequisites: No prior knowledge
Level: Beginner
LLM Fundamentals
Define what a Large Language Model (LLM) is and describe its basic function.
Model Parameters
Explain what model parameters are and how they relate to LLM size.
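The link between parameter count and model size can be made concrete with a back-of-the-envelope calculation: total parameters times bytes per parameter. This is a minimal sketch assuming each parameter is stored as a 16-bit float (2 bytes), a common inference precision; real models may use fp32, int8, or 4-bit quantized formats, and `model_size_gb` is a hypothetical helper written for illustration.

```python
def model_size_gb(num_parameters: int, bytes_per_param: float = 2.0) -> float:
    """Approximate memory footprint of a model's weights in gigabytes.

    Assumes fp16 storage (2 bytes per parameter) by default.
    """
    return num_parameters * bytes_per_param / 1e9

# A "7B" model at fp16 occupies roughly 14 GB of weights.
print(model_size_gb(7_000_000_000))
```

The same model quantized to 4 bits (0.5 bytes per parameter) would shrink to roughly 3.5 GB, which is why quantization matters for consumer hardware.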
Hardware Components
Identify the main hardware components (CPU, GPU, RAM, VRAM) involved in running LLMs.
Size-Hardware Relationship
Describe the connection between the number of parameters in an LLM and the required hardware resources.
Inference vs. Training Needs
Distinguish between the hardware demands for running LLM inference versus training (at a high level).
Basic Requirement Estimation
Perform simple estimates of memory (VRAM) requirements based on model size.
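A simple VRAM estimate is the weight footprint plus headroom for activations and the KV cache. This is a rough rule-of-thumb sketch, not a precise formula: it assumes fp16 weights (2 bytes per parameter) and a ~20% overhead factor, both of which vary in practice with context length, batch size, and runtime.

```python
def estimate_vram_gb(params_billions: float,
                     bytes_per_param: float = 2.0,
                     overhead: float = 1.2) -> float:
    """Rule-of-thumb VRAM estimate: weight size times an overhead factor.

    The 1.2x overhead is a rough allowance for activations and KV cache;
    it is an assumption, not an exact figure.
    """
    return params_billions * bytes_per_param * overhead

for b in (7, 13, 70):
    print(f"{b}B params: ~{estimate_vram_gb(b):.1f} GB VRAM at fp16")
```

By this estimate, a 7B model at fp16 wants roughly 17 GB of VRAM, which explains why such models are commonly quantized to fit 8 GB or 12 GB consumer GPUs.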