The foundational architecture of the Model Context Protocol defines how components communicate, but the value of that communication lies in the data being exchanged. With the transport layer and connection topology established, the focus shifts to the content payloads that serve as context for the Large Language Model (LLM). This chapter examines the two primary primitives used to expose data and interaction patterns: Resources and Prompts.
Resources function as the standard interface for reading data from a server. They allow you to expose files, database records, or API responses as addressable content. You will define Uniform Resource Identifiers (URIs) to locate these items and implement the logic required to retrieve them. The implementation details cover both synchronous reads, where data is fetched immediately, and the subscription model, which enables the server to push notifications when the underlying data source changes.
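As a preview of the synchronous read flow described above, the sketch below models the shape of an MCP `resources/read` exchange using plain dictionaries. The `RESOURCES` store, the `file:///project/notes.txt` URI, and the `handle_resources_read` helper are illustrative assumptions, not part of any SDK; the request and result fields follow the JSON-RPC shapes used by the protocol.

```python
import json

# Hypothetical in-memory "server" state: URIs mapped to their content.
# A real server would resolve these against files, databases, or APIs.
RESOURCES = {
    "file:///project/notes.txt": {
        "mimeType": "text/plain",
        "text": "Meeting notes for the demo project.",
    },
}

def handle_resources_read(request: dict) -> dict:
    """Resolve a resources/read request against the in-memory store."""
    uri = request["params"]["uri"]
    entry = RESOURCES[uri]
    return {
        "jsonrpc": "2.0",
        "id": request["id"],
        "result": {
            "contents": [
                {"uri": uri, "mimeType": entry["mimeType"], "text": entry["text"]}
            ]
        },
    }

# A client asks for the resource by URI; the server answers immediately.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "resources/read",
    "params": {"uri": "file:///project/notes.txt"},
}
response = handle_resources_read(request)
print(json.dumps(response, indent=2))
```

The subscription model covered later in the chapter builds on the same addressing: a client subscribes to a URI, and the server sends a notification whenever the content behind that URI changes.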
Prompts offer a method to standardize user interactions. Instead of relying on ad-hoc instructions, you can define structured templates that guide the LLM's behavior. This section demonstrates how to construct Prompt schemas that specify required arguments and embed the necessary context automatically.
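To make the idea of a structured template concrete, here is a minimal sketch of a prompt definition with a declared required argument and the expansion step that turns supplied arguments into chat messages. The `summarize_file` prompt, its `path` argument, and the `get_prompt` helper are hypothetical names chosen for illustration.

```python
# Hypothetical prompt definition: a name, a description, and the
# arguments a client must supply before the template can be expanded.
PROMPT_DEF = {
    "name": "summarize_file",
    "description": "Summarize the contents of a file",
    "arguments": [
        {"name": "path", "description": "File to summarize", "required": True}
    ],
}

def get_prompt(name: str, arguments: dict) -> dict:
    """Validate arguments and expand the template into messages."""
    if name != PROMPT_DEF["name"]:
        raise ValueError(f"unknown prompt: {name}")
    missing = [a["name"] for a in PROMPT_DEF["arguments"]
               if a["required"] and a["name"] not in arguments]
    if missing:
        raise ValueError(f"missing required arguments: {missing}")
    return {
        "description": PROMPT_DEF["description"],
        "messages": [
            {
                "role": "user",
                "content": {
                    "type": "text",
                    "text": f"Summarize the file at {arguments['path']} "
                            "in three bullet points.",
                },
            }
        ],
    }

result = get_prompt("summarize_file", {"path": "/project/notes.txt"})
print(result["messages"][0]["content"]["text"])
```

Because the arguments are declared up front, a client can present the prompt as a form and reject incomplete invocations before anything reaches the model.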
By the end of this chapter, you will move from theoretical definitions to working code. You will construct a file reader server that implements these primitives, enabling an MCP client to access and read content from your local directory structure.
2.1 Concept of Resources in MCP
2.2 URI Schemes and Patterns
2.3 Implementing Synchronous Resource Reads
2.4 Resource Subscriptions and Notifications
2.5 Structuring Prompts
2.6 Hands-on Practice: Building a File Reader Server
© 2026 ApX Machine Learning