As we saw earlier, interacting with Large Language Models often involves sending them carefully constructed text known as prompts. While you can directly write these prompts as strings in your Python code, this approach quickly becomes cumbersome and difficult to manage, especially as applications grow in complexity. Imagine needing the same basic instruction format but with different inputs each time. Hardcoding these variations leads to repetition and makes updates challenging.
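For instance, a purely hardcoded approach quickly turns into copy-and-paste (the prompts below are just an illustration of the problem):
# Each prompt repeats the same instruction with only the details changed
prompt_a = "Summarize the main features of the product 'Quantum Laptop X'. Provide a concise summary."
prompt_b = "Summarize the main features of the product 'Nebula Phone 12'. Provide a concise summary."
# Any change to the instruction wording now has to be made in every copy.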
LangChain provides a clean solution to this problem through Prompt Templates. A prompt template is essentially a pre-defined recipe for generating prompts. It contains a text string with placeholders for dynamic values that will be filled in later. This allows you to separate the fixed structure of your prompt from the variable parts, promoting reusability and maintainability.
The core class for handling text-based prompts in LangChain is PromptTemplate. Let's see how to use it.
First, you need to import the class:
from langchain.prompts import PromptTemplate
Next, define your template string. Use curly braces {} to indicate placeholders for variables that will be inserted later. Let's create a template to generate a product description summary based on a product name and its features:
template_string = """
Summarize the main features of the product '{product_name}'.
Focus on these aspects: {features}.
Provide a concise summary suitable for an e-commerce listing.
"""
Now, create an instance of PromptTemplate. You need to provide two main arguments:

- input_variables: A list of strings specifying the names of the variables used in the template string (without the curly braces).
- template: The template string itself.

prompt_template = PromptTemplate(
    input_variables=["product_name", "features"],
    template=template_string
)
print(prompt_template)
This will output information about the template object, confirming the input variables it expects and showing the template structure.
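If you prefer not to list the variables yourself, LangChain also offers the PromptTemplate.from_template() convenience constructor, which infers the input variables from the placeholders in the string (a minimal sketch; the exact behavior may vary slightly between versions):
# Alternative construction: infer input_variables from the placeholders
prompt_template = PromptTemplate.from_template(template_string)
print(prompt_template.input_variables)  # e.g. ['features', 'product_name'] (order may differ by version)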
Once you have a PromptTemplate object, you can generate a complete prompt string by providing values for the input variables. This is done using the .format() method, passing the values as keyword arguments:
# Define the specific values for this instance
product = "Quantum Laptop X"
product_features = "16-inch OLED display, 12-core CPU, 32GB RAM, 1TB SSD, 20-hour battery life"
# Format the prompt
formatted_prompt = prompt_template.format(product_name=product, features=product_features)
print(formatted_prompt)
The output will be the fully formed prompt string, ready to be sent to an LLM:
Summarize the main features of the product 'Quantum Laptop X'.
Focus on these aspects: 16-inch OLED display, 12-core CPU, 32GB RAM, 1TB SSD, 20-hour battery life.
Provide a concise summary suitable for an e-commerce listing.
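Because the template is separate from the data, the same object can be reused for any number of products; only the arguments to .format() change (the second product below is purely illustrative):
# Reuse the same template with different inputs
another_prompt = prompt_template.format(
    product_name="Aurora Tablet S",
    features="11-inch display, stylus support, 10-hour battery life"
)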
Using PromptTemplate offers several advantages over manually constructing prompt strings: the fixed instruction structure is defined once and reused wherever it is needed, the variable inputs stay cleanly separated from that structure, and any change to the wording only has to be made in a single place.
While we've focused on creating and formatting the template, the ultimate goal is to use this formatted prompt with an LLM. As introduced previously, LangChain provides interfaces for various language models (like LLM or ChatModel wrappers). A PromptTemplate object integrates smoothly into these workflows.
Here's a simplified example showing how a formatted prompt would typically be used (assuming you have an LLM object named llm already configured, as discussed in the "Core Concepts" section):
# Assume 'llm' is an initialized LangChain LLM interface
# from langchain_openai import OpenAI # Example import
# llm = OpenAI(api_key="YOUR_API_KEY") # Example initialization
# 1. Define the template
template_string = "Translate the following English text to French: {english_text}"
prompt_template = PromptTemplate(
    input_variables=["english_text"],
    template=template_string
)
# 2. Format the prompt with specific input
formatted_prompt = prompt_template.format(english_text="Hello, how are you?")
# 3. Send the formatted prompt to the LLM
# response = llm.invoke(formatted_prompt) # Using invoke for newer LangChain versions
# print(response)
# Expected output might be: "Bonjour, comment ça va ?"
This structure, where a template formats input that is then passed to a model, is fundamental to building more complex sequences, known as Chains, which we will explore in the next chapter.
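As a very brief preview, recent LangChain versions let you compose a template and a model directly with the | operator, so the formatting step is handled for you (a minimal sketch, assuming the llm object from the example above):
# Compose the template and the model into a simple chain (requires a configured llm)
# chain = prompt_template | llm
# response = chain.invoke({"english_text": "Hello, how are you?"})
# print(response)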
Creating good prompt templates is part art, part science. A few general guidelines: state the task explicitly, tell the model what format or length you expect, give your variables descriptive names, and test the template with a variety of inputs.
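For example, a template that follows these guidelines spells out both the task and the expected output format (the wording here is just one possible illustration):
review_template = PromptTemplate(
    input_variables=["review_text"],
    template=(
        "Classify the sentiment of the following customer review as "
        "Positive, Negative, or Neutral. Answer with a single word.\n\n"
        "Review: {review_text}"
    )
)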
Mastering prompt templates is a foundational step in effectively using LangChain. They provide the structure needed to reliably guide LLMs and are a building block for creating sophisticated applications.