While you can construct prompts by manually formatting strings with Python's f-strings, this approach becomes difficult to manage as application complexity grows. A more structured method is needed to handle multiple input variables, ensure reusability, and clearly separate the prompt's logic from the application code. LangChain provides PromptTemplate objects to solve this problem by formalizing the creation of prompts.
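For contrast, here is a minimal sketch of the manual approach. It works for a one-off call, but nothing checks that every placeholder was filled in; a forgotten variable only raises a NameError when the line actually runs:

input_language = "English"
output_language = "Spanish"
text = "Hello, how are you?"

# Manual f-string formatting: the prompt logic is tangled with application code,
# and there is no reusable, validated template object.
prompt = f"Translate the following text from {input_language} to {output_language}: {text}"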
A PromptTemplate is a reusable object for generating prompts. It holds a template string and the list of input variables that must be supplied when the prompt is formatted.
The simplest PromptTemplate takes a template string and a list of variables. The template string uses Python's f-string syntax for its placeholders.
Let's create a template for a simple language translator.
from langchain_core.prompts import PromptTemplate
template_string = "Translate the following text from {input_language} to {output_language}: {text}"
prompt_template = PromptTemplate(
    input_variables=["input_language", "output_language", "text"],
    template=template_string,
)
# You can also use the `from_template` class method, which infers
# the input variables from the template string automatically:
# prompt_template = PromptTemplate.from_template(template_string)
formatted_prompt = prompt_template.format(
    input_language="English",
    output_language="Spanish",
    text="Hello, how are you?",
)
print(formatted_prompt)
# Output: Translate the following text from English to Spanish: Hello, how are you?
This formatted string can then be passed directly to an LLM. The main benefit is clarity and safety: PromptTemplate raises an error if you try to format it without providing all of the input_variables, preventing incomplete prompts from being sent to the model.
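You can see this validation in action below. In recent versions of langchain-core, a missing variable raises a KeyError naming the absent placeholder, though the exact exception type may vary by version:

# Omitting `text` triggers an error instead of producing a broken prompt
try:
    prompt_template.format(input_language="English", output_language="Spanish")
except KeyError as exc:
    print(f"Missing variable: {exc}")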
Figure: The process of using a PromptTemplate to generate a final prompt for a model.
Chat models operate differently from standard LLMs. Instead of a single string, they expect a list of chat messages, each with a specific role (like system, human, or ai). To accommodate this, LangChain provides ChatPromptTemplate.
A ChatPromptTemplate is created from a list of message templates. This allows you to structure multi-turn conversations and assign distinct roles to different parts of the prompt.
Here is how you can create a ChatPromptTemplate for our translator.
from langchain_core.prompts import ChatPromptTemplate
chat_template = ChatPromptTemplate.from_messages(
    [
        ("system", "You are an expert translator. Translate the user's text from {input_language} to {output_language}."),
        ("human", "{text}"),
    ]
)
formatted_messages = chat_template.format_messages(
    input_language="English",
    output_language="Japanese",
    text="This is a test.",
)
print(formatted_messages)
# Output:
# [
#     SystemMessage(content="You are an expert translator. Translate the user's text from English to Japanese."),
#     HumanMessage(content='This is a test.')
# ]
The output is a list of SystemMessage and HumanMessage objects, which is the exact format a chat model like ChatOpenAI expects. This structured approach is significantly more organized and less error-prone than manually creating lists of message objects.
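These messages can be passed straight to a chat model's invoke method. A minimal sketch, assuming the langchain-openai package is installed, an OPENAI_API_KEY is set in your environment, and using "gpt-4o-mini" as an example model name:

from langchain_openai import ChatOpenAI

chat_model = ChatOpenAI(model="gpt-4o-mini")

# Send the formatted messages from above directly to the chat model
response = chat_model.invoke(formatted_messages)
print(response.content)  # The model's reply, here a Japanese translation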
Sometimes, you may want to create a more specialized template from a general one by pre-filling some of its variables. This is known as "partialing" the prompt. It is useful for creating reusable components or when you know some input values ahead of time.
For example, we can take our general chat_template and create a dedicated English-to-German translator.
# Continuing from the previous example
# Create a specialized template by partialing the original
german_translator_template = chat_template.partial(
    input_language="English",
    output_language="German",
)
# Now, you only need to provide the remaining variable: `text`
german_messages = german_translator_template.format_messages(
    text="That is a wonderful idea."
)
print(german_messages)
# Output:
# [
#     SystemMessage(content="You are an expert translator. Translate the user's text from English to German."),
#     HumanMessage(content='That is a wonderful idea.')
# ]
By using partial, we've created a new, more specific ChatPromptTemplate that only requires the text variable. This technique simplifies your code by reducing redundancy and allowing you to compose prompt templates in a modular way.
Using PromptTemplate and ChatPromptTemplate is fundamental to building scalable LangChain applications. They provide the structure needed to manage prompts effectively, making your code cleaner, more reusable, and easier to debug. As we move forward, you will see that these templates are the starting point for nearly every chain and agent you build.
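As a preview, templates compose directly with models and output parsers using the pipe operator. A minimal sketch, under the same assumptions as before (langchain-openai installed, API key set, "gpt-4o-mini" as an example model):

from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

# Template -> model -> plain string, composed with the pipe operator
chain = chat_template | ChatOpenAI(model="gpt-4o-mini") | StrOutputParser()

result = chain.invoke({
    "input_language": "English",
    "output_language": "French",
    "text": "Good morning!",
})
print(result)  # The French translation as a plain string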