Agents frequently encounter situations where information is not static but arrives as a continuous flow. News feeds, market data, sensor readings, or social media updates are all examples of dynamic information streams. Effectively prompting an agent to process, understand, and act upon such evolving data is a common challenge. This hands-on exercise demonstrates how to design prompts that help an agent manage its working memory and maintain context when dealing with a sequence of incoming information. We will focus on techniques that allow an agent to iteratively update its understanding, summarize new inputs, and sustain focus on its task.
Imagine we need an AI agent to monitor a stream of short text snippets (like simplified news headlines or social media posts) and maintain a running summary of events related to a specific topic, for instance, "advancements in local renewable energy projects." The agent needs to ingest each new piece of information, integrate it with what it has already learned, and update its summary concisely.
The core of this approach lies in structuring the prompt to carry the agent's current "state" or "memory" from one interaction to the next. The agent's output from processing one piece of information becomes part of the input for processing the next.
Let's design an initial prompt for our Event Tracker agent. This prompt will set the agent's role, its objective, and provide placeholders for its current understanding (the summary) and the new piece of information.
You are an AI assistant tasked with monitoring and summarizing news snippets about "advancements in local renewable energy projects."
Your goal is to maintain a concise, up-to-date summary of key developments.
Current Summary of Developments:
{current_summary}
New Information Snippet:
"{new_snippet}"
Your Task:
1. Analyze the "New Information Snippet."
2. If it's relevant to "advancements in local renewable energy projects" and provides new information not already covered or a significant update, integrate it into the "Current Summary of Developments."
3. If it's irrelevant, redundant, or only a very minor update, indicate that no significant change to the summary is needed.
4. Provide the updated summary. If no change, return the existing summary. The summary should remain brief, focusing only on major points.
Updated Summary:
Here, {current_summary} will initially be "None reported yet." or an empty string, and {new_snippet} will be the incoming piece of text. The agent's response, specifically the "Updated Summary" part, then becomes the {current_summary} for the next iteration.
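The placeholder substitution itself is mechanical. As a minimal sketch, using Python's str.format on an abbreviated version of the template above (the variable names are illustrative):

```python
# Abbreviated version of the prompt template with its two placeholders.
template = (
    "Current Summary of Developments:\n"
    "{current_summary}\n\n"
    "New Information Snippet:\n"
    '"{new_snippet}"'
)

# Fill in the agent's current state and the incoming text.
prompt = template.format(
    current_summary="None reported yet.",
    new_snippet=(
        "Oakville town council approves pilot program for "
        "solar panel installations on municipal buildings."
    ),
)
print(prompt)
```

On the next iteration, whatever the agent returned as its updated summary is passed in as current_summary instead of "None reported yet."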
Let's simulate a stream of information and see how the agent, guided by our prompt structure, might process it.
Iteration 1:
current_summary: "None reported yet."
new_snippet: "Oakville town council approves pilot program for solar panel installations on municipal buildings."

Prompt for Iteration 1:
You are an AI assistant tasked with monitoring and summarizing news snippets about "advancements in local renewable energy projects."
Your goal is to maintain a concise, up-to-date summary of key developments.
Current Summary of Developments:
None reported yet.
New Information Snippet:
"Oakville town council approves pilot program for solar panel installations on municipal buildings."
Your Task:
1. Analyze the "New Information Snippet."
2. If it's relevant to "advancements in local renewable energy projects" and provides new information not already covered or a significant update, integrate it into the "Current Summary of Developments."
3. If it's irrelevant, redundant, or only a very minor update, indicate that no significant change to the summary is needed.
4. Provide the updated summary. If no change, return the existing summary. The summary should remain brief, focusing only on major points.
Updated Summary:
Agent's Potential Response (content for the next current_summary):
"Oakville town council approved a pilot program for solar panels on municipal buildings."
Iteration 2:
current_summary: "Oakville town council approved a pilot program for solar panels on municipal buildings."
new_snippet: "The new geothermal plant in Springfield began operations this week, promising to power 500 homes."

Prompt for Iteration 2 (snippet):
...
Current Summary of Developments:
Oakville town council approved a pilot program for solar panels on municipal buildings.
New Information Snippet:
"The new geothermal plant in Springfield began operations this week, promising to power 500 homes."
...
Updated Summary:
Agent's Potential Response: "Key developments: Oakville approved a solar panel pilot for municipal buildings. Springfield's new geothermal plant started operations, aiming to power 500 homes."
Notice how the agent is prompted to integrate new information. This process requires the agent to perform a type of information condensation, as discussed earlier in this chapter, ensuring the summary remains concise.
Iteration 3:
current_summary: "Key developments: Oakville approved a solar panel pilot for municipal buildings. Springfield's new geothermal plant started operations, aiming to power 500 homes."
new_snippet: "Weather forecast for Oakville: Sunny with a high of 25°C."

Prompt for Iteration 3 (snippet):
...
Current Summary of Developments:
Key developments: Oakville approved a solar panel pilot for municipal buildings. Springfield's new geothermal plant started operations, aiming to power 500 homes.
New Information Snippet:
"Weather forecast for Oakville: Sunny with a high of 25°C."
...
Updated Summary:
Agent's Potential Response: "Key developments: Oakville approved a solar panel pilot for municipal buildings. Springfield's new geothermal plant started operations, aiming to power 500 homes. (New snippet about Oakville weather is not relevant to renewable energy advancements)."
Or, if instructed to omit such self-reflection from its output: "Key developments: Oakville approved a solar panel pilot for municipal buildings. Springfield's new geothermal plant started operations, aiming to power 500 homes."
This iterative process, where the agent's memory (the summary) is explicitly managed within the prompt, allows it to handle dynamic information streams effectively within the confines of its context window.
The process can be visualized as a loop:
An iterative loop for processing dynamic information streams. The agent's understanding, carried as the "Current Summary," is refined with each new snippet.
Here's how you might implement this in Python-like pseudocode, assuming you have a function call_llm(prompt_text) that interacts with your language model:
def process_dynamic_stream(initial_prompt_template, information_snippets):
    current_summary = "None reported yet."
    all_summaries = []

    for snippet_index, snippet in enumerate(information_snippets):
        # Construct the prompt for the current iteration
        prompt = initial_prompt_template.format(
            current_summary=current_summary,
            new_snippet=snippet
        )

        print(f"\n--- Iteration {snippet_index + 1} ---")
        print(f"Feeding snippet: {snippet}")
        # print(f"Prompt sent to LLM:\n{prompt}")  # For debugging

        # Call the LLM. In a real system, you'd parse the LLM's full
        # response to extract just the "Updated Summary" part. For
        # simplicity, we assume the LLM returns the updated summary
        # directly or that we have a robust parser.
        llm_response_text = call_llm(prompt)  # Placeholder for your LLM call

        # A simple way to extract the summary if the LLM follows
        # instructions. This might need to be more robust in practice.
        if "Updated Summary:" in llm_response_text:
            updated_summary_text = llm_response_text.split("Updated Summary:")[1].strip()
        else:
            # Fallback or error handling
            updated_summary_text = current_summary  # Or log an error

        current_summary = updated_summary_text
        all_summaries.append(current_summary)
        print(f"Agent's Updated Summary: {current_summary}")

    return all_summaries
# Example Usage:
prompt_template = """You are an AI assistant tasked with monitoring and summarizing news snippets about "advancements in local renewable energy projects."
Your goal is to maintain a concise, up-to-date summary of key developments.
Current Summary of Developments:
{current_summary}
New Information Snippet:
"{new_snippet}"
Your Task:
1. Analyze the "New Information Snippet."
2. If it's relevant to "advancements in local renewable energy projects" and provides new information not already covered or a significant update, integrate it into the "Current Summary of Developments."
3. If it's irrelevant, or redundant, or a very minor update, you can indicate that no significant change to the summary is needed.
4. Provide the updated summary. If no change, return the existing summary. The summary should remain brief, focusing only on major points.
Updated Summary:
"""
simulated_snippets = [
    "Oakville town council approves pilot program for solar panel installations on municipal buildings.",
    "The new geothermal plant in Springfield began operations this week, promising to power 500 homes.",
    "Local bakery wins award for best croissants.",
    "Researchers in Pineview announce a breakthrough in battery storage efficiency for solar energy."
]
# This is a hypothetical LLM call function
def call_llm(prompt_text):
    # In a real application, this function would make an API call to an LLM.
    # For this pseudocode, we'll simulate a response based on the prompt.
    # This simulation is highly simplified.
    if "Oakville town council approves" in prompt_text and "None reported yet" in prompt_text:
        return "Updated Summary: Oakville town council approved a solar panel pilot for municipal buildings."
    elif "geothermal plant in Springfield" in prompt_text:
        if "Oakville town council approved" in prompt_text:
            return "Updated Summary: Oakville approved solar pilot. Springfield's new geothermal plant operational (powers 500 homes)."
        else:
            return "Updated Summary: Springfield's new geothermal plant started operations, aiming to power 500 homes."
    elif "best croissants" in prompt_text:
        # Extract the previous summary so it can be returned unchanged
        # when the snippet is irrelevant.
        summary_marker = "Current Summary of Developments:\n"
        current_summary_start = prompt_text.find(summary_marker) + len(summary_marker)
        current_summary_end = prompt_text.find("\nNew Information Snippet:")
        prev_summary = prompt_text[current_summary_start:current_summary_end].strip()
        return f"Updated Summary: {prev_summary} (New snippet about croissants is not relevant)."
    elif "battery storage efficiency" in prompt_text:
        return "Updated Summary: Oakville: solar pilot. Springfield: geothermal plant. Pineview: breakthrough in battery storage for solar."
    return "Updated Summary: No changes to summary."

# Run the simulation
# final_summaries = process_dynamic_stream(prompt_template, simulated_snippets)
# print("\n--- Final list of summaries after each step ---")
# for i, summary in enumerate(final_summaries):
#     print(f"After snippet {i+1}: {summary}")
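The split on "Updated Summary:" in the loop above is brittle if the model varies capitalization or pads the marker with whitespace. A slightly more defensive extractor (one sketch among several possibilities) could use a regular expression:

```python
import re

def extract_updated_summary(llm_response_text, fallback):
    # Capture everything after the "Updated Summary:" marker,
    # ignoring case and tolerating surrounding whitespace.
    match = re.search(
        r"updated summary:\s*(.+)",
        llm_response_text,
        flags=re.IGNORECASE | re.DOTALL,
    )
    return match.group(1).strip() if match else fallback

# Returns the text after the marker, or the fallback when it is absent.
print(extract_updated_summary(
    "Updated Summary: Oakville approved a solar pilot.", "previous summary"))
print(extract_updated_summary("Malformed response", "previous summary"))
```

Passing the previous summary as the fallback gives the same behavior as the else branch in process_dynamic_stream.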
To run this example, you would replace call_llm(prompt) with an actual call to a language model. The simulated call_llm above merely illustrates how the summary might evolve.
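Because the running summary is re-fed into every prompt, its length needs managing over long streams. A minimal guard might look like the sketch below; the character threshold and the condensation prompt are illustrative assumptions, not part of the exercise above:

```python
# MAX_SUMMARY_CHARS and CONDENSE_PROMPT are illustrative assumptions.
MAX_SUMMARY_CHARS = 600

CONDENSE_PROMPT = (
    "Condense the following summary of renewable energy developments "
    "to its most important points, in under 80 words:\n\n{summary}"
)

def maybe_condense(current_summary, call_llm):
    # Only ask the model to re-summarize once the running summary
    # grows past the threshold.
    if len(current_summary) <= MAX_SUMMARY_CHARS:
        return current_summary
    return call_llm(CONDENSE_PROMPT.format(summary=current_summary)).strip()
```

You could call maybe_condense(current_summary, call_llm) at the end of each loop iteration in process_dynamic_stream to keep the state compact.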
A few practical considerations: monitor the length of current_summary so it doesn't exceed context window limits or become unwieldy; periodically prompting the agent to condense current_summary can help.

This hands-on exercise demonstrates a foundational method for enabling AI agents to process dynamic information. By thoughtfully structuring your prompts to manage a rolling state or summary, you can guide agents to perform effectively in environments where data is constantly evolving. Remember that iterating on your prompt design based on observed agent behavior is a standard part of developing reliable agentic systems.