Up to this point, our interactions with language models have been transactional. We send a request and receive a response, with the model having no recollection of previous exchanges. This stateless nature presents a significant challenge when building applications that require continuous dialogue, such as chatbots or assistants. Without a mechanism to retain context, a conversation is just a series of disconnected questions and answers.
This chapter introduces the concept of memory to build stateful, conversational applications. We will use the memory module to manage and persist conversation history, allowing your applications to maintain context across multiple turns.
You will learn to implement several memory strategies:
7.1 The Challenge of Stateful Conversations
7.2 Implementing Conversation Buffer Memory
7.3 Using Summary Memory for Long Conversations
7.4 Tracking Entities Across a Conversation
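As a preview of the simplest of these strategies, a conversation buffer can be sketched in a few lines of plain Python. The class and method names below are illustrative assumptions for exposition, not the actual API of any framework's memory module:

```python
# A minimal, framework-independent sketch of conversation buffer memory.
# The class name and methods are illustrative assumptions, not a real
# library API: the idea is simply to store every turn and replay the
# full history as context for the next request.

class ConversationBufferMemory:
    """Stores every turn of a dialogue and renders it as prompt context."""

    def __init__(self):
        self.turns = []  # list of (speaker, text) pairs, in order

    def add_turn(self, speaker, text):
        self.turns.append((speaker, text))

    def as_context(self):
        # Concatenate the full history so it can be prepended to a prompt,
        # letting a stateless model "remember" earlier exchanges.
        return "\n".join(f"{speaker}: {text}" for speaker, text in self.turns)


memory = ConversationBufferMemory()
memory.add_turn("Human", "My name is Ada.")
memory.add_turn("AI", "Nice to meet you, Ada!")
memory.add_turn("Human", "What is my name?")
print(memory.as_context())
```

Because the buffer replays the entire history verbatim, context grows with every turn, which is exactly the limitation that the summary and entity strategies in the later sections address.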
© 2026 ApX Machine Learning