Understanding the Simple Memory Node in n8n AI Agents
When building conversational AI agents in n8n, you'll quickly discover that without memory, your chatbot suffers from complete amnesia. Each message is treated as a brand new conversation. The Simple Memory node solves this problem.
The Problem: Stateless by Default
Large Language Models like GPT-4 don't inherently remember previous messages. Every API call is independent. Without intervention, this happens:
You: "My name is Piotr"
AI: "Nice to meet you, Piotr!"
You: "What's my name?"
AI: "I don't know your name."
Not exactly a great user experience.
The Solution: Memory Buffer Window
The Simple Memory node (memoryBufferWindow) stores recent conversation history and injects it into each new request. This gives the AI context about what was said before.
The key parameter is window size — the number of recent message exchanges to retain.
How Window Size Works
A common misconception is that window size = 1 means "remember only the current message." It actually means "keep the last 1 complete exchange" — where an exchange is one user message plus the AI's reply.
Here's the timeline with window size = 3:
| Turn | Message | Memory contains | Turn 1 remembered? |
|---|---|---|---|
| 1 | "My name is Piotr" | — | — |
| 2 | "What's 2+2?" | Turn 1 | ✅ |
| 3 | "Capital of France?" | Turns 1-2 | ✅ |
| 4 | "Color of the sky?" | Turns 1-3 | ✅ |
| 5 | "What's my name?" | Turns 2-4 | ❌ Gone |
With window size N, information from the first turn disappears at turn N + 2.
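The N + 2 rule is easy to check with a simulated window. A minimal sketch (plain Python, not n8n code), where turn numbers stand in for exchanges:

```python
from collections import deque

def first_turn_forgotten_at(window_size, max_turns=50):
    """Return the turn at which turn 1 drops out of memory."""
    memory = deque(maxlen=window_size)
    for turn in range(1, max_turns + 1):
        # Before responding, memory holds only the prior exchanges.
        if turn > 1 and 1 not in memory:
            return turn
        # After responding, this turn's exchange is stored; the
        # oldest exchange is evicted once the window is full.
        memory.append(turn)
    return None

for n in (1, 3, 5, 10):
    assert first_turn_forgotten_at(n) == n + 2
```

Running this for window size 3 reproduces the table above: turn 1 is still available through turn 4 and gone at turn 5.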
Choosing the Right Window Size
Larger windows mean more context but also:
- Higher token usage — each request sends more conversation history
- Increased costs — more tokens = higher API bills
- Potential context overflow — LLMs have maximum context limits
For simple Q&A bots, a window of 3-5 is usually sufficient. For complex multi-step tasks where users reference information from many turns ago, consider 10-20.
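To get a feel for the cost side of that trade-off, you can estimate the history overhead per request. The 80-tokens-per-exchange figure below is an assumption for illustration; measure your own conversations for real numbers:

```python
def history_tokens(window_size, avg_tokens_per_exchange=80):
    # Every retained exchange (user message + AI reply) is re-sent
    # with each request, so history cost grows linearly with the
    # window size. 80 tokens/exchange is an assumed average.
    return window_size * avg_tokens_per_exchange

for n in (3, 5, 10, 20):
    print(f"window={n:>2}: ~{history_tokens(n)} extra tokens per request")
```

At a window of 20, roughly 1,600 extra tokens ride along with every single request under this assumption — worth keeping in mind for high-traffic workflows.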
When You Don't Need Memory
Skip the Simple Memory node if your agent handles purely single-turn interactions — like a tool that answers one question and doesn't need follow-up context. Every other conversational use case benefits from it.
Conclusion
The Simple Memory node transforms your n8n AI Agent from a goldfish into a proper conversational partner. Just remember: window size defines how many exchanges are retained, not how many messages. Plan accordingly based on your typical conversation length and budget constraints.