MAGMA: The Memory Revolution
Metadata
Article Title
MAGMA: A Multi-Graph based Agentic Memory Architecture for AI Agents
Source
Author
Dongming Jiang, Yi Li, Guanpeng Li, Bingzhe Li
Date
2026-01
Article Summary
This paper introduces MAGMA, a new memory architecture for AI agents that targets the fundamental problem of long-horizon reasoning. Unlike existing systems that store memories in a single "bucket" and retrieve them by semantic similarity alone, MAGMA organizes memories across four distinct graphs (semantic, temporal, causal, and entity), enabling AI agents to answer not just "what happened" but also "why" and "when" questions with markedly higher accuracy.
Technical Details
Topic
AI Memory Architecture
Time Span
2026 (Research)
Narrative Approach
Problem-Solution
Recommended Style
tech
Recommended Layout
dense
Aspect Ratio
3:4
Language
en
Page Count
8
Generated
2026-01-19 16:50
MAGMA: The Memory Revolution - Knowledge Comic Storyboard
Character Reference: characters-problem-solution/characters.png
Cover
Filename: 00-cover-magma-memory-revolution.png
Core Message: AI memory, reimagined across four dimensions
Visual Design:
Title: "MAGMA" in bold blue neon text with circuit patterns
Main visual: A glowing brain made of four interweaving graph networks (cyan, green, orange, purple)
Subtitle: "How AI Agents Learn to Remember"
Background: Dark cyber grid with floating memory nodes
Visual Prompt: Tech-style comic book cover. Title "MAGMA" in electric blue neon typography at top. Central image: a stylized AI brain composed of four distinct colored graph networks interweaving - cyan (semantic), green (temporal), orange (causal), purple (entity). The networks glow and pulse with energy. Background is deep navy with subtle grid lines and floating glowing nodes. Subtitle at bottom: "How AI Agents Learn to Remember". Clean lines, modern digital aesthetic, cinematic lighting.
Page 1 / 8
Filename: 01-page-memory-problem.png
Layout: dense
Narrative Layer: Problem introduction
Core Message: Current AI agents have a serious memory problem
Section Summary
What this section covers: The fundamental limitation of Large Language Models - they can only "see" a fixed window of conversation, so older context falls out of scope, and even information that does fit tends to be under-used when it sits in the middle of a long prompt (the "Lost in the Middle" phenomenon).
Key takeaway: Even the most advanced AI models struggle to maintain coherent memory across long conversations.
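To make the limitation concrete, here is a minimal sketch (my own illustration, not from the paper) of how a fixed context window forgets: once the token budget is spent, older turns simply drop out of view. The function name and the word-count "tokenizer" are hypothetical stand-ins.

```python
# Minimal illustration (not from the paper): a fixed context window measured
# in tokens. Once the budget is spent, older turns simply drop out of view.

def truncate_to_window(turns: list[str], max_tokens: int) -> list[str]:
    """Keep only the most recent turns that fit in the token budget."""
    kept: list[str] = []
    used = 0
    for turn in reversed(turns):      # walk from newest to oldest
        cost = len(turn.split())      # crude stand-in for a real tokenizer
        if used + cost > max_tokens:
            break                     # everything older is "forgotten"
        kept.append(turn)
        used += cost
    return list(reversed(kept))

conversation = [f"turn {i}: the user said something important" for i in range(1, 101)]
visible = truncate_to_window(conversation, max_tokens=60)
print(f"{len(visible)} of {len(conversation)} turns remain visible")
```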
Panel Layout
Panel Count: 4 Layout Type: grid
Panel 1 (Size: 1/2 page, Position: Top)
Scene: Abstract representation of an AI agent trying to recall past conversations Image Description:
Camera angle: Wide shot
Characters: A sleek robot/AI avatar surrounded by floating speech bubbles, some fading away
Environment: Digital void with scattered memory fragments
Lighting: Dramatic spotlight on the confused AI
Color tone: Dark blues with fading grays
Text Elements:
Narrator box: "Large Language Models have a problem..."
Caption: "They forget."
Panel 2 (Size: 1/4 page, Position: Bottom-left)
Scene: Visual representation of context window Image Description:
Camera angle: Technical diagram view
Visual: A sliding window over a long conversation thread, only a portion visible
Lighting: Technical, flat
Text Elements:
Narrator box: "They can only 'see' a fixed window of conversation"
Panel 3 (Size: 1/4 page, Position: Bottom-right)
Scene: "Lost in the Middle" phenomenon Image Description:
Camera angle: Eye level
Visual: Important information trapped in the middle of a long document, grayed out
Lighting: Spotlight on edges, darkness in middle
Text Elements:
Caption: "THE LOST-IN-THE-MIDDLE PROBLEM"
Page Hook: "But what if we could give AI agents a real memory?"
Visual Prompt: Tech-style comic page showing AI memory problems. Top half: A sleek humanoid AI robot surrounded by floating conversation bubbles, some fading into transparency, looking confused in a dark digital void. Bottom left: Technical diagram showing a sliding "context window" over a long text stream, only a small portion highlighted. Bottom right: A document with bright edges but dark middle section, showing the "lost in the middle" problem. Dark cyber aesthetic, neon blue accents, grid background. Caption boxes with white text.
Page 2 / 8
Filename: 02-page-mag-systems.png
Layout: dense
Narrative Layer: Background context
Core Message: Memory-Augmented Generation tries to solve this, but has limitations
Section Summary
What this section covers: Memory-Augmented Generation (MAG) systems emerged to help LLMs remember by storing and retrieving past interactions. However, most existing systems just dump everything into one bucket and rely on simple similarity matching.
Key takeaway: Current memory systems are too simplistic - they can tell you WHAT happened but struggle with WHY things happened.
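For contrast, a minimal sketch of the "one bucket" MAG loop this page criticizes. The embed() and llm() callables are hypothetical placeholders, not an API from the paper; the point is that similarity is the only retrieval signal, so the store can say what is related but not when or why.

```python
# Sketch of the monolithic "single bucket" MAG loop (illustrative only;
# embed() and llm() are placeholder callables, not the paper's API).
import math

memory: list[tuple[list[float], str]] = []   # one flat bucket: (embedding, text)

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def mag_step(query: str, embed, llm, k: int = 3) -> str:
    """Query -> Retrieve -> Generate -> Store, with similarity as the only signal."""
    q_vec = embed(query)
    ranked = sorted(memory, key=lambda m: cosine(q_vec, m[0]), reverse=True)
    context = "\n".join(text for _, text in ranked[:k])
    answer = llm(f"Context:\n{context}\n\nQuestion: {query}")
    # Store the new interaction back into the same undifferentiated bucket.
    memory.append((embed(query + " " + answer), query + " " + answer))
    return answer
```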
Panel Layout
Panel Count: 3 Layout Type: vertical flow
Panel 1 (Size: 1/3 page, Position: Top)
Scene: Introduction to Memory-Augmented Generation Image Description:
Camera angle: Technical schematic view
Visual: Feedback loop diagram - Query → Memory → LLM → Output → Back to Memory
Lighting: Clean, technical
Text Elements:
Narrator box: "Memory-Augmented Generation (MAG) gives LLMs external memory"
Labels on diagram: "Query", "Retrieve", "Generate", "Store"
Panel 2 (Size: 1/3 page, Position: Middle)
Scene: Problem with current MAG systems Image Description:
Camera angle: 3D perspective
Visual: A messy, tangled ball of memory nodes connected randomly
Characters: AI agent looking confused at the tangled mess
Lighting: Harsh, revealing chaos
Text Elements:
Narrator box: "But most systems store memories in a monolithic blob..."
Caption: "Finding the right memory = finding a needle in a haystack"
Panel 3 (Size: 1/3 page, Position: Bottom)
Scene: The fundamental question Image Description:
Camera angle: Close-up
Visual: Two query bubbles - "WHAT happened?" (bright) vs "WHY did it happen?" (dim)
Lighting: Contrast between bright and dim
Text Elements:
Narrator box: "They can answer WHAT happened..."
Caption: "...but struggle to answer WHY"
Page Hook: "MAGMA proposes a radical new approach"
Visual Prompt: Tech-style comic page about Memory-Augmented Generation. Top third: Clean technical diagram showing MAG feedback loop with arrows connecting Query → Memory Database → LLM → Output → back to Memory. Middle third: Chaotic visualization of tangled memory nodes like spaghetti, with a confused AI robot looking at the mess. Bottom third: Split panel showing "WHAT?" in bright neon vs "WHY?" in dim gray, representing the limitation of current systems. Dark cyber aesthetic with circuit patterns.
Page 3 / 8
Filename: 03-page-magma-intro.png
Layout: splash
Narrative Layer: Solution reveal
Core Message: Introducing MAGMA - a multi-graph architecture
Section Summary
What this section covers: MAGMA's key innovation - instead of one messy memory blob, it organizes memories across four orthogonal graphs, each capturing a different type of relationship.
Key takeaway: The secret to better memory is STRUCTURE - organizing information by how it relates, not just what it is.
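A minimal data-structure sketch of the "four orthogonal graphs" idea, under my own simplifying assumptions rather than the paper's implementation: one shared node store plus four typed edge sets, one per relationship kind.

```python
# One memory, four orthogonal graphs: a simplified sketch, not MAGMA's code.
from dataclasses import dataclass, field

@dataclass
class MemoryNode:
    node_id: str
    text: str
    timestamp: float

@dataclass
class MultiGraphMemory:
    nodes: dict[str, MemoryNode] = field(default_factory=dict)
    semantic: set[tuple[str, str]] = field(default_factory=set)  # similar-to (undirected)
    temporal: set[tuple[str, str]] = field(default_factory=set)  # happened-before (chain)
    causal: set[tuple[str, str]] = field(default_factory=set)    # caused (directed)
    entity: set[tuple[str, str]] = field(default_factory=set)    # entity-appears-in (hub)

    def add_node(self, node: MemoryNode) -> None:
        self.nodes[node.node_id] = node

    def link(self, graph: str, src: str, dst: str) -> None:
        getattr(self, graph).add((src, dst))
```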
Panel Layout
Panel Count: 2 Layout Type: splash + inset
Panel 1 (Size: Full page)
Scene: The MAGMA architecture reveal Image Description:
Camera angle: Dramatic wide shot
Visual: Four glowing graph structures arranged in a crystalline formation, each a different color
Environment: Dark space with the graphs as the only light source
Lighting: Each graph emits its own colored glow
Color tone: Cyan (semantic), Emerald (temporal), Orange (causal), Purple (entity)
Text Elements:
Title: "MAGMA"
Subtitle: "Multi-Graph Agentic Memory Architecture"
Panel 2 (Size: Inset corner)
Scene: Quick comparison Image Description:
Visual: Before/After - tangled mess → organized crystal structure
Text Elements:
Caption: "From chaos to clarity"
Page Hook: "Four graphs. Four dimensions of understanding."
Visual Prompt: Dramatic splash page revealing MAGMA architecture. Full page: Four magnificent glowing graph structures arranged in a geometric crystalline formation against a dark cosmic background. Semantic graph glows cyan blue, Temporal graph glows emerald green, Causal graph glows orange, Entity graph glows purple. Each graph is a network of nodes and edges, but each has a distinct structure - semantic is clustered, temporal is linear chain, causal has directed arrows, entity has central hubs. Title "MAGMA" in bold white text. Small inset at corner showing before (tangled mess) vs after (organized crystal). Epic, cinematic, tech aesthetic.
Page 4 / 8
Filename: 04-page-four-graphs.png
Layout: grid
Narrative Layer: Detailed explanation
Core Message: Understanding each of the four graphs
Section Summary
What this section covers: Deep dive into each of MAGMA's four graphs - what they capture and how they help with different types of queries.
Key takeaway: Different questions need different types of memory structure. "When" questions need temporal graphs, "Why" questions need causal graphs.
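Illustrative toy data echoing the panel examples on this page (plain containers, not the paper's code), showing why a "why" question belongs to the causal graph and a "when" question to the temporal chain.

```python
# Toy data mirroring this page's panel examples (illustrative only).
semantic_edges = {("machine learning", "AI"), ("machine learning", "neural networks")}
temporal_chain = ["Monday", "Tuesday", "Wednesday"]
causal_edges = [("Bug found", "Code fix"), ("Code fix", "Test passed")]
entity_index = {"Alice": ["conversation 1", "conversation 5", "conversation 12"]}

def why(effect: str) -> list[str]:
    """A 'why' question walks the causal graph backwards from the effect."""
    return [cause for cause, eff in causal_edges if eff == effect]

def when_between(start: str, end: str) -> list[str]:
    """A 'when' question walks the temporal chain."""
    i, j = temporal_chain.index(start), temporal_chain.index(end)
    return temporal_chain[i:j + 1]

print(why("Test passed"))                    # ['Code fix']
print(when_between("Monday", "Wednesday"))   # ['Monday', 'Tuesday', 'Wednesday']
print(entity_index["Alice"])                 # every place Alice shows up
```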
Panel Layout
Panel Count: 4 Layout Type: 2x2 grid
Panel 1 (Size: 1/4 page, Position: Top-left)
Scene: Semantic Graph Image Description:
Visual: Clustered nodes connected by similarity, like constellations
Color: Cyan blue glow
Lighting: Soft glow emanating from clusters
Text Elements:
Label: "SEMANTIC GRAPH"
Caption: "Connects similar concepts"
Example: "'Machine learning' links to 'AI', 'neural networks'"
Panel 2 (Size: 1/4 page, Position: Top-right)
Scene: Temporal Graph Image Description:
Visual: Linear chain of nodes like a timeline
Color: Emerald green
Lighting: Flowing light along the chain
Text Elements:
Label: "TEMPORAL GRAPH"
Caption: "Tracks the order of events"
Example: "Monday → Tuesday → Wednesday"
Panel 3 (Size: 1/4 page, Position: Bottom-left)
Scene: Causal Graph Image Description:
Visual: Directed arrows showing cause → effect
Color: Orange/amber
Lighting: Sharp, directional
Text Elements:
Label: "CAUSAL GRAPH"
Caption: "Models cause and effect"
Example: "'Bug found' → 'Code fix' → 'Test passed'"
Panel 4 (Size: 1/4 page, Position: Bottom-right)
Scene: Entity Graph Image Description:
Visual: Central hub nodes with connections to events
Color: Purple
Lighting: Radial glow from hubs
Text Elements:
Label: "ENTITY GRAPH"
Caption: "Tracks people, places, things"
Example: "'Alice' appears in conversation 1, 5, and 12"
Page Hook: "But how does MAGMA know which graph to use?"
Visual Prompt: 4-panel grid explaining MAGMA's four graphs. Top-left: Semantic graph in cyan blue - clustered nodes with undirected edges, like constellations. Top-right: Temporal graph in emerald green - linear chain of nodes connected sequentially like a timeline. Bottom-left: Causal graph in orange - directed arrows showing cause-to-effect relationships with arrow heads. Bottom-right: Entity graph in purple - hub-and-spoke pattern with central entity nodes connected to event nodes. Each panel has clear label and example caption. Dark background, glowing neon style, tech aesthetic.
Page 5 / 8
Filename: 05-page-adaptive-traversal.png
Layout: cinematic
Narrative Layer: Core mechanism
Core Message: Intent-aware query routing
Section Summary
What this section covers: MAGMA's Adaptive Traversal Policy - a smart router that analyzes your query, figures out what type of question it is (Why/When/Entity), and prioritizes the right graph for traversal.
Key takeaway: MAGMA is like a GPS for memory - it finds the optimal path through the right dimension of memory based on your question type.
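A rough sketch of the routing idea under stated assumptions: the keyword rules, priority table, and function names below are illustrative stand-ins, not MAGMA's actual Adaptive Traversal Policy, and the beam search and narrative synthesis steps are omitted.

```python
# Intent-aware routing, sketched with toy keyword rules (not the paper's policy).
def classify_intent(query: str) -> str:
    q = query.lower()
    if q.startswith("why") or "cause" in q or "reason" in q:
        return "causal"
    if q.startswith("when") or "before" in q or "after" in q:
        return "temporal"
    if q.startswith("who") or "which person" in q:
        return "entity"
    return "semantic"                       # fall back to similarity lookup

# Step 2: each intent prioritizes one graph but can still consult the others.
GRAPH_PRIORITY = {
    "causal":   ["causal", "temporal", "semantic", "entity"],
    "temporal": ["temporal", "causal", "semantic", "entity"],
    "entity":   ["entity", "semantic", "temporal", "causal"],
    "semantic": ["semantic", "entity", "temporal", "causal"],
}

def route(query: str) -> list[str]:
    """Return the graphs in the order a traversal would visit them."""
    return GRAPH_PRIORITY[classify_intent(query)]

print(route("Why did the test fail?"))            # causal graph first
print(route("When did Alice join the project?"))  # temporal graph first
```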
Panel Layout
Panel Count: 3 Layout Type: horizontal panels
Panel 1 (Size: 1/3 page, Position: Top)
Scene: Query analysis Image Description:
Camera angle: Technical view
Visual: A query entering a "router" machine that classifies it
Labels: Intent → "Why", "When", or "Entity"
Lighting: Scanning light effect
Text Elements:
Narrator box: "Step 1: MAGMA analyzes your query"
Caption: "Intent Classification"
Panel 2 (Size: 1/3 page, Position: Middle)
Scene: Graph selection and traversal Image Description:
Camera angle: Bird's eye
Visual: A path lighting up through one of the four graphs
The selected graph glows brighter while others dim
Lighting: Spotlight on active path
Text Elements:
Narrator box: "Step 2: It selects and traverses the right graph"
Caption: "Policy-Guided Beam Search"
Panel 3 (Size: 1/3 page, Position: Bottom)
Scene: Context synthesis Image Description:
Camera angle: Wide shot
Visual: Retrieved nodes being assembled into a coherent narrative
Lighting: Constructive, building energy
Text Elements:
Narrator box: "Step 3: Retrieved memories become coherent context"
Caption: "Structured Narrative Synthesis"
Page Hook: "Memory isn't just stored - it evolves"
Visual Prompt: Cinematic three-panel page showing MAGMA's query process. Top panel: Query text entering a high-tech router cube that scans and classifies it into "WHY?", "WHEN?", or "ENTITY?" categories. Middle panel: Bird's eye view of four graph networks, one is glowing bright (selected) while others are dimmed, a glowing path traces through the selected graph. Bottom panel: Retrieved memory nodes floating and assembling together like puzzle pieces into a coherent document. Tech aesthetic, neon highlights, dark background with grid patterns.
Page 6 / 8
Filename: 06-page-memory-evolution.png
Layout: split
Narrative Layer: System design
Core Message: Dual-stream memory evolution
Section Summary
What this section covers: MAGMA's dual-stream approach to memory updates - a Fast Path for immediate recording and a Slow Path for deep structural analysis, loosely mirroring how the brain separates quick encoding from slower consolidation.
Key takeaway: Great memory systems need both speed (to stay responsive) and depth (to build understanding over time).
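A minimal sketch of the dual-stream idea; the queue-plus-background-thread structure and all names are my assumptions, not MAGMA's implementation. The fast path records events without blocking, while a background worker stands in for the slower structural analysis.

```python
# Fast path / slow path, sketched with a queue and a background thread
# (structure and names are assumptions, not MAGMA's implementation).
import queue
import threading
import time

events: "queue.Queue[str]" = queue.Queue()
consolidated: list[str] = []

def fast_path(event: str) -> None:
    """Synaptic ingestion: record immediately and return, never blocking."""
    events.put(event)

def slow_path_worker() -> None:
    """Structural consolidation: analyze events in the background;
    a real system would add causal and entity links here."""
    while True:
        event = events.get()
        time.sleep(0.1)                 # stand-in for deeper analysis
        consolidated.append(f"linked: {event}")
        events.task_done()

threading.Thread(target=slow_path_worker, daemon=True).start()
for i in range(3):
    fast_path(f"event {i}")             # the agent stays responsive
events.join()                           # wait only so the demo can print
print(consolidated)
```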
Panel Layout
Panel Count: 2 main panels Layout Type: vertical split
Panel 1 (Size: 1/2 page, Position: Left)
Scene: Fast Path - Synaptic Ingestion Image Description:
Camera angle: Dynamic action shot
Visual: Lightning-fast data stream entering memory, nodes being added instantly
Color: Electric blue with speed lines
Action: Quick, non-blocking
Text Elements:
Label: "FAST PATH"
Caption: "Immediate event recording"
Detail: "No blocking - stays responsive"
Panel 2 (Size: 1/2 page, Position: Right)
Scene: Slow Path - Structural Consolidation Image Description:
Camera angle: Contemplative, slower feel
Visual: A background process carefully analyzing connections, adding causal links
Color: Warm amber, thoughtful
Action: Deep analysis, connecting dots
Text Elements:
Label: "SLOW PATH"
Caption: "Deep structural analysis"
Detail: "Builds causal and entity links over time"
Page Hook: "Does this actually work? Let's see the results."
Visual Prompt: Split vertical page showing dual-stream memory evolution. Left side - FAST PATH: Electric blue lightning-speed data streams entering a memory graph, nodes appearing instantly, speed lines, dynamic energy, label "Synaptic Ingestion - Immediate Recording". Right side - SLOW PATH: Warm amber glow, contemplative scene of an AI carefully analyzing a memory subgraph, drawing new connection lines between nodes, thoughtful pace, label "Structural Consolidation - Deep Analysis". Dark background, tech aesthetic, contrast between speed and depth.
Page 7 / 8
Filename: 07-page-results.png
Layout: mixed
Narrative Layer: Validation
Core Message: MAGMA outperforms all baselines
Section Summary
What this section covers: Experimental results on the LoCoMo benchmark, where MAGMA reaches an LLM-as-Judge score of 0.70 compared to 0.48-0.59 for competing systems. It particularly excels at temporal and adversarial reasoning tasks.
Key takeaway: The multi-graph approach isn't just elegant theory - it delivers measurable improvements in real-world reasoning tasks.
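A quick sanity check on the headline number, using only the scores quoted on this page:

```python
# Reproducing the quoted improvement from this page's scores.
magma, full_context = 0.700, 0.481
print(f"{(magma - full_context) / full_context:.1%}")   # 45.5%
```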
Panel Layout
Panel Count: 3 Layout Type: mixed
Panel 1 (Size: 1/2 page, Position: Top)
Scene: Results comparison chart Image Description:
Camera angle: Clean data visualization
Visual: Bar chart showing MAGMA (0.70) vs baselines (Full Context 0.481, A-MEM 0.58, MemoryOS 0.553, Nemori 0.59)
Color: MAGMA bar in gold/victory color, others in gray
Lighting: Spotlight on MAGMA bar
Text Elements:
Title: "LoCoMo Benchmark Results"
Label: "LLM-as-Judge Score"
Callout: "45.5% improvement over Full Context"
Panel 2 (Size: 1/4 page, Position: Bottom-left)
Scene: Temporal reasoning strength Image Description:
Visual: Temporal accuracy comparison
Highlight: MAGMA excels at time-based queries
Text Elements:
Caption: "Excels at WHEN questions"
Data: "Temporal: 0.650 vs 0.422-0.649"
Panel 3 (Size: 1/4 page, Position: Bottom-right)
Scene: Adversarial robustness Image Description:
Visual: Shield with MAGMA logo deflecting attacks
Lighting: Protective glow
Text Elements:
Caption: "Robust against adversarial distractors"
Data: "Adversarial: 0.742 score"
Page Hook: "The future of AI memory is structured"
Visual Prompt: Results page with data visualization. Top half: Clean bar chart showing benchmark scores - MAGMA bar glowing gold at 0.70, other baselines in gray (Full Context 0.481, A-MEM 0.58, MemoryOS 0.553, Nemori 0.59). "45.5% improvement" callout. Bottom left: Small panel highlighting temporal reasoning with timeline icon and "0.650" score highlighted. Bottom right: Shield icon with MAGMA logo deflecting attack arrows, "0.742" adversarial score. Clean tech aesthetic, victory colors, professional data visualization style.
Page 8 / 8
Filename: 08-page-conclusion.png
Layout: splash
Narrative Layer: Conclusion and vision
Core Message: The future of AI memory is multi-dimensional
Section Summary
What this section covers: MAGMA's significance for the future of AI agents - enabling transparent reasoning paths, lower latency, and foundations for even more sophisticated agent architectures.
Key takeaway: By thinking of memory as multi-dimensional structure rather than a simple database, we unlock AI systems that can truly reason over their experiences.
Panel Layout
Panel Count: 1 Layout Type: full splash
Panel 1 (Size: Full page)
Scene: Vision of the future Image Description:
Camera angle: Epic wide shot
Visual: An AI agent confidently navigating through a beautifully organized multi-graph memory space
The four graph types orbit around the agent like dimensions
Environment: Vast cosmic space representing infinite memory capacity
Lighting: Hopeful, bright core with ambient glow
Color tone: All four graph colors harmonizing
Text Elements:
Quote: "Memory is no longer implicit in internal activations but becomes a persistent, queryable resource"
Caption: "The future of AI memory is STRUCTURED"
Footer: "MAGMA - github.com/FredJiang0324/MAMGA"
Page Hook: (Ending) "Give your agents the memory they deserve."
Visual Prompt: Epic conclusion splash page. A confident humanoid AI agent stands at the center of a vast cosmic space, surrounded by four orbiting graph structures - cyan semantic, green temporal, orange causal, purple entity - all harmoniously interconnected. The agent reaches out, navigating effortlessly through the structured memory space. Bright hopeful lighting from the center, stars and data particles in the background. Quote text overlay: "Memory becomes a persistent, queryable resource". Footer with GitHub link. Tech aesthetic, cinematic, inspiring, epic scale.