
Memory Is Not Search

How we made AI agents remember things they didn't mean to — and why that changes everything

Robert Praul · 8 min read

Stop for a second.

Think about the smell of BBQ smoke on a summer evening.

Did you choose to remember anything?

Probably not. But something surfaced anyway. Maybe a backyard. Maybe a specific person. Maybe a feeling you can't name but absolutely feel.

That's involuntary memory.

Not a query. Not retrieval. It's what happens when enough sensory signals converge and your brain says: I've been here before.

Most AI memory systems today are still search. User asks. System fetches. RAG, vector search, memory retrieval — same assumption: memory requires intent. The agent has to decide to remember.

Ember is the opposite.

The Problem with Search-Based Memory

Here's a typical “memory” system for an AI agent:

User: "Tell me about that restaurant we discussed"
Agent: [searches vector store] → [returns result]

Great for knowledge retrieval. Wrong for memory.

Here's why:

Memory is involuntary. You don't decide to remember the smell of your grandmother's kitchen. You walk past a bakery, catch a whiff of cinnamon, and it happens. The convergence of sensory signals — not a deliberate query — triggers recall.

Memory is multi-dimensional. A single keyword does not trigger real memory. It's the combination: a smell and a season and a time of day and an emotional state. Each dimension alone means very little. Together, they unlock something vivid.

Memory has intensity. Not all memories hit the same way. Some are faint — a vague familiarity that colors your mood. Some are warm — details surface and you find yourself saying “this reminds me of...” Some are vivid — the memory takes over and you're there again.

Search-based systems can't do this. They retrieve on demand. They match on one dimension (semantic similarity). They return results, not experiences.

How Ember Works

Ember decomposes every message into a signal constellation — a multi-dimensional fingerprint across 7 dimensions:

Dimension    What it captures
Semantic     Meaning and topic similarity
Emotional    Valence, arousal, emotion labels
Sensory      Visual, auditory, olfactory, tactile, gustatory signals
Temporal     Season, time of day, era references
Spatial      GPS proximity, location names, place types
Relational   Shared people and relationships
Musical      Tracks, artists, and musical associations
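To make the constellation concrete, here is a minimal sketch of what one might look like as a data structure. The class and field names are illustrative assumptions, not Ember's internal types:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Constellation:
    # Hypothetical sketch of a 7-dimension signal constellation;
    # Ember's actual internal representation may differ.
    semantic: Optional[list[float]] = None          # embedding vector
    emotional: list[str] = field(default_factory=list)
    sensory: dict[str, list[str]] = field(default_factory=dict)
    temporal: dict[str, str] = field(default_factory=dict)
    spatial: dict[str, float] = field(default_factory=dict)
    relational: list[str] = field(default_factory=list)
    musical: list[str] = field(default_factory=list)

# One incoming message, decomposed into whichever signals it carries.
msg = Constellation(
    emotional=["nostalgic"],
    sensory={"olfactory": ["bbq"], "visual": ["fireflies"]},
    temporal={"season": "summer", "time_of_day": "evening"},
)
```

Each field stays empty unless the message actually carries that kind of signal, which is what makes convergence across dimensions meaningful.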

When you run ember.check() on an incoming message, Ember doesn't search. It extracts the constellation, compares it against stored memories, and checks whether enough dimensions converge.

This is the key insight: no single dimension fires a memory on its own.

If the message mentions “summer,” that's not enough. If it mentions “summer” and “BBQ smoke” and “evening” and the user is feeling nostalgic — now you have convergence. Now the memory ignites.
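The convergence idea can be sketched in a few lines. Everything here, the dict layout, the overlap metric, the per-dimension threshold, is a simplifying assumption rather than Ember's actual scoring:

```python
# Simplified sketch of convergence: a memory ignites only when
# several dimensions fire at once.
def fired_dimensions(message_signals: dict, memory_signals: dict,
                     per_dim_threshold: float = 0.5) -> list[str]:
    """Return the dimensions whose signals overlap strongly enough."""
    fired = []
    for dim, mem_vals in memory_signals.items():
        if not mem_vals:
            continue
        msg_vals = set(message_signals.get(dim, []))
        overlap = len(msg_vals & set(mem_vals)) / len(mem_vals)
        if overlap >= per_dim_threshold:
            fired.append(dim)
    return fired

memory = {
    "semantic":  ["summer", "childhood"],
    "sensory":   ["bbq smoke", "fireflies"],
    "temporal":  ["summer", "evening"],
    "emotional": ["nostalgic"],
}

# "summer" alone touches only two dimensions: below the convergence bar.
weak = fired_dimensions({"semantic": ["summer"], "temporal": ["summer"]}, memory)

# Add smoke, evening, and a nostalgic mood: convergence.
strong = fired_dimensions({
    "semantic":  ["summer", "childhood"],
    "sensory":   ["bbq smoke"],
    "temporal":  ["summer", "evening"],
    "emotional": ["nostalgic"],
}, memory)
```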

The Ignition Pipeline

Every candidate memory passes through 6 gates:

  1. Refractory period — A memory can't fire twice in 24 hours. Like real memory, once recalled, it needs time to reset.
  2. Thematic refractory — Recently fired topics are dampened. Your agent won't fixate on the same theme.
  3. Semantic floor — The message must have minimum semantic relevance to the memory.
  4. Minimum dimensions — At least 3 of 7 dimensions must fire independently. This is the convergence requirement.
  5. Weighted composite — Each dimension contributes to a weighted score. Semantic and emotional carry more weight; spatial and musical are accents.
  6. Threshold check — The composite score must exceed a final threshold, adjusted by speaker state. Reflective conversation lowers it; casual conversation raises it.

What comes out is an IgnitionResult — not a search result. It has a score, an intensity tier (faint/warm/vivid), and a per-dimension breakdown showing exactly which signals converged.
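A compressed sketch of how those gates might compose. Function names, weights, and thresholds here are illustrative assumptions, and gate 2 (thematic refractory) is omitted for brevity:

```python
import time

# Hypothetical sketch of the ignition gates; the real pipeline
# lives inside Ember and its exact values may differ.
REFRACTORY_SECONDS = 24 * 3600
WEIGHTS = {"semantic": 0.30, "emotional": 0.25, "sensory": 0.15,
           "temporal": 0.10, "relational": 0.10, "spatial": 0.05,
           "musical": 0.05}

def ignite(dim_scores: dict, last_fired=None, semantic_floor=0.2,
           min_dims=3, threshold=0.45, now=None):
    now = now or time.time()
    # Gate 1: refractory period, no re-fire within 24 hours.
    if last_fired is not None and now - last_fired < REFRACTORY_SECONDS:
        return None
    # Gate 3: minimum semantic relevance.
    if dim_scores.get("semantic", 0.0) < semantic_floor:
        return None
    # Gate 4: at least 3 of 7 dimensions must fire independently.
    fired = [d for d, s in dim_scores.items() if s > 0]
    if len(fired) < min_dims:
        return None
    # Gate 5: weighted composite; semantic/emotional dominate.
    score = sum(WEIGHTS[d] * s for d, s in dim_scores.items())
    # Gate 6: final threshold (speaker state could shift this value).
    if score < threshold:
        return None
    tier = "vivid" if score >= 0.75 else "warm" if score >= 0.5 else "faint"
    return {"score": round(score, 2), "intensity": tier,
            "dimensions_fired": len(fired)}

result = ignite({"semantic": 0.8, "emotional": 0.7,
                 "sensory": 0.9, "temporal": 0.6})
```

A memory that recently fired, or a message with too few converging dimensions, falls out of the pipeline as `None` rather than surfacing at low quality.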

The Code

pip install ember-experiences

from ember import Ember

ember = Ember()

# Index a memory — rich with sensory and emotional detail
ember.index(
    "Summers in Levittown, running until the street lights come on. "
    "Fireflies, thunder, BBQ smoke in the air.",
    emotions=["nostalgic", "warm", "alive"],
    sensory={
        "visual": ["fireflies", "street lights"],
        "olfactory": ["bbq", "fireworks"],
        "auditory": ["thunder"],
    },
    location="Levittown, PA",
    season="summer",
    era="childhood",
)

# Check on every message — involuntary, no search intent
results = ember.check("Lightning bugs on the porch, someone grilling...")

# → IgnitionResult(intensity='warm', score=0.52, dimensions_fired=5)

Zero config. No API keys. No database server. SQLite + MiniLM out of the box. Version 0.4.0 — with LangChain and CrewAI adapters, 228 tests, 5 ignition presets, and community experience packs.

Why This Matters

We're building agents that interact with people over days, weeks, months. An agent that can only search its memory is fundamentally different from one where memory happens to it.

Consider the difference:

Search-based agent

“Based on our previous conversations, you mentioned enjoying summer evenings.”

Clinical. Retrieved. Feels like a lookup.

Ember-powered agent

The user mentions fireflies and the smell of a grill. The agent doesn't search for anything — but a childhood memory ignites at warm intensity. The response is subtly colored by nostalgia it didn't choose to feel.

The memory is simply present. That is the difference between a database and a life.

Spatial Awareness

Ember 0.3.1 added GPS-based spatial scoring. Index memories with coordinates and pass the user's current location at check time:

ember.index(
    "Dawn patrol at El Porto, cold wax and salt air...",
    location="El Porto, CA",
    latitude=33.895,
    longitude=-118.421,
)

results = ember.check(
    "The waves are great this morning",
    context={"location": {"lat": 33.896, "lon": -118.422}},
)
# Spatial dimension fires — user is physically near the memory

Being near a place is enough to activate the spatial dimension. The user doesn't need to mention the location by name.
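Under the hood, a spatial check like this reduces to a great-circle distance. A standard haversine sketch, with the firing radius as an assumption:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

# The El Porto memory vs. the user's current position from the example.
dist = haversine_km(33.895, -118.421, 33.896, -118.422)

# Within a hypothetical ~1 km radius, the spatial dimension fires.
spatial_fires = dist < 1.0
```

Here the two points are only about 150 meters apart, so proximity alone activates the dimension even though the message never names the place.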

This is how real memory works — you walk through your old neighborhood and memories surface without being summoned.

Community Layer

Ember has two community layers:

Technical (developers): Register custom dimensions and extractors. Build a culinary dimension that scores food references. Build a weather extractor that pulls real-time conditions. Publish them as pip packages.
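As an illustration of the extractor side, here is a hypothetical "culinary" dimension scorer. Only the shape of such a plug-in is suggested; the term list is invented and Ember's actual registration API is not shown:

```python
# Hypothetical custom-dimension extractor: scores how strongly
# a message evokes food. Everything here is illustrative.
CULINARY_TERMS = {"ramen", "bbq", "espresso", "sourdough", "grilling"}

def culinary_extractor(message: str) -> float:
    """Return a 0..1 signal strength for the culinary dimension."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    hits = words & CULINARY_TERMS
    return min(1.0, len(hits) / 2)

score = culinary_extractor("Late-night ramen and an espresso after")
```

A real extractor would likely lean on embeddings rather than a keyword set, but the contract is the same: take a message, return a per-dimension strength that feeds the convergence check.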

Experiential (everyone): Author and share experience packs — lived moments encoded as multi-dimensional data. “Summer in Levittown, 1978” isn't code. It's a sensory constellation that anyone can load into their Ember instance.

ember.load_experience("levittown")             # East Coast summer childhood
ember.load_experience("el-porto")              # California dawn patrol
ember.load_experience("tokyo-after-midnight")  # Neon, ramen, 2AM

When you load an experience pack, your agent gains the ability to recognize those sensory constellations. Not because it searched for them, but because someone who lived them contributed the palette.

Framework Integration (v0.4.0)

As of v0.4.0, Ember ships with native adapters for LangChain and CrewAI. Drop-in memory that fires involuntarily — no plumbing required.

LangChain

pip install ember-experiences[langchain]

from langchain.chains import ConversationChain
from ember.adapters.langchain import EmberMemory

memory = EmberMemory.from_ember(ember)
chain = ConversationChain(llm=llm, memory=memory)

# Ember checks every turn automatically.
# When memories ignite, they appear in the context window.
chain.predict(input="The smell of rain on hot pavement...")

CrewAI

pip install ember-experiences[crewai]

from crewai import Agent
from ember.adapters.crewai import EmberCrewMemory

crew_memory = EmberCrewMemory(ember)
agent = Agent(role="companion", memory=crew_memory, ...)

# CrewAI agents gain involuntary recall.
# Memories surface during task execution without explicit queries.

Both adapters follow the same principle: memories ignite involuntarily on every conversation turn, and the results are injected as context. The agent doesn't search — it remembers.

Full examples on GitHub.

What's Next

  • Ember Cloud — Hosted API with managed storage, embeddings, and analytics
  • Experience marketplace — Browse, collect, and attribute community experience packs
  • More integrations — Omi wearable capture, image generation from ignitions

Try It

pip install ember-experiences

Apache 2.0 — github.com/ember-experiences/ember-experiences

We'd love to see what you build — and what memories you encode.