# Building an AI Companion with Ember

*Give your agent involuntary memory in under 100 lines of code*
Most AI companions have amnesia. They respond to what's in the context window and nothing else. The moment a conversation scrolls past the token limit, it's gone.
Some add RAG — the user asks a question, the system searches a vector store. Better, but still just search. The companion has to be asked to remember.
What if your companion could remember things it wasn't asked about? What if a mention of rain triggered a memory of a specific afternoon — not because the user queried “rain,” but because rain + evening + a particular emotional tone converged?
That's what we're building. Let's go.
## Step 1: Install

```bash
pip install ember-experiences anthropic
```

Ember has zero required dependencies beyond numpy. The anthropic package is for our companion's LLM — swap it for any provider you prefer.
## Step 2: Seed Memories
A companion needs a past. Let's index a few memories with multi-dimensional metadata:
```python
from ember import Ember

ember = Ember()  # SQLite + MiniLM, zero config

# Memory 1: A specific place and time
ember.index(
    "Dawn patrol at El Porto. Cold wax on the board, salt air, "
    "the sound of waves before anyone else is awake.",
    emotions=["peaceful", "alive", "grateful"],
    sensory={
        "visual": ["sunrise", "glass water"],
        "olfactory": ["salt air", "wax"],
        "auditory": ["waves", "seabirds"],
        "tactile": ["cold wax", "morning chill"],
    },
    location="El Porto, CA",
    season="winter",
    era="present",
)

# Memory 2: A relationship moment
ember.index(
    "Teaching my son to ride a bike. His face when he realized "
    "I'd let go and he was doing it alone.",
    emotions=["proud", "tender", "bittersweet"],
    sensory={
        "visual": ["afternoon sun", "wobbling bike"],
        "auditory": ["laughter", "wheels on concrete"],
    },
    people=["Jake"],
    season="spring",
    era="present",
)

# Memory 3: A childhood feeling
ember.index(
    "Summers in Levittown. Running until the street lights came on. "
    "Fireflies in jars. Thunder rolling in from the west.",
    emotions=["nostalgic", "warm", "free"],
    sensory={
        "visual": ["fireflies", "street lights", "storm clouds"],
        "olfactory": ["bbq smoke", "cut grass"],
        "auditory": ["thunder", "screen door"],
    },
    location="Levittown, PA",
    season="summer",
    era="childhood",
)
```

Notice how each memory has more than just text. Emotions, sensory details, season, era, people — these are the dimensions that enable involuntary recall.
You can also seed from experience packs — community-authored sensory constellations:
```python
ember.load_experience("levittown")
ember.load_experience("el-porto")
# Now the companion can resonate with these lived moments
```

## Step 3: The Conversation Loop
Here's where Ember changes everything. On every user message, we run ember.check() — not a search, but an involuntary check. If enough dimensions converge, a memory ignites and gets injected into the LLM context:
```python
import anthropic

client = anthropic.Anthropic()
history = []

SYSTEM = """You are a thoughtful companion. When memories surface
(provided as context), let them color your response naturally.
Don't recite memories — let them inform your tone and associations."""

def chat(user_message: str) -> str:
    # 1. Check for involuntary memory ignition
    result = ember.check(user_message)

    # 2. Build the message list
    messages = list(history)
    messages.append({"role": "user", "content": user_message})

    # 3. If a memory ignited, inject it as context
    system = SYSTEM
    if result.fired:
        system += f"""
[An involuntary memory just surfaced at {result.intensity} intensity.
Memory: "{result.memory_text}"
Dimensions that fired: {result.dimensions_fired}/7
Let this color your response naturally — don't quote it directly.]"""

    # 4. Call the LLM
    response = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        system=system,
        messages=messages,
    )
    reply = response.content[0].text

    history.append({"role": "user", "content": user_message})
    history.append({"role": "assistant", "content": reply})
    return reply
```

The key line is the memory injection. When Ember fires, we tell the LLM: “An involuntary memory just surfaced. Let it color your response naturally.”
We don't tell the LLM to recite the memory. We don't ask it to explicitly reference the past. The memory is context, not content. The response should feel subtly shaped by recall, the way human conversation is.
## Step 4: Learning New Memories
A companion that only remembers its seed data is static. Let's make it learn from conversations:
```python
def learn_from_conversation(user_msg: str, assistant_msg: str):
    """Index significant moments from the conversation."""
    # Simple heuristic: index when the user shares something personal.
    # Markers are lowercase so they match the lowercased message.
    personal_markers = ["remember", "feeling", "today i", "just got",
                        "my son", "my dad", "when i was", "reminds me"]
    if any(m in user_msg.lower() for m in personal_markers):
        ember.index(
            user_msg,
            # Let Ember auto-extract emotions and sensory signals
            # from the text (no manual annotation needed)
        )

# In the chat loop, after each exchange:
learn_from_conversation(user_message, reply)
```

Now the companion builds its own memory over time. Each indexed moment becomes a potential future ignition — a memory that can fire involuntarily in a later conversation.
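Because the marker check is plain substring matching, you can exercise it on its own, without Ember installed. Here is a minimal, testable restatement of the heuristic (the `is_personal` helper name is hypothetical; note the markers are lowercased so they actually match the lowercased message):

```python
# Personal-disclosure markers, all lowercase to match the lowercased message
PERSONAL_MARKERS = ["remember", "feeling", "today i", "just got",
                    "my son", "my dad", "when i was", "reminds me"]

def is_personal(msg: str) -> bool:
    """True when the message contains any personal marker (case-insensitive)."""
    lowered = msg.lower()
    return any(marker in lowered for marker in PERSONAL_MARKERS)

print(is_personal("Today I finally fixed the fence my dad built"))  # True
print(is_personal("What's the weather like?"))                      # False
```

A keyword list is a deliberately crude trigger; you could just as well ask the LLM itself to flag disclosure-worthy messages, at the cost of an extra call per turn.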
## Step 5: Framework Integration
If you're using LangChain or CrewAI, v0.4.0 makes this even simpler with native adapters:
```python
# LangChain adapter (v0.4.0)
from langchain.chains import ConversationChain
from ember.adapters.langchain import EmberMemory

memory = EmberMemory.from_ember(ember)
chain = ConversationChain(llm=llm, memory=memory)

# Every chain.predict() call automatically checks for ignitions
response = chain.predict(input="The rain sounds beautiful tonight")

# CrewAI adapter (v0.4.0)
from crewai import Agent
from ember.adapters.crewai import EmberCrewMemory

agent = Agent(
    role="companion",
    memory=EmberCrewMemory(ember),
    goal="Be a thoughtful conversational partner",
)
```

The adapter handles the check-and-inject loop automatically. Every conversation turn checks for ignitions and injects them into the memory context.
## What It Looks Like in Practice
Here's a real interaction with an Ember-powered companion:
**User**

“Just got back from the beach. The sunset was incredible tonight.”

*Ember check: 4 dimensions fired (semantic + sensory + temporal + spatial)*
*Ignition: “Dawn patrol at El Porto...” at warm intensity (0.47)*

**Companion**

“There's something about that salt air after a long day. El Porto has those sunsets where the whole sky turns copper and you just stand there, wetsuit half-off, not wanting to leave the water. Was it that kind of evening?”
The companion wasn't asked about El Porto. It wasn't searching for beach memories. The convergence of sensory signals (salt air, visual sunset) + spatial context (beach) + temporal (evening) caused an involuntary recall.
The response is colored by the memory without being a recitation of it. That's the difference between search and recall.
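Ember's real scoring lives inside the library, but the convergence idea is easy to sketch: a memory fires only when several independent dimensions each clear a threshold, and intensity reflects how strongly they converged. Everything below (the threshold values, the minimum-dimension rule, the mean-based intensity) is illustrative, not Ember's actual algorithm:

```python
# Toy sketch of multi-dimensional convergence (NOT Ember's real scoring):
# a memory ignites only when enough dimensions clear a threshold at once.
THRESHOLD = 0.35    # per-dimension match threshold (illustrative value)
MIN_DIMENSIONS = 3  # dimensions that must converge before a memory fires

def check_ignition(scores: dict[str, float]) -> tuple[bool, float]:
    """Return (fired, intensity) given per-dimension match scores in [0, 1]."""
    converged = [s for s in scores.values() if s >= THRESHOLD]
    if len(converged) < MIN_DIMENSIONS:
        return False, 0.0
    # Intensity here is simply the mean score of the converged dimensions
    return True, sum(converged) / len(converged)

# "Beach at sunset" message scored against the El Porto memory:
# four dimensions clear the bar, the fifth (emotional) does not
scores = {"semantic": 0.56, "sensory": 0.47, "temporal": 0.40,
          "spatial": 0.45, "emotional": 0.20}
fired, intensity = check_ignition(scores)
print(fired, round(intensity, 2))  # True 0.47
```

The key property this toy model shares with the real thing: no single strong match is enough. A high semantic score alone stays silent; several moderate matches across different dimensions ignite.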
## Tuning with Presets
Different companion types need different ignition sensitivity. Ember ships with 5 presets:
| Preset | Behavior |
|---|---|
| default | Balanced thresholds, 3-dimension minimum, moderate firing rate |
| companion | Lower thresholds, warmer ignitions, optimized for personal conversations |
| creative-writing | Wider net, more vivid ignitions, favors sensory and emotional dimensions |
| therapy-journal | Gentler recall, lower arousal amplification, respects emotional boundaries |
| family-archive | Relational dimension boosted, era-aware, optimized for generational memories |
```python
ember.load_preset("companion")
# Lower thresholds, more frequent warm ignitions,
# optimized for ongoing personal conversations
```

## The Full Picture
In under 100 lines, you have a companion that:
- Remembers involuntarily — no explicit recall needed
- Scores across 7 dimensions — not just semantic similarity
- Has intensity tiers — faint whispers to vivid floods
- Learns from conversations — builds new memories over time
- Works with any LLM — Claude, GPT, Llama, local models
- Runs locally — SQLite + MiniLM, no API keys needed
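The intensity tiers ("faint whispers to vivid floods") can be pictured as simple banding over the ignition score. This is a sketch under assumed cutoffs, not Ember's documented tier boundaries; only "warm" appears in the example earlier in this article:

```python
# Illustrative banding of an ignition score into named intensity tiers.
# The cutoffs and the "faint"/"vivid" names are assumptions for this sketch.
TIERS = [(0.75, "vivid"), (0.40, "warm"), (0.0, "faint")]

def intensity_tier(score: float) -> str:
    """Map an ignition score in [0, 1] to a tier, checking strongest first."""
    for cutoff, name in TIERS:
        if score >= cutoff:
            return name
    return "faint"  # scores below every cutoff (e.g. negative input)

print(intensity_tier(0.47))  # warm  (the El Porto ignition from the example)
print(intensity_tier(0.82))  # vivid
```

Tiered intensity is what lets the system prompt modulate tone: a faint ignition might only tilt word choice, while a vivid one justifies the companion actually mentioning the memory.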
The difference between a chatbot and a companion is memory. Not search. Not retrieval. The kind of memory that happens to you.
Robert Praul — github.com/ember-experiences