Documentation
Getting Started
v0.4.0 — 228 tests — Apache 2.0
Installation
pip install ember-experiences
Zero infrastructure required. Ships with SQLite storage and local MiniLM-L6-v2 embeddings. No database server, no API keys.
Optional extras
# LangChain adapter
pip install ember-experiences[langchain]
# CrewAI adapter
pip install ember-experiences[crewai]
# PostgreSQL + pgvector backend
pip install ember-experiences[postgres]
# OpenAI embeddings (instead of local MiniLM)
pip install ember-experiences[openai]
Quickstart
from ember import Ember
# Zero-config: SQLite storage + local MiniLM embeddings
ember = Ember()
# Index a memory with multi-dimensional metadata
ember.index(
    "Summers in Philadelphia, running until the street lights come on. "
    "Chasing fireflies. Thunder and the first drops of rain on pavement.",
    emotions=["nostalgic", "warm", "alive"],
    sensory={
        "visual": ["fireflies", "street lights"],
        "olfactory": ["bbq smoke", "fireworks"],
        "auditory": ["thunder", "rain on pavement"],
    },
    location="Levittown, PA",
    season="summer",
    era="childhood",
)
# Check for ignition on every message — involuntary, no search intent
result = ember.check("lightning bugs on the porch, someone grilling...")
if result.fired:
    print(f"Ignition! Intensity: {result.intensity}")
    print(f"Dimensions fired: {result.dimensions_fired}/7")
    print(f"Score: {result.score:.2f}")
Core Concepts
Signal Constellation
Every message and every indexed memory is decomposed into a 7-dimensional fingerprint: semantic, emotional, sensory, temporal, spatial, relational, and musical. This fingerprint is compared across all stored memories to find convergence.
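As an illustration, a constellation can be pictured as a container with one slot per dimension. This is a hypothetical sketch: the seven names come from the list above, but the real internal representation is an implementation detail and may differ.

```python
# Hypothetical sketch of a constellation container — one slot per dimension.
# The seven names come from the text above; the actual internal
# representation is an implementation detail and may differ.
DIMENSIONS = (
    "semantic", "emotional", "sensory", "temporal",
    "spatial", "relational", "musical",
)

def empty_constellation():
    """A blank 7-dimensional fingerprint."""
    return {dim: None for dim in DIMENSIONS}

fingerprint = empty_constellation()
fingerprint["emotional"] = ["nostalgic", "warm"]
fingerprint["sensory"] = {"auditory": ["thunder", "rain on pavement"]}
```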
Ignition
When 3+ dimensions converge between the current message and a stored memory, the memory “ignites.” The composite score determines intensity: faint (0.28–0.40), warm (0.40–0.60), or vivid (0.60+). Each ignition has a 24-hour refractory period.
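The gate can be sketched in plain Python (a hypothetical helper, not the library's API): count the per-dimension scores that clear a threshold, require at least three, then map the composite score to a tier using the ranges above. The per-dimension threshold of 0.5 here is an assumption; the tier boundaries are the ones stated.

```python
# Hypothetical sketch of the ignition gate described above.
# Tier boundaries (0.28 / 0.40 / 0.60) come from the text;
# the 0.5 per-dimension threshold is an assumption.

def ignition_tier(dimension_scores, composite,
                  per_dim_threshold=0.5, min_dimensions=3):
    """Return a tier name if the memory ignites, else None."""
    fired = sum(1 for s in dimension_scores.values() if s >= per_dim_threshold)
    if fired < min_dimensions or composite < 0.28:
        return None
    if composite < 0.40:
        return "faint"
    if composite < 0.60:
        return "warm"
    return "vivid"

# Five of seven dimensions converge, composite score 0.52:
scores = {"semantic": 0.71, "spatial": 0.92, "sensory": 0.60,
          "emotional": 0.55, "temporal": 0.58, "musical": 0.10,
          "relational": 0.0}
tier = ignition_tier(scores, 0.52)  # "warm"
```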
Experience Packs
YAML files containing authored sensory constellations — lived moments encoded as data. Load them to expand your agent's palette of recognizable memories. Community-authored and shareable.
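Under the hood, loading a pack amounts to indexing each entry. A minimal sketch, assuming a pack parses to a dict with an embers list whose entries carry a text field plus metadata keys (index_pack is a hypothetical helper; Ember's own load_experience is the supported path):

```python
# Hypothetical helper: index an already-parsed experience pack.
# Assumes each entry has a "text" field plus metadata keyword fields;
# Ember's built-in load_experience() is the supported path for bundled packs.

def index_pack(pack, ember):
    """Index every ember in a parsed pack dict; return how many were stored."""
    count = 0
    for entry in pack.get("embers", []):
        entry = dict(entry)            # avoid mutating the caller's data
        text = entry.pop("text")
        ember.index(text, **entry)     # remaining keys become metadata
        count += 1
    return count
```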
Presets
Pre-tuned ignition configurations for common use cases: default, companion, creative-writing, therapy-journal, and family-archive. Each adjusts thresholds, dimension weights, and refractory periods.
Architecture
Every incoming message is decomposed into a signal constellation — a multi-dimensional fingerprint of what this moment feels like. This runs against every indexed memory simultaneously.
Message → Signal Constellation → Vector Search → Multi-Dimensional Scoring → Ignition
┌─────────────────────────────────────────────────┐
│               Signal Constellation              │
│                                                 │
│ Semantic ─────── 384d embedding (MiniLM)        │
│ Emotional ────── valence / arousal / labels     │
│ Sensory ──────── visual / auditory / olfactory  │
│ Relational ───── people / trust level           │
│ Temporal ─────── time / season / era            │
│ Spatial ──────── location / GPS / place type    │
│ Musical ──────── artist / genre / tempo         │
└─────────────────────────────────────────────────┘
                         │
                   Vector Search
                (cosine similarity)
                         │
                 Candidate Embers
                         │
            Multi-Dimensional Scoring
            (per-dimension thresholds)
                         │
                  Ignition Gates
               ┌─────────┼─────────┐
               │         │         │
             Faint      Warm     Vivid
API Reference
Ember(backend=None, embedding_provider=None)
Create an Ember instance. Defaults to SQLite backend and local MiniLM embeddings.
from ember import Ember
# Default: SQLite + MiniLM (zero config)
ember = Ember()
# PostgreSQL + pgvector
from ember.backends.postgres import PostgresBackend
ember = Ember(backend=PostgresBackend(connection_string="postgresql://..."))
# OpenAI embeddings
from ember.embeddings.openai import OpenAIEmbeddingProvider
ember = Ember(embedding_provider=OpenAIEmbeddingProvider(api_key="..."))
ember.index(text, **kwargs)
Index a memory with multi-dimensional metadata. Returns the stored ember ID.
ember.index(
    "Dawn patrol at El Porto, cold wax and salt air",
    emotions=["peaceful", "alive"],
    sensory={"olfactory": ["salt air", "wax"], "auditory": ["waves"]},
    location="El Porto, CA",
    latitude=33.895,
    longitude=-118.421,
    season="winter",
    era="present",
    people=["Jake"],
    music={"artist": "Tycho", "genre": "ambient"},
)
ember.check(text, context=None)
Check for involuntary ignition. Returns an IgnitionResult. Run this on every incoming message.
result = ember.check(
    "The waves were perfect this morning",
    context={"location": {"lat": 33.896, "lon": -118.422}},
)
if result.fired:
    print(result.intensity)         # "warm"
    print(result.score)             # 0.52
    print(result.dimensions_fired)  # 5
    print(result.memory_text)       # "Dawn patrol at El Porto..."
    print(result.dimension_scores)  # {"semantic": 0.71, "spatial": 0.92, ...}
ember.load_experience(name)
Load a community experience pack by name. Bundled packs: levittown, el-porto, tokyo-after-midnight.
ember.load_experience("levittown")
ember.load_experience("el-porto")
ember.load_experience("tokyo-after-midnight")
ember.load_preset(name)
Load ignition preset. Available: default, companion, creative-writing, therapy-journal, family-archive.
ember.load_preset("companion")  # lower thresholds, warmer ignitions
register_dimension(name) / register_extractor(name)
Register custom dimensions and extractors that participate in the convergence pipeline.
from ember import register_dimension, register_extractor
@register_dimension("culinary")
def score_culinary(memory_data, message_data):
    mem = set(memory_data.get("foods", []))
    msg = set(message_data.get("foods", []))
    if not mem or not msg:
        return 0.0
    return len(mem & msg) / len(mem | msg)

@register_extractor("culinary")
def extract_culinary(text):
    # detected_foods: food terms found in text (see Custom Dimensions
    # for a full extractor implementation)
    return {"foods": detected_foods}
Ignition Tiers
Recall has temperature. A faint echo feels different from a vivid flood. The composite score sets the tier: faint (0.28–0.40), warm (0.40–0.60), vivid (0.60+).
Experience Packs
Experience packs are authored collections of sensory constellations — lived moments encoded as data. Load them to expand your agent's palette.
# Load a community experience pack
ember.load_experience("summer-in-levittown-1978")
# Now any message with convergent sensory signals can ignite
# — fireflies, bbq smoke, summer rain, childhood evening —
# because someone who lived that moment contributed the palette.
# Load a tuned preset for your use case
ember.load_preset("creative-writing") # wider net, more vivid ignitions
ember.load_preset("therapy-journal")  # lower thresholds, gentler recall
Authoring a pack
# experience-pack.yaml
name: summer-in-levittown-1978
author: Robert Praul
description: East Coast suburban summer, childhood
embers:
  - text: "Running until the street lights come on"
    emotions: [free, nostalgic, alive]
    sensory:
      visual: [fireflies, street lights, storm clouds]
      olfactory: [bbq smoke, cut grass, fireworks]
      auditory: [thunder, screen door, ice cream truck]
    season: summer
    era: childhood
    location: Levittown, PA
Custom Dimensions
Extend Ember with domain-specific scoring dimensions. Custom dimensions participate in the same convergence pipeline as built-in ones.
from ember import register_dimension, register_extractor
@register_dimension("culinary", weight=0.08)
def score_culinary(memory_data, message_data):
    """Custom dimension: food and flavor matching (Jaccard overlap)."""
    mem_foods = set(memory_data.get("foods", []))
    msg_foods = set(message_data.get("foods", []))
    if not mem_foods or not msg_foods:
        return 0.0
    return len(mem_foods & msg_foods) / len(mem_foods | msg_foods)

@register_extractor("culinary")
def extract_culinary(text):
    """Extract food references from text."""
    food_vocab = {"ramen", "miso", "coffee", "cinnamon", ...}
    found = [w for w in text.lower().split() if w in food_vocab]
    return {"foods": found}

# e.g. memory foods {"ramen", "miso"} vs message foods {"ramen", "coffee"}:
# overlap 1 / union 3 → culinary score ≈ 0.33
Storage Backends
Ember abstracts storage to four methods. Swap backends without changing application code.
# SQLite (default — zero config)
ember = Ember()
# PostgreSQL + pgvector (production)
from ember.backends.postgres import PostgresBackend
ember = Ember(backend=PostgresBackend(connection_string="..."))
# In-memory (testing)
from ember.backends.memory import InMemoryBackend
ember = Ember(backend=InMemoryBackend())
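For illustration, here is a toy backend sketch against the four-method contract (vector_search, get_recent_ignitions, store_ember, record_ignition). The method names are Ember's; everything beyond the names — argument shapes, row fields — is an assumption, and the bundled InMemoryBackend remains the supported option for testing.

```python
# Toy backend sketch against the four-method interface. Only the method
# names come from Ember's docs; row fields and signatures are assumptions.
import time

class ToyBackend:
    def __init__(self):
        self.embers = []
        self.ignitions = []

    def store_ember(self, row):
        row = dict(row, id=len(self.embers))
        self.embers.append(row)
        return row

    def vector_search(self, embedding, threshold, limit):
        # Cosine similarity against each stored row's "embedding" field.
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = sum(x * x for x in a) ** 0.5
            nb = sum(x * x for x in b) ** 0.5
            return dot / (na * nb) if na and nb else 0.0
        scored = sorted(((cos(embedding, e["embedding"]), e)
                         for e in self.embers),
                        key=lambda h: h[0], reverse=True)
        return [e for s, e in scored if s >= threshold][:limit]

    def record_ignition(self, ignition):
        self.ignitions.append(dict(ignition, at=time.time()))

    def get_recent_ignitions(self, hours):
        cutoff = time.time() - hours * 3600
        return [i for i in self.ignitions if i["at"] >= cutoff]
```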
# The backend interface is 4 methods:
# vector_search(embedding, threshold, limit) → list[dict]
# get_recent_ignitions(hours) → list[dict]
# store_ember(row) → dict
# record_ignition(ignition) → None
Framework Integration
v0.4.0 ships native adapters for LangChain and CrewAI. Both handle the check-and-inject loop automatically.
LangChain
from ember.adapters.langchain import EmberMemory
from langchain.chains import ConversationChain
memory = EmberMemory.from_ember(ember)
chain = ConversationChain(llm=llm, memory=memory)
# Every call to chain.predict() automatically checks for ignitions
# and injects them into the conversation context
response = chain.predict(input="The rain sounds beautiful tonight")
CrewAI
from ember.adapters.crewai import EmberCrewMemory
from crewai import Agent
agent = Agent(
    role="companion",
    memory=EmberCrewMemory(ember),
    goal="Be a thoughtful conversational partner",
)
# Memories surface involuntarily during task execution
Any framework
# Works with any framework — just call check() on every message
result = ember.check(user_message)
if result.fired:
    # Inject the memory into your LLM's system prompt
    system_prompt += f"""
[A memory surfaced at {result.intensity} intensity:
"{result.memory_text}"
Let this color your response naturally.]
"""
Presets
Pre-tuned ignition configurations for common use cases.
| Preset | Behavior |
|---|---|
| default | Balanced thresholds, 3-dimension minimum, moderate firing rate |
| companion | Lower thresholds, warmer ignitions, optimized for personal conversations |
| creative-writing | Wider net, more vivid ignitions, favors sensory and emotional dimensions |
| therapy-journal | Gentler recall, lower arousal amplification, respects emotional boundaries |
| family-archive | Relational dimension boosted, era-aware, optimized for generational memories |
ember.load_preset("companion")  # optimized for personal conversations
Contributing
We welcome contributions — custom dimensions, experience packs, backend adapters, and framework integrations. See CONTRIBUTING.md on GitHub.
Submit experience packs directly through the submission portal.