The memory fabric for enterprise AI.
Memori adds persistent memory to your LLM applications without changing your architecture. It is model-, framework-, and datastore-agnostic.
Memori captures LLM interactions, enriches them, and makes them retrievable as high-quality context for future generations.
- Low integration overhead: wrap your existing LLM client and keep your current stack.
- Attribution-aware memory: organize memory by `entity`, `process`, and `session`.
- Asynchronous augmentation: extract structured memory without adding user-facing latency.
- Flexible infrastructure: supports multiple models, frameworks, and databases.
Install:

```shell
pip install memori
```

Optional one-time optimization:
```shell
python -m memori setup
```

```python
import os
import sqlite3

from memori import Memori
from openai import OpenAI


def get_sqlite_connection():
    return sqlite3.connect("memori.db")


client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
memori = Memori(conn=get_sqlite_connection).llm.register(client)

# Required so Memori can store contextual memory for this actor/workflow.
memori.attribution(entity_id="user_123", process_id="assistant_demo")

memori.config.storage.build()

client.chat.completions.create(
    model="gpt-4.1-mini",
    messages=[{"role": "user", "content": "My favorite color is blue."}],
)

# Wait for async augmentation in short-lived scripts.
memori.augmentation.wait()

response = client.chat.completions.create(
    model="gpt-4.1-mini",
    messages=[{"role": "user", "content": "What is my favorite color?"}],
)
print(response.choices[0].message.content)
```

Attribution tells Memori who the interaction belongs to and what workflow produced it.
```python
memori.attribution(entity_id="user_123", process_id="support_agent")
```

If attribution is not set, memory cannot be built correctly.
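In a multi-tenant service, attribution typically changes per request. A small sketch under that assumption (`attribution_for` is a hypothetical helper; only `memori.attribution` itself comes from the API above):

```python
def attribution_for(user_id: str, workflow: str) -> dict:
    """Hypothetical helper: build the kwargs for memori.attribution()
    from request context, so each captured interaction is scoped to the
    right actor and workflow."""
    return {"entity_id": user_id, "process_id": workflow}


# Assumed usage in a per-request handler (memori configured as in the
# quickstart):
# memori.attribution(**attribution_for("user_123", "support_agent"))
```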
Sessions group related interactions together.
```python
memori.new_session()
```

Reuse an existing session:

```python
session_id = memori.config.session_id
memori.set_session(session_id)
```

Build or migrate the storage schema during deployment or after upgrades:
```python
Memori(conn=db_session_factory).config.storage.build()
```

Supported LLM providers:

- OpenAI
- Anthropic
- Google Gemini
- xAI (Grok)
- Bedrock (via LangChain)
Supports sync, async, streamed, and unstreamed interaction modes.
Framework integrations:

- LangChain
- Agno
- Pydantic AI
- Nebius AI Studio
Database adapters:

- SQLAlchemy
- DB API 2.0 (PEP 249 drivers such as `psycopg`, `pymysql`, `sqlite3`, and others)
- Django ORM
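The `conn=` parameter in the examples above takes a zero-argument connection factory rather than a live connection. A minimal sketch for SQLite through the standard library's DB API 2.0 driver (`make_conn_factory` is a hypothetical helper; the commented Memori call assumes the quickstart setup):

```python
import sqlite3


def make_conn_factory(path: str):
    """Hypothetical helper: return a zero-argument factory that opens a
    fresh SQLite connection per call, matching the callable conn= style
    shown in the quickstart."""
    def factory():
        return sqlite3.connect(path)
    return factory


# Assumed usage, mirroring the quickstart:
# Memori(conn=make_conn_factory("memori.db")).config.storage.build()
```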
Supported databases:

- SQLite
- PostgreSQL
- MySQL
- MariaDB
- Oracle
- MongoDB
- Neon
- Supabase
- CockroachDB
Memori can enrich captured conversations into structured memory such as:
- attributes
- events
- facts
- people
- preferences
- relationships
- rules
- skills
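As a purely hypothetical illustration (invented for this example, not Memori's actual storage schema), an enriched record derived from the quickstart conversation might group content into categories like these:

```python
# Illustrative only: this shape is invented for the example and is not
# the library's real schema.
enriched = {
    "entity_id": "user_123",
    "process_id": "assistant_demo",
    "preferences": [{"topic": "favorite color", "value": "blue"}],
    "facts": [],
    "events": [],
}

# Later generations can then be grounded in records like this one.
preferred_color = enriched["preferences"][0]["value"]
print(preferred_color)
```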
Augmentation runs asynchronously and is available without an account (rate limited).
Get higher limits:
```shell
python -m memori sign-up <email_address>
```

Set your API key:

```shell
export MEMORI_API_KEY=<api_key>
```

Check usage quota:

```shell
python -m memori quota
```

The CLI entry point:

```shell
python -m memori
```

See full CLI docs in docs/cli.md.
Contributions are welcome. See CONTRIBUTING.md for setup, standards, and PR guidance.
Apache 2.0. See LICENSE.