
Memori Python SDK


Memori Labs

The memory fabric for enterprise AI

Memori plugs into the software and infrastructure you already use. It is LLM-, datastore-, and framework-agnostic, and integrates seamlessly into the architecture you've already designed.



Getting Started

Install Memori:

pip install memori

Quickstart Example

import os
import sqlite3

from memori import Memori
from openai import OpenAI


def get_sqlite_connection():
    return sqlite3.connect("memori.db")


client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

memori = Memori(conn=get_sqlite_connection).llm.register(client)
memori.attribution(entity_id="123456", process_id="test-ai-agent")
memori.config.storage.build()

response = client.chat.completions.create(
    model="gpt-4.1-mini",
    messages=[
        {"role": "user", "content": "My favorite color is blue."}
    ]
)
print(response.choices[0].message.content + "\n")

# Advanced Augmentation runs asynchronously to efficiently
# create memories. For this example, a short lived command
# line program, we need to wait for it to finish.

memori.augmentation.wait()

# Memori stored that your favorite color is blue in SQLite.
# Now reset everything so there's no prior context.

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

memori = Memori(conn=get_sqlite_connection).llm.register(client)
memori.attribution(entity_id="123456", process_id="test-ai-agent")

response = client.chat.completions.create(
    model="gpt-4.1-mini",
    messages=[
        {"role": "user", "content": "What's my favorite color?"}
    ]
)
print(response.choices[0].message.content + "\n")

Explore the Memories

sqlite3 memori.db "select * from memori_conversation_message"
sqlite3 memori.db "select * from memori_entity_fact"
sqlite3 memori.db "select * from memori_process_attribute"
sqlite3 memori.db "select * from memori_knowledge_graph"
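The same tables can be inspected from Python with the standard-library sqlite3 module. The helper below is our own convenience function, not part of the Memori API; the table names come from the queries above:

```python
import sqlite3

# Memory tables created by memori.config.storage.build() (SQLite backend).
TABLES = [
    "memori_conversation_message",
    "memori_entity_fact",
    "memori_process_attribute",
    "memori_knowledge_graph",
]


def dump_memories(db_path="memori.db"):
    """Print every row of each Memori table that exists in the database."""
    conn = sqlite3.connect(db_path)
    try:
        for table in TABLES:
            # Skip tables that have not been created yet.
            exists = conn.execute(
                "SELECT name FROM sqlite_master WHERE type='table' AND name=?",
                (table,),
            ).fetchone()
            if not exists:
                continue
            print(f"-- {table}")
            for row in conn.execute(f"SELECT * FROM {table}"):
                print(row)
    finally:
        conn.close()


if __name__ == "__main__":
    dump_memories()
```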

What's New In v3?

  • Significant performance improvements using Advanced Augmentation.
  • Threaded, zero-latency replacement for the v2 extraction agent.
  • LLM-agnostic, with support for all of the major foundation models.
  • Datastore-agnostic, with support for all major databases and document stores.
  • Adapter/driver architecture to make contributions easier.
  • Vectorized memories and in-memory semantic search for more accurate context.
  • Third-normal-form schema, including storage of semantic triples for a knowledge graph.
  • Reduced development overhead to a single line of code.
  • Automatic schema migrations.

Attribution

To get the most out of Memori, attribute your LLM interactions to an entity (think person, place, or thing; like a user) and a process (think your agent, LLM interaction, or program).

If you do not provide any attribution, Memori cannot make memories for you.

mem.attribution(entity_id="12345", process_id="my-ai-bot")

Session Management

Memori uses sessions to group your LLM interactions together. For example, if you have an agent that executes multiple steps, you want those steps recorded in a single session.

By default, Memori manages the session for you, but you can start a new session or override the current one as follows:

mem.new_session()

or

session_id = mem.config.session_id

# ...

mem.set_session(session_id)

Suggested Setup

To make sure everything is installed in the most efficient manner, we suggest you execute the following once:

python -m memori setup

This step is optional: it preps your environment for faster execution. If you skip it, the setup runs automatically the first time Memori executes, making that first run (and only that one) a little slower.

Configure Your Database

  1. Run this command once, via CI/CD, or anytime you update Memori.

    Memori(conn=db_session_factory).config.storage.build()
    
  2. Instantiate Memori with the connection factory.

    from openai import OpenAI
    from memori import Memori
    
    client = OpenAI(...)
    mem = Memori(conn=db_session_factory).llm.register(client)
    

Supported LLMs

  • Anthropic
  • Bedrock
  • Gemini
  • Grok (xAI)
  • OpenAI (Chat Completions & Responses API)

(streamed and unstreamed, synchronous and asynchronous)

Supported Frameworks

  • Agno
  • LangChain

Supported Platforms

  • Nebius AI Studio

Supported Database Integrations

  • DB API 2.0 - Direct support for any Python database driver that implements the PEP 249 Database API Specification v2.0. This includes drivers like psycopg, pymysql, MySQLdb, cx_Oracle, oracledb, and sqlite3.
  • Django - Native integration with Django's ORM and database layer
  • SQLAlchemy
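Because Memori takes a zero-argument connection factory rather than a live connection, any PEP 249 driver plugs in the same way. A minimal sketch using the stdlib sqlite3 driver; the psycopg variant is only sketched in a comment, and the connection string in it is illustrative:

```python
import sqlite3


def sqlite_factory():
    # Each call returns a fresh DB API 2.0 connection.
    return sqlite3.connect("memori.db")


# A PostgreSQL factory follows the same shape with psycopg, e.g.:
# def postgres_factory():
#     import psycopg
#     return psycopg.connect("postgresql://user:pass@host/memori")

# The factory itself is handed to Memori: Memori(conn=sqlite_factory)

# Sanity-check that the factory yields a working DB API connection.
conn = sqlite_factory()
print(conn.execute("select 1").fetchone()[0])
conn.close()
```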

Supported Datastores

  • CockroachDB - Full example with setup instructions
  • MariaDB
  • MongoDB - Full example with setup instructions
  • MySQL
  • Neon - Full example with setup instructions
  • OceanBase - Full example with setup instructions
  • Oracle
  • PostgreSQL - Full example with setup instructions
  • SQLite - Full example with setup instructions
  • Supabase

Examples

For more examples and demos, check out the Memori Cookbook.

Memori Advanced Augmentation

Memories are tracked at several different levels:

  • entity: think person, place, or thing; like a user
  • process: think your agent, LLM interaction or program
  • session: the current interactions between the entity, process and the LLM

Memori's Advanced Augmentation enhances memories at each of these levels with:

  • attributes
  • events
  • facts
  • people
  • preferences
  • relationships
  • rules
  • skills

Memori knows who your user is and what tasks your agent handles, and creates unparalleled context between the two. Augmentation occurs in the background, incurring no latency.

By default, Memori Advanced Augmentation is available without an account but rate limited. When you need increased limits, sign up for Memori Advanced Augmentation or execute the following:

python -m memori sign-up <email_address>

Memori Advanced Augmentation is always free for developers!

Once you've obtained an API key, simply set the following environment variable:

export MEMORI_API_KEY=[api_key]
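A small preflight check can confirm the key is visible to your process before relying on the higher limits. The helper name is ours, not part of the SDK; Memori reads the variable from the environment on its own:

```python
import os


def has_memori_key() -> bool:
    # Only checks that MEMORI_API_KEY is set and non-empty;
    # Memori itself picks the key up from the environment.
    return bool(os.environ.get("MEMORI_API_KEY"))


if not has_memori_key():
    print("MEMORI_API_KEY not set; using the rate-limited anonymous quota")
```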

Managing Your Quota

At any time, you can check your quota by executing the following:

python -m memori quota

Or by checking your account at https://memorilabs.ai/. If you have reached your IP address quota, sign up and get an API key for increased limits.

If your API key exceeds its quota limits, we will email you to let you know.

Command Line Interface (CLI)

To use the Memori CLI, execute the following from the command line:

python -m memori

This will display a menu of the available options. For more information about what you can do with the Memori CLI, please reference Command Line Interface.

Contributing

We welcome contributions from the community! Please see our Contributing Guidelines for details on:

  • Setting up your development environment
  • Code style and standards
  • Submitting pull requests
  • Reporting issues

Support


License

Apache 2.0 - see LICENSE


Star us on GitHub to support the project

