
Memori Python SDK


Memori Labs

The memory fabric for enterprise AI

Memori plugs into the software and infrastructure you already use. It is LLM, datastore and framework agnostic and seamlessly integrates into the architecture you've already designed.

Memori Cloud: zero config. Get an API key and start building in minutes.


Requires Python 3.10+.



Getting Started

Install Memori:

pip install memori

Quickstart

Sign up at app.memorilabs.ai, get a Memori API key, and start building. Full docs: memorilabs.ai/docs/memori-cloud/.

Set MEMORI_API_KEY and your LLM API key (e.g. OPENAI_API_KEY), then:

from memori import Memori
from openai import OpenAI

# Requires MEMORI_API_KEY and OPENAI_API_KEY in your environment
client = OpenAI()
mem = Memori().llm.register(client)

mem.attribution(entity_id="user_123", process_id="support_agent")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "My favorite color is blue."}]
)
# Conversations are persisted and recalled automatically.

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's my favorite color?"}]
)
# Memori recalls that your favorite color is blue.

Explore the Memories

Use the Dashboard — Memories, Analytics, Playground, and API Keys.

Want to use your own database? See the Memori BYODB docs: https://memorilabs.ai/docs/memori-byodb/.

Attribution

To get the most out of Memori, attribute your LLM interactions to an entity (a person, place, or thing, such as a user) and a process (your agent, LLM interaction, or program).

If you do not provide any attribution, Memori cannot make memories for you.

mem.attribution(entity_id="12345", process_id="my-ai-bot")
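In a multi-user application you will typically set attribution once per request, before making LLM calls. A minimal sketch of wiring that up (the `attribution_for` helper is our own, not part of the SDK):

```python
def attribution_for(user_id: str, agent_name: str) -> dict:
    """Build keyword arguments for mem.attribution().

    entity_id identifies who the memories belong to (e.g. a user);
    process_id identifies which agent or program produced them.
    """
    return {"entity_id": f"user_{user_id}", "process_id": agent_name}

# Usage:
# mem.attribution(**attribution_for("12345", "support_agent"))
```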

Session Management

Memori uses sessions to group your LLM interactions together. For example, if you have an agent that executes multiple steps, you want those steps recorded in a single session.

By default, Memori manages the session for you, but you can start a new session or override the current one:

mem.new_session()

or

session_id = mem.config.session_id

# ...

mem.set_session(session_id)
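If an agent needs to run a sub-task in its own session and then return to the original one, the calls above compose naturally. A sketch of a context manager built only on the documented `new_session()`, `set_session()`, and `config.session_id` (the wrapper itself is our own, not part of the SDK):

```python
from contextlib import contextmanager

@contextmanager
def scoped_session(mem):
    """Run a block of LLM interactions in a fresh session,
    then restore whatever session was active before."""
    previous = mem.config.session_id
    mem.new_session()
    try:
        yield mem.config.session_id
    finally:
        mem.set_session(previous)

# Usage:
# with scoped_session(mem):
#     ...  # interactions recorded under the new session
# # the previous session is active again here
```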

Supported LLMs

  • Anthropic
  • Bedrock
  • Gemini
  • Grok (xAI)
  • OpenAI (Chat Completions & Responses API)

(unstreamed, streamed, synchronous and asynchronous)

Supported Frameworks

  • Agno
  • LangChain

Supported Platforms

  • Nebius AI Studio

Examples

For more examples and demos, check out the Memori Cookbook.

Memori Advanced Augmentation

Memories are tracked at several different levels:

  • entity: think person, place, or thing; like a user
  • process: think your agent, LLM interaction or program
  • session: the current interactions between the entity, process and the LLM

Memori's Advanced Augmentation enhances memories at each of these levels with:

  • attributes
  • events
  • facts
  • people
  • preferences
  • relationships
  • rules
  • skills

Memori knows who your user is and what tasks your agent handles, and creates unparalleled context between the two. Augmentation occurs in the background, incurring no latency.

By default, Memori Advanced Augmentation is available without an account but is rate limited. When you need higher limits, sign up for Memori Advanced Augmentation or run:

python -m memori sign-up <email_address>

Memori Advanced Augmentation is always free for developers!

Once you've obtained an API key, simply set the following environment variable:

export MEMORI_API_KEY=[api_key]
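If you prefer to fail fast at startup rather than discover a missing key on the first API call, a small check helps. A sketch (the helper is our own; Memori itself reads the variable from the environment):

```python
import os

def require_memori_key(env=os.environ) -> str:
    """Return MEMORI_API_KEY, or raise a clear error if it is unset."""
    key = env.get("MEMORI_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "MEMORI_API_KEY is not set; export it before using Memori."
        )
    return key
```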

Managing Your Quota

At any time, you can check your quota by executing the following:

python -m memori quota

Or by checking your account at https://app.memorilabs.ai/. If you have reached your IP address quota, sign up and get an API key for increased limits.

If your API key exceeds its quota limits, we will email you to let you know.

Command Line Interface (CLI)

To use the Memori CLI, execute the following from the command line:

python -m memori

This will display a menu of the available options. For more information about what you can do with the Memori CLI, please reference Command Line Interface.

Contributing

We welcome contributions from the community! Please see our Contributing Guidelines for details on:

  • Setting up your development environment
  • Code style and standards
  • Submitting pull requests
  • Reporting issues

Support


License

Apache 2.0 - see LICENSE
