Ormah

Local-first, portable, LLM-agnostic memory system for AI agents

Ormah is the collective, self-maintaining memory layer all your agents can tap into.

The core idea is simple: memory should be involuntary. Your agents should not have to remember to remember. Ormah works in the background, learning your preferences, decisions, patterns, mistakes, and ongoing work, then whispering the right memory at the right time.

Your memory has always been yours. Ormah helps keep it that way.

Local. Private. Portable. Yours to keep. Yours to move.

(Screenshot: Ormah knowledge graph)

The name comes from the Malayalam word ഓർമ (ormah), meaning "memory" or "remember."

Memory Should Whisper

In real life, memory does not work like search. When something in front of you connects to something you already know, the memory surfaces on its own. You do not stop and decide to remember.

Ormah is built around that idea. Instead of waiting for an agent to ask for context, Ormah looks at what is happening and whispers the right memory before the agent processes the next prompt, so it starts with the context, preferences, constraints, and hints that matter.

That is what makes Ormah feel like memory instead of search. Search waits to be asked. Memory shows up when it matters.

Silence is better than noise. Ormah should whisper, not shout.

Install

bash <(curl -fsSL https://ormah.me/install.sh)

One command gets you a working local Ormah runtime with setup for supported clients.

Ormah is agent-agnostic by design. It can be wired into any agent that exposes the right hook or prompt-injection path, and it also exposes CLI, MCP, and HTTP surfaces.

ormah setup will:

  1. Start the Ormah server and install auto-start
  2. Preload the local models used for search and whisper retrieval
  3. Detect supported clients and wire them up automatically
  4. Offer agent-backed maintenance when Claude Code or Codex is available
  5. Offer transcript backfill to help bootstrap memory from earlier sessions

Today, setup can wire up:

  • Claude Code
  • Codex
  • Claude Desktop (MCP)
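Client detection can be sketched as probing for known marker paths on disk. The paths below are illustrative assumptions, not the locations `ormah setup` actually checks:

```python
from pathlib import Path

# Hypothetical marker paths for supported clients. These are
# illustrative assumptions, not the real detection logic.
CLIENT_MARKERS = {
    "claude-code": Path.home() / ".claude",
    "codex": Path.home() / ".codex",
}

def detect_clients(markers=CLIENT_MARKERS):
    """Return the names of clients whose marker path exists on disk."""
    return sorted(name for name, path in markers.items() if path.exists())
```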

Local search, embeddings, storage, the graph UI, and whisper retrieval do not require an API key. If you want Ormah's LLM-backed features to run independently of your agent, you can configure your own provider and API key.

Features

Recall and Whisper

Ormah supports both deliberate recall and involuntary recall.

When an agent knows it needs something, it can explicitly search memory. But memory should not always wait to be asked. Ormah is built to whisper the right memory at the right time, before the next prompt, so the agent starts with context instead of having to go looking for it.
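Deliberate recall over the local HTTP API might look like the sketch below. The `/agent/recall` path, the port, and the field names are assumptions for illustration based on the documented `/agent/*` surface, not a confirmed schema:

```python
import json
import urllib.request

def build_recall_request(query, limit=5, base_url="http://127.0.0.1:8000"):
    """Build a POST request for an explicit memory search.

    The endpoint path, default port, and JSON fields here are
    illustrative assumptions; consult the API Surface docs for
    the real schema.
    """
    body = json.dumps({"query": query, "limit": limit}).encode()
    return urllib.request.Request(
        f"{base_url}/agent/recall",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```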

Read more: Whisper - Involuntary Recall, Search and Ranking, Affinity and Feedback

Memory Capture

Memory is only useful if it keeps growing with you.

Ormah can capture memory from ongoing sessions, stored transcripts, and external markdown sources. whisper store turns conversations into durable memory, the session watcher ingests completed sessions automatically, and Hippocampus watches note directories so project docs, journals, and markdown knowledge can flow into the graph over time.
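The kind of work a note-directory watcher like Hippocampus has to do can be sketched as a change scan over markdown files. This is an illustration of the idea, not Ormah's actual implementation:

```python
from pathlib import Path

def find_changed_notes(notes_dir, seen_mtimes):
    """Return markdown files under notes_dir that are new or modified
    since the last scan, updating seen_mtimes in place.

    Illustrative sketch only: the real watcher's behavior and API
    are not documented here.
    """
    changed = []
    for path in sorted(Path(notes_dir).rglob("*.md")):
        mtime = path.stat().st_mtime
        if seen_mtimes.get(str(path)) != mtime:
            seen_mtimes[str(path)] = mtime
            changed.append(path)
    return changed
```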

Read more: Hippocampus and Session Watcher, Storage Layer

Self-Maintaining Memory

Memory should not become a junk drawer.

Ormah continuously maintains the graph in the background: linking related memories, detecting conflicts, tracking belief evolution, merging duplicates, consolidating overlap, scoring importance, and decaying stale context. Some of that work is automatic, and some of it can be delegated to an agent when judgment is required.
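As a toy illustration of decay (not Ormah's actual scoring), a memory's importance might fall off exponentially with time since it was last accessed:

```python
def decayed_importance(base, days_since_access, half_life_days=30.0):
    """Exponential decay of a memory's importance score.

    Illustrative only: Ormah's real scoring and decay logic live in
    its background jobs and are not documented here; the half-life
    is an assumed parameter.
    """
    return base * 0.5 ** (days_since_access / half_life_days)
```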

Read more: Background Jobs

Agent-Agnostic Surfaces

Ormah is not tied to a single agent.

It can integrate wherever there is a usable hook or interface. Ormah exposes multiple surfaces for that: hooks for whisper, MCP for tool-calling agents, a CLI for direct workflows, and an HTTP API for custom integrations. The memory layer stays the same even when the agent changes.

Read more: MCP and Adapters, API Surface, Setup and Installation

Agent-Assisted or Independent

Ormah can use the intelligence of the agents you already work with, like Codex or Claude Code, for judgment-heavy tasks such as maintenance. But it does not have to depend on them. If you want Ormah to run those features independently, you can configure your own provider and API key.

Read more: Configuration Reference, Setup and Installation

Graph UI

Memory should be inspectable.

Ormah includes a graph UI so you can see what it knows, how memories connect, what is becoming central, and where conflicts or belief changes are forming. That makes the system easier to trust, debug, and improve.

Read more: Web UI

Integrations

Ormah is agent-agnostic, but it already has first-class integrations for:

  • Claude Code — whisper hooks, MCP, transcript backfill, maintenance agent
  • Codex — whisper hooks, MCP, maintenance agent
  • Claude Desktop (macOS) — MCP

It can also be used through:

  • MCP for compatible clients
  • the CLI for terminal workflows
  • the HTTP API for local apps and custom integrations
  • OpenAI-compatible tool schemas for custom tool-calling stacks

Main integration surfaces:

  • Hooks — whisper before the next prompt
  • MCP — remember, recall, recall_node, mark_outdated, submit_feedback, run_maintenance
  • CLI — setup, server management, memory ops, ingestion, whisper hooks, evals
  • HTTP API — /agent/*, /admin/*, /ingest/*, /ui/*
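For custom tool-calling stacks, an OpenAI-compatible schema for the recall tool could be declared as below. The tool name matches the MCP surface above; the parameter shape is an assumption for illustration:

```python
# OpenAI-compatible tool schema for Ormah's recall tool. The "recall"
# name comes from the documented MCP tools; the parameters here are
# an assumed shape, not the published schema.
recall_tool = {
    "type": "function",
    "function": {
        "name": "recall",
        "description": "Search Ormah memory for relevant context.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "What to look for."},
                "limit": {"type": "integer", "description": "Max results."},
            },
            "required": ["query"],
        },
    },
}
```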

Read more: Setup and Installation, MCP and Adapters, API Surface, Configuration Reference

Development

git clone https://github.com/r-spade/ormah.git
cd ormah
make install
uv run pytest

License

MIT
