MemoryLayer.ai

API-first memory infrastructure for LLM-powered agents (open source core).

MemoryLayer provides cognitive memory capabilities for AI agents, including episodic, semantic, procedural, and working memory with vector-based retrieval and graph-based associations.
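As a conceptual illustration of what "graph-based associations" means here (a sketch, not MemoryLayer's internal data model — class and relation names below are made up for the example):

```python
from collections import defaultdict

# Memories as graph nodes connected by typed relationship edges.
class MemoryGraph:
    def __init__(self):
        self.edges = defaultdict(list)  # node -> [(relation, node)]

    def associate(self, src, relation, dst):
        self.edges[src].append((relation, dst))

    def related(self, src, relation=None):
        # Return neighbors, optionally filtered by relationship type.
        return [dst for rel, dst in self.edges[src]
                if relation is None or rel == relation]

g = MemoryGraph()
g.associate("meeting-2024-05-01", "mentions", "project-apollo")
g.associate("meeting-2024-05-01", "follows", "meeting-2024-04-24")
print(g.related("meeting-2024-05-01", "mentions"))  # ['project-apollo']
```

Typed edges like these are what let an agent traverse from one memory to related ones by relationship kind rather than by text similarity alone.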

Features

  • Cognitive Memory Architecture: Episodic, Semantic, Procedural, and Working memory types
  • Vector Search: SQLite with sqlite-vec for efficient similarity search
  • Graph Associations: 25+ relationship types for memory connections
  • MCP Integration: Model Context Protocol server for Claude and other LLMs
  • REST API: FastAPI-based HTTP server
  • Multiple Embedding Providers: OpenAI, Qwen3-VL, vLLM, sentence-transformers
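The vector-search feature ranks stored memories by embedding similarity. As a language-agnostic illustration of the idea (not the library's sqlite-vec-backed implementation), cosine-similarity retrieval looks like this:

```python
import math

def cosine_similarity(a, b):
    # cos(a, b) = dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query, memories, k=5):
    # Rank stored embeddings by similarity to the query embedding.
    scored = sorted(memories,
                    key=lambda m: cosine_similarity(query, m["embedding"]),
                    reverse=True)
    return scored[:k]

memories = [
    {"content": "likes Python", "embedding": [0.9, 0.1, 0.0]},
    {"content": "likes hiking", "embedding": [0.0, 0.2, 0.9]},
]
print(top_k([1.0, 0.0, 0.0], memories, k=1)[0]["content"])  # likes Python
```

In practice the embeddings come from one of the providers listed above, and sqlite-vec performs this ranking inside SQLite rather than in Python.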

Installation

# Basic installation
pip install memorylayer-server

# With OpenAI embeddings
pip install memorylayer-server[openai]

# With local embeddings (sentence-transformers)
pip install memorylayer-server[local]

# With multimodal support (Qwen3-VL)
pip install memorylayer-server[multimodal]

# All embedding providers
pip install memorylayer-server[embeddings]

Quick Start

HTTP Server

# Start the REST API server
memorylayer serve --port 8080

MCP Server

# Start MCP server for Claude integration
memorylayer mcp
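To register the server with an MCP client such as Claude Desktop, an entry along these lines is typical. This is a sketch following the common `mcpServers` configuration convention; the exact file location and schema depend on the client, and the `"memorylayer"` label is an arbitrary name:

```json
{
  "mcpServers": {
    "memorylayer": {
      "command": "memorylayer",
      "args": ["mcp"]
    }
  }
}
```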

API Usage

import asyncio

from memorylayer import MemoryLayerClient

async def main():
    client = MemoryLayerClient(base_url="http://localhost:8080")

    # Store a memory
    memory = await client.remember(
        content="User prefers Python for backend development",
        type="semantic",
        importance=0.8,
        tags=["preferences", "programming"],
    )

    # Recall memories
    results = await client.recall(
        query="What programming languages does the user like?",
        limit=5,
    )

asyncio.run(main())

License

Apache 2.0 License - see LICENSE for details.

Download files

Download the file for your platform.

Source Distribution

memorylayer_server-0.0.3.tar.gz (147.2 kB)

Built Distribution


memorylayer_server-0.0.3-py3-none-any.whl (220.6 kB)

File details

Details for the file memorylayer_server-0.0.3.tar.gz.

File metadata

  • Download URL: memorylayer_server-0.0.3.tar.gz
  • Upload date:
  • Size: 147.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for memorylayer_server-0.0.3.tar.gz
  • SHA256: 521ca8baa1aec5f1cccf0e7932b6fb0f733048254fd2ef1aef07a911cf266750
  • MD5: a3ed1d7a694b5f119d5d1c759d943922
  • BLAKE2b-256: 01faa0229ca2afe16773d6fb51909451c8e664c92fa6a9a6415d318c6a6cfeaa


Provenance

The following attestation bundles were made for memorylayer_server-0.0.3.tar.gz:

Publisher: release.yml on scitrera/memorylayer

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file memorylayer_server-0.0.3-py3-none-any.whl.

File hashes

Hashes for memorylayer_server-0.0.3-py3-none-any.whl
  • SHA256: 49fcdc31bf676d907514617fe6359aed41faf04ca9a8d42516e4bf1ac2afe8be
  • MD5: 80407ca42ba1dd5eb1e6a00747c6ecb5
  • BLAKE2b-256: 4ce5044e06d34a5f06e47ab3d53305e43371e8c8325915e2217c69f9a33553ac


Provenance

The following attestation bundles were made for memorylayer_server-0.0.3-py3-none-any.whl:

Publisher: release.yml on scitrera/memorylayer

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
