
Robot Memory - Persistent memory system for robotic AI agents


Chinese version

robotmem — Let Robots Learn from Experience

Your robot ran 1000 experiments, starting from scratch every time. robotmem stores episode experiences — parameters, trajectories, outcomes — and retrieves the most relevant ones to guide future decisions.

FetchPush experiment: +25 percentage point improvement in success rate (42% → 67%), CPU-only, reproducible in 5 minutes.

robotmem 30s demo: save → restart → recall

Quick Start

```shell
pip install robotmem
```

```python
from robotmem import learn, recall, save_perception, start_session, end_session

# Start an episode
session = start_session(context='{"robot_id": "arm-01", "task": "push"}')

# Record experience
learn(
    insight="grip_force=12.5N yields highest grasp success rate",
    context='{"params": {"grip_force": {"value": 12.5, "unit": "N"}}, "task": {"success": true}}'
)

# Retrieve experiences (structured filtering + spatial nearest-neighbor)
memories = recall(
    query="grip force parameters",
    context_filter='{"task.success": true}',
    spatial_sort='{"field": "spatial.position", "target": [1.3, 0.7, 0.42]}'
)

# Store perception data
save_perception(
    description="Grasp trajectory: 30 steps, success",
    perception_type="procedural",
    data='{"sampled_actions": [[0.1, -0.3, 0.05, 0.8], ...]}'
)

# End episode (auto-consolidation + proactive recall)
end_session(session_id=session["session_id"])
```
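The `context` arguments above are JSON strings written by hand, which is easy to get wrong. A defensive pattern (a style suggestion, not a robotmem requirement) is to build a plain dict and serialize it once with `json.dumps`:

```python
import json

# Build the context as a plain dict, then serialize it once.
# This avoids escaping mistakes in hand-written JSON string literals.
context = {
    "params": {"grip_force": {"value": 12.5, "unit": "N"}},
    "task": {"success": True},
}
context_json = json.dumps(context)
# context_json can now be passed wherever learn()/recall() expect a JSON string.
```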

7 APIs

| API | Purpose |
| --- | --- |
| learn | Record physical experiences (parameters / strategies / lessons) |
| recall | Retrieve experiences — BM25 + vector hybrid search with context_filter and spatial_sort |
| save_perception | Store perception / trajectory / force data (visual / tactile / proprioceptive / auditory / procedural) |
| forget | Delete incorrect memories |
| update | Correct memory content |
| start_session | Begin an episode |
| end_session | End an episode (auto-consolidation + proactive recall) |

Key Features

Structured Experience Retrieval

Not just vector search — robotmem understands the structure of robot experiences:

```python
# Retrieve only successful experiences
recall(query="push to target", context_filter='{"task.success": true}')

# Find spatially nearest scenarios
recall(query="grasp object", spatial_sort='{"field": "spatial.object_position", "target": [1.3, 0.7, 0.42]}')

# Combine: success + distance < 0.05m
recall(
    query="push",
    context_filter='{"task.success": true, "params.final_distance.value": {"$lt": 0.05}}'
)
```
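The filter semantics can be pictured with a small sketch: dotted paths address nested context fields, and `{"$lt": x}` is a comparison operator. This illustrates the matching behavior implied by the examples above; it is not robotmem's actual implementation:

```python
def get_path(doc, dotted):
    # Resolve a "task.success"-style dotted path in a nested dict.
    cur = doc
    for key in dotted.split("."):
        if not isinstance(cur, dict) or key not in cur:
            return None
        cur = cur[key]
    return cur

def matches(doc, flt):
    # Each filter entry is either an exact value or a {"$lt": x} operator.
    for path, cond in flt.items():
        val = get_path(doc, path)
        if isinstance(cond, dict) and "$lt" in cond:
            if val is None or not val < cond["$lt"]:
                return False
        elif val != cond:
            return False
    return True
```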

Context JSON — 4 Sections

```json
{
    "params":  {"grip_force": {"value": 12.5, "unit": "N", "type": "scalar"}},
    "spatial": {"object_position": [1.3, 0.7, 0.42], "target_position": [1.25, 0.6, 0.42]},
    "robot":   {"id": "fetch-001", "type": "Fetch", "dof": 7},
    "task":    {"name": "push_to_target", "success": true, "steps": 38}
}
```

Each recalled memory automatically extracts params / spatial / robot / task as top-level fields.
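Reading a parameter back from a recalled memory is then plain dict access. The memory dict below is a hypothetical example shaped after the schema above, not real recall output:

```python
def get_param(memory, name):
    # Pull (value, unit) for a named parameter from a recalled memory's
    # top-level "params" section; returns (None, None) if absent.
    entry = memory.get("params", {}).get(name)
    if entry is None:
        return None, None
    return entry.get("value"), entry.get("unit")

# Hypothetical recalled memory, shaped after the context schema above.
memory = {
    "insight": "grip_force=12.5N yields highest grasp success rate",
    "params": {"grip_force": {"value": 12.5, "unit": "N"}},
    "task": {"name": "push_to_target", "success": True},
}
value, unit = get_param(memory, "grip_force")  # 12.5, "N"
```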

Memory Consolidation + Proactive Recall

end_session automatically triggers:

  • Consolidation: Merges similar memories with Jaccard similarity > 0.50 (protects constraint / postmortem / high-confidence entries)
  • Proactive Recall: Returns historically relevant memories for the next episode
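The consolidation rule can be sketched as token-set Jaccard similarity plus a protection guard. The memory shape, category names, and the 0.9 high-confidence cutoff below are assumptions based on the description, not robotmem's actual code:

```python
def jaccard(a: str, b: str) -> float:
    # Jaccard similarity between the token sets of two insight strings.
    ta, tb = set(a.lower().split()), set(b.lower().split())
    if not ta and not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

def is_protected(mem) -> bool:
    # Constraint/postmortem and high-confidence memories are never merged away.
    return (mem.get("category") in {"constraint", "postmortem"}
            or mem.get("confidence", 0.0) > 0.9)

def should_merge(mem_a, mem_b, threshold=0.50) -> bool:
    if is_protected(mem_a) or is_protected(mem_b):
        return False
    return jaccard(mem_a["insight"], mem_b["insight"]) > threshold
```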

FetchPush Demo

```shell
cd examples/fetch_push
pip install gymnasium-robotics
PYTHONPATH=../../src python demo.py  # 90 episodes, ~2 min
```

Three-phase experiment: baseline → memory writing → memory utilization. The Phase C success rate is expected to be 10-20% higher than Phase A's.

Architecture

```text
SQLite + FTS5 + vec0
├── BM25 full-text search (jieba CJK tokenizer)
├── Vector search (FastEmbed ONNX, CPU-only)
├── RRF fusion ranking
├── Structured filtering (context_filter)
└── Spatial nearest-neighbor sorting (spatial_sort)
```

  • CPU-only, no GPU required
  • Single-file database: ~/.robotmem/memory.db
  • MCP Server (7 tools) or direct Python import
  • Web management UI: robotmem web
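Reciprocal Rank Fusion (RRF) merges the BM25 and vector rankings by summing reciprocal ranks per document. A minimal sketch follows; `k=60` is the conventional RRF constant, not necessarily what robotmem uses:

```python
def rrf_fuse(rankings, k=60):
    # score(d) = sum over rankings of 1 / (k + rank of d in that ranking).
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)
```

A document ranked well by both searches beats one ranked first by only one of them, which is why RRF is a common choice for hybrid lexical+vector retrieval.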

Comparison

| Feature | MemoryVLA (Academic) | Mem0 (Product) | robotmem |
| --- | --- | --- | --- |
| Target users | Specific VLA models | Text AI | Robotic AI |
| Memory format | Vectors (opaque) | Text | Natural language + perception + parameters |
| Structured filtering | No | No | Yes (context_filter) |
| Spatial retrieval | No | No | Yes (spatial_sort) |
| Physical parameters | No | No | Yes (params section) |
| Installation | Compile from paper code | pip install | pip install |
| Database | Embedded | Cloud | Local SQLite |

License

Apache-2.0

