
memento-collector

Cross-platform agent that automatically collects AI coding tool memory files and syncs them to a central Memento server (a shared brain for your AI coding tools).

Supported AI Tools

| Tool | Data Collected |
| --- | --- |
| Claude Code | Conversations, memory, plans, history |
| OpenClaw | Sessions, identity, memory, learnings, skills |
| Codex | Sessions (active + archived), history, skills |
| Antigravity | Full conversations (via aghistory), brain plans, code snapshots |
| Obsidian | All markdown notes in your vault |
| Cursor | Agent transcripts, skills, MCP config |

Install

pip install memento-brain-collector

For Antigravity full conversation export:

pip install memento-brain-collector[antigravity]

Quick Start

# Interactive setup (first time)
memento-collector setup

# Or run directly
memento-collector run

The setup wizard will:

  1. Detect your platform (macOS / Linux / Windows)
  2. Auto-discover installed AI tools and Obsidian vaults
  3. Configure the server URL and auth token
  4. Optionally install as a system service (auto-start on boot)
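The auto-discovery step can be pictured as a check for well-known per-tool data directories. A minimal sketch, assuming hypothetical directory names that may not match the wizard's actual list:

```python
from pathlib import Path

# Illustrative per-tool data directories; these names are assumptions,
# not the wizard's actual discovery list.
CANDIDATE_DIRS = {
    "claude-code": Path.home() / ".claude",
    "codex": Path.home() / ".codex",
    "cursor": Path.home() / ".cursor",
}

def discover_tools() -> list[str]:
    """Return the names of tools whose data directory exists on this machine."""
    return [name for name, path in CANDIDATE_DIRS.items() if path.is_dir()]
```

The real wizard also detects Obsidian vaults and the host platform; this sketch covers only the directory-existence check.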

Commands

memento-collector setup      # Interactive setup wizard
memento-collector run        # Run in foreground
memento-collector install    # Install as system service
memento-collector start      # Start the service
memento-collector stop       # Stop the service
memento-collector status     # Show collector status
memento-collector uninstall  # Remove system service

How It Works

  1. File Watching — Uses watchdog (FSEvents on macOS, inotify on Linux, ReadDirectoryChanges on Windows) to detect file changes in real time
  2. Parsing — Supports Markdown, JSONL, JSON, TOML, SQLite formats
  3. Sanitization — Automatically redacts API keys, tokens, passwords, private keys before upload
  4. Queuing — Local SQLite queue for offline resilience (syncs when server is reachable)
  5. Syncing — HTTP upload to server, with chunked upload for files > 2MB (tested with 37MB files)
  6. Device Identity — Each device gets a persistent unique ID, all data tagged with device info
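The sanitization step (3) can be sketched as a pass of regex redactions over each file before it leaves the machine. The patterns below are illustrative assumptions about common secret shapes, not the collector's actual rules:

```python
import re

# Illustrative patterns for common secret shapes; the collector's real
# redaction rules are not documented here.
SECRET_PATTERNS = [
    (re.compile(r"sk-[A-Za-z0-9]{20,}"), "[REDACTED_API_KEY]"),
    (re.compile(r"(?i)(token|password|secret)\s*[:=]\s*\S+"), r"\1=[REDACTED]"),
    (re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----[\s\S]+?-----END [A-Z ]*PRIVATE KEY-----"),
     "[REDACTED_PRIVATE_KEY]"),
]

def sanitize(text: str) -> str:
    """Redact likely secrets from text before upload."""
    for pattern, replacement in SECRET_PATTERNS:
        text = pattern.sub(replacement, text)
    return text
```

Running the redactions locally, before queuing, means secrets never reach the sync queue or the server even when the device is offline.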

Configuration

Environment variables (or set via memento-collector setup):

| Variable | Default | Description |
| --- | --- | --- |
| MEMENTO_SERVER_URL | http://localhost:8001 | Server API URL |
| MEMENTO_SERVER_TOKEN | (none) | Collector auth token |
| MEMENTO_OBSIDIAN_VAULT_PATH | Auto-detected | Obsidian vault path |

Config file: ~/.memento/config.json
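A config file mirroring the variables above might look like the following. The field names and layout are assumptions for illustration, not the actual schema:

```json
{
  "server_url": "http://localhost:8001",
  "server_token": "your-collector-token",
  "obsidian_vault_path": "/path/to/your/vault"
}
```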

System Service

| Platform | Service Type | Config Location |
| --- | --- | --- |
| macOS | LaunchAgent | ~/Library/LaunchAgents/com.memento.collector.plist |
| Linux | systemd user | ~/.config/systemd/user/memento-collector.service |
| Windows | Task Scheduler | MementoCollector scheduled task |
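On Linux, the generated unit is presumably a standard systemd user service. A hypothetical sketch, in which the ExecStart path and every directive are assumptions rather than the file memento-collector actually writes:

```ini
[Unit]
Description=Memento Collector

[Service]
ExecStart=%h/.local/bin/memento-collector run
Restart=on-failure

[Install]
WantedBy=default.target
```

A user unit like this starts at login rather than at boot, which is the usual trade-off of `systemd --user` services versus system-wide units.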

License

MIT

Download files

Download the file for your platform.

Source Distribution

memento_brain_collector-0.0.1.tar.gz (58.3 kB)


Built Distribution


memento_brain_collector-0.0.1-py3-none-any.whl (69.3 kB)


File details

Details for the file memento_brain_collector-0.0.1.tar.gz.

File metadata

  • Download URL: memento_brain_collector-0.0.1.tar.gz
  • Size: 58.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for memento_brain_collector-0.0.1.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | db1f0555fb20c5a21893d689295dec9b97b04b1c1434b170ed826e93e083ab7e |
| MD5 | 084539fa1ab53f5a4911179ff88554d1 |
| BLAKE2b-256 | 1c445c43b851721fac30b909cd0a4d03ba26cef8630d90c78597d82f611e92fc |


File details

Details for the file memento_brain_collector-0.0.1-py3-none-any.whl.

File metadata

File hashes

Hashes for memento_brain_collector-0.0.1-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 9e6b399cd5e00ca669bfb098043e2a82488780552ef1863e03ad7c35f75528c2 |
| MD5 | 1ed99641d8c68dc230287fd1d0fc546e |
| BLAKE2b-256 | e852ecc8ead4b749dbe229f897ef724dbaf421ef03b4957d0202abbe1d9f1e18 |

