# nmem-cli

A lightweight CLI and TUI for Nowledge Mem, an AI memory manager that works with any AI agent.
## Installation Options

### Option 1: Standalone PyPI Package (Recommended for CLI-only users)

```bash
pip install nmem-cli
```

Or with uv:

```bash
uv pip install nmem-cli
```
### Option 2: Nowledge Mem Desktop App
If you're using the Nowledge Mem desktop app, the nmem CLI is bundled and can be installed via:
- macOS: Settings → Preferences → Developer Tools → Install CLI
- Linux: Automatically installed via package postinstall
- Windows: Automatically added to PATH during installation
The desktop app includes a bundled Python environment, so no separate Python installation is required.
## Requirements

- Python 3.11+ (for PyPI package)
- A running Nowledge Mem server (default: `http://127.0.0.1:14242`)
## Quick Start

```bash
# Check server status
nmem status

# Launch interactive TUI
nmem tui

# List memories
nmem m

# Search memories
nmem m search "python programming"

# List threads
nmem t
```
## Commands

### Core Commands

| Command | Description |
|---|---|
| `nmem status` | Check server connection status |
| `nmem stats` | Show database statistics |
| `nmem tui` | Launch interactive terminal UI |
### Memory Commands

| Command | Description |
|---|---|
| `nmem m` / `nmem memories` | List recent memories |
| `nmem m search "query"` | Search memories (includes source thread info) |
| `nmem m show <id>` | Show memory details |
| `nmem m add "content"` | Add a new memory |
| `nmem m add --stdin` | Add a memory from piped content |
| `nmem m update <id>` | Update a memory |
| `nmem m delete <id> [id ...]` | Delete one or more memories (bulk uses MCP) |
| `nmem m delete <id> --dry-run` | Preview what would be deleted |
### Thread Commands

| Command | Description |
|---|---|
| `nmem t` / `nmem threads` | List recent threads |
| `nmem t list --source openclaw -n 20` | List recent threads for one source |
| `nmem t search "query"` | Search threads |
| `nmem t show <id>` | Show thread with messages |
| `nmem t create -t "Title" -c "content"` | Create a thread |
| `nmem t append <id> -m '[{"role":"user","content":"..."}]'` | Append messages to a thread |
| `nmem t save --from claude-code` | Save Claude Code session as thread |
| `nmem t save --from codex` | Save Codex session as thread |
| `nmem t save --from gemini-cli` | Save Gemini CLI session as thread |
| `nmem t delete <id>` | Delete a thread |
| `nmem t delete <id> --dry-run` | Preview what would be deleted |
## Options

### Global Options

```bash
nmem --json <command>    # Output in JSON format (for scripting)
nmem --api-url <url>     # Override API URL
nmem --version           # Show version
```
### Search Filters (memories)

```bash
nmem m search "query" -l label1 -l label2   # Filter by labels
nmem m search "query" -t week               # Time range: today/week/month/year
nmem m search "query" --importance 0.7      # Minimum importance
```
## Persistent Remote Configuration

For long-term remote use, prefer nmem's config file instead of exporting auth on every shell session. The recommended CLI flow is:

```bash
nmem config client set url https://mem.example.com
nmem config client set api-key nmem_your_key
nmem config client show
```
This writes the same local client config file that integrations like Hermes, Cursor hooks, and OpenClaw read on this machine.
Create the file manually if needed at `~/.nowledge-mem/config.json`:

```json
{
  "apiUrl": "https://mem.example.com",
  "apiKey": "nmem_your_key"
}
```
Resolution priority:

1. `--api-url` flag
2. `NMEM_API_URL` / `NMEM_API_KEY` environment variables
3. `~/.nowledge-mem/config.json`
4. Built-in defaults
Use environment variables when you want a temporary override for the current shell, CI job, or integration runtime.
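The precedence can be sketched in shell. This is only an illustration of the lookup order described above, not nmem's actual implementation; `resolve_url` is a hypothetical helper:

```bash
# Hypothetical sketch of the URL resolution order: flag > env > config file > default.
resolve_url() {
    flag="$1"   # value of --api-url, possibly empty
    if [ -n "$flag" ]; then echo "$flag"; return; fi
    if [ -n "$NMEM_API_URL" ]; then echo "$NMEM_API_URL"; return; fi
    cfg="$HOME/.nowledge-mem/config.json"
    if [ -f "$cfg" ]; then
        # crude extraction for illustration; the real CLI parses JSON properly
        url=$(sed -n 's/.*"apiUrl"[^"]*"\([^"]*\)".*/\1/p' "$cfg")
        if [ -n "$url" ]; then echo "$url"; return; fi
    fi
    echo "http://127.0.0.1:14242"   # built-in default
}
```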
`nmem config client ...` controls how this machine connects outward to Mem. It is separate from `nmem config access ...`, which controls how a Mem server is exposed to other devices on your network or through Access Anywhere.
## MCP Host Configuration

Direct HTTP MCP clients do not read `~/.nowledge-mem/config.json` by themselves. The host owns its MCP transport, so remote Mem needs headers in that host's MCP settings.

Generate the right snippet from the client config you already saved:

```bash
nmem config mcp show --host codex
nmem config mcp show --host gemini-cli
nmem config mcp show --host cursor
nmem config mcp show --host claude-desktop
```
For a fixed Mem space, add --space "Research Agent". The generated snippet includes your API key when one is configured, so paste it only into the target host's private MCP config.
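For orientation only, a Cursor-style HTTP entry typically looks something like the fragment below. The server name, endpoint path, and header layout here are assumptions; the exact shape varies per host, so prefer the snippet generated by `nmem config mcp show`:

```json
{
  "mcpServers": {
    "nowledge-mem": {
      "url": "https://mem.example.com/mcp",
      "headers": {
        "Authorization": "Bearer nmem_your_key"
      }
    }
  }
}
```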
## Environment Variables

| Variable | Description | Default |
|---|---|---|
| `NMEM_API_URL` | API server URL | `http://127.0.0.1:14242` |
| `NMEM_API_KEY` | Optional API key (Bearer auth + proxy-safe fallback) | (unset) |
Remote tunnel examples:

```bash
# Quick Tunnel (random URL)
export NMEM_API_URL="https://<random>.trycloudflare.com"
export NMEM_API_KEY="nmem_..."

# Cloudflare account tunnel (stable URL)
export NMEM_API_URL="https://mem.example.com"
export NMEM_API_KEY="nmem_..."
```
For account mode, the URL is the tunnel's Public Hostname value from the Cloudflare dashboard (use the domain root only, without `/remote-api`).
## TUI Features
The interactive TUI provides:
- Dashboard: Overview with statistics and recent activity
- Memories: Browse, search, and manage memories
- Threads: View conversation threads
- Graph: Explore the knowledge graph
- Settings: Configure the application
### TUI Keybindings

| Key | Action |
|---|---|
| `1`-`5` | Switch tabs |
| `/` | Focus search |
| `?` | Show help |
| `q` | Quit |
## Agent and Pipeline Usage

nmem is designed to work well with AI agents and shell pipelines. Every input can be passed as a flag (no interactive prompts block automation), and `--json` mode gives structured output for programmatic consumption.
### Piping Content

```bash
# Pipe content into a new memory
echo "The auth service uses RS256 JWTs" | nmem m add --stdin -t "Auth notes" -l backend

# Pipe content into Working Memory
cat daily_focus.md | nmem wm patch --heading "## Focus Areas" --stdin
```
### Previewing Destructive Actions

```bash
# See what would be deleted (with titles) without making changes
nmem m delete mem-abc123 mem-def456 --dry-run
nmem t delete thread-xyz --cascade --dry-run
nmem s delete src-123 --dry-run
```
### Non-Interactive Provider Setup

```bash
# All config commands work without interactive menus
nmem config provider set anthropic --api-key sk-ant-...
nmem config provider set openai --api-key sk-... --model gpt-4o
nmem config provider activate anthropic
```
### Idempotent Appends

```bash
# Retry-safe thread updates with idempotency keys
nmem t append thread-abc \
  -m '[{"role":"assistant","content":"Finding"}]' \
  --idempotency-key batch-001
```
## Examples

### Script Integration (JSON mode)

```bash
# Get memories as JSON
nmem --json m search "meeting notes" | jq '.memories[].title'

# Get source thread for a memory (to fetch full conversation context)
nmem --json m search "auth" | jq '.memories[] | {title, thread: .source_thread.id}'

# Check if server is running
if nmem --json status | jq -e '.status == "ok"' > /dev/null; then
  echo "Server is running"
fi
```
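To prototype jq filters without a running server, you can run them against a stub payload first. The field names in this stub are assumptions inferred from the filters above; substitute real `nmem --json m search` output once a server is reachable:

```bash
# Stub payload for experimenting with jq filters offline (shape assumed).
payload='{"memories":[{"title":"Auth Notes","source_thread":{"id":"thread-1"}},{"title":"Deploy Notes","source_thread":{"id":"thread-2"}}]}'
echo "$payload" | jq -r '.memories[].title'
```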
### Adding Memories

```bash
# Simple memory
nmem m add "Remember to review the PR tomorrow"

# With title and importance
nmem m add "The deployment process requires SSH access" \
  -t "Deployment Notes" \
  -i 0.8

# With labels (repeatable -l flag)
nmem m add "API uses JWT tokens for auth" \
  -t "Auth Notes" \
  -l work -l backend

# With custom source (for skills/integrations)
nmem m add "User preference: dark mode" \
  -s "skill-settings"
```
Options for `nmem m add`:

- `-t, --title`: Memory title
- `-i, --importance`: Importance score 0.0-1.0 (default: 0.5)
- `-l, --label`: Add label (repeatable for multiple labels)
- `-s, --source`: Source identifier (default: "cli")
- `--stdin`: Read content from stdin instead of positional argument
- `--unit-type`: Knowledge type (`fact`, `preference`, `decision`, `plan`, `procedure`, `learning`, `context`, `event`)
- `--event-start`, `--event-end`: When the fact happened (YYYY, YYYY-MM, or YYYY-MM-DD)
- `--when`: Temporal context (`past`, `present`, `future`, `timeless`)
### Creating Threads

```bash
# From content
nmem t create -t "Debug Session" -c "Started investigating the memory leak"

# Explicit thread id (for deterministic integrations)
nmem t create --id openclaw-session-abc123 -t "OpenClaw Session" -c "Session started"

# From file
nmem t create -t "Code Review" -f review-notes.md

# Append one message
nmem t append openclaw-session-abc123 -c "Follow-up finding" -r assistant

# Append with retry-safe idempotency key
nmem t append openclaw-session-abc123 \
  -m '[{"role":"assistant","content":"Follow-up finding","metadata":{"external_id":"oc-msg-42"}}]' \
  --idempotency-key openclaw-run-123
```
### Saving AI Coding Sessions

Import conversations from Claude Code, Codex, or Gemini CLI as threads:

```bash
# Save current Claude Code session (uses current directory)
nmem t save --from claude-code

# Save from a specific project path
nmem t save --from claude-code -p /path/to/project

# Save all sessions for a project
nmem t save --from claude-code -m all

# Save Codex session with a summary
nmem t save --from codex -s "Implemented auth feature"

# Save Gemini CLI session from the current project
nmem t save --from gemini-cli
```
How it works:

1. `nmem` discovers and reads the local agent session files on the machine where you run the command
2. It parses those transcripts into normalized thread messages
3. It then uploads the resulting thread data to your configured Mem server
By default, Claude Code and Codex sessions are discovered under `~/.claude` and `~/.codex`. If you keep them somewhere else, nmem also respects `CLAUDE_CONFIG_DIR` and `CODEX_HOME` automatically.

That means `nmem t save --from ...` works correctly with remote Mem too: the server does not need direct access to those local agent directories on your laptop.
Options:

- `--from`: Source app (`claude-code`, `codex`, or `gemini-cli`) - required
- `-p, --project`: Project directory (default: current dir)
- `-m, --mode`: `current` (latest session) or `all` (all sessions)
- `-s, --summary`: Brief session summary
- `--session-id`: Specific session ID
- `--truncate`: Truncate large tool results (>10KB)
Re-running the command appends new messages with deduplication.
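Purely as an illustration of the idea (not nmem's actual algorithm), re-run-safe appends amount to skipping message IDs that were already imported:

```bash
# Illustrative dedup by stable message id; `seen` stands in for state
# the server would keep from an earlier import.
seen="oc-msg-40 oc-msg-41"
incoming="oc-msg-41 oc-msg-42"
for id in $incoming; do
    case " $seen " in
        *" $id "*) ;;                              # already imported: skip
        *) echo "append $id"; seen="$seen $id" ;;  # new message: append
    esac
done
```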
This import path is distinct from desktop auto-sync and watcher-based ingestion. File watching remains a local server-side capability; explicit CLI save is a client-side capture path.
## Related
- Nowledge Mem - The full Nowledge Mem application
- This CLI is also bundled with the main Nowledge Mem desktop app