
reprompt


Discover, analyze, and evolve your best prompts from AI coding sessions.

repomix packs your code for AI. reprompt extracts insights from AI.

Every developer's AI session history contains reusable prompt patterns -- scattered across hundreds of session files. reprompt extracts them, deduplicates, analyzes frequency, and builds a personal prompt library that evolves over time.

reprompt demo

Quick Start

pipx install reprompt-cli
reprompt scan
reprompt report
reprompt library

Terminal Report

reprompt -- AI Session Analytics
========================================

 Overview
 Total prompts:        1,247
 Unique (deduped):       832
 Sessions scanned:       156
 Sources: claude-code, openclaw

 Top Prompt Patterns
 #  | Pattern                  | Count | Category
 1  | fix the failing test...  |    42 | debug
 2  | add unit tests for...    |    38 | test
 3  | refactor X to use...     |    27 | refactor
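
The Category column above can be produced by simple keyword rules. The sketch below is purely illustrative (the keyword table and precedence are assumptions, not reprompt's actual classifier) but shows the idea of first-match categorization over the documented categories:

```python
CATEGORY_KEYWORDS = {
    # Illustrative keyword rules; checked in order, first match wins.
    "debug": ("fix", "error", "failing", "bug"),
    "test": ("test", "coverage"),
    "refactor": ("refactor", "rename", "extract"),
    "implement": ("add", "create", "implement"),
    "review": ("review",),
    "explain": ("explain", "what does"),
    "config": ("config", "setup", "install"),
}

def categorize(prompt: str) -> str:
    """Return the first category whose keywords appear in the prompt."""
    text = prompt.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(k in text for k in keywords):
            return category
    return "other"
```

Because `debug` is checked before `test`, "fix the failing test" lands in `debug` while "add unit tests for the parser" lands in `test`, matching the report above.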

Features

  • Auto-detection -- finds Claude Code and OpenClaw sessions automatically
  • Two-layer dedup -- SHA-256 exact + TF-IDF semantic similarity
  • Hot terms analysis -- TF-IDF discovers your most-used technical terms
  • K-means clustering -- groups similar prompts into themes
  • Prompt library -- extracts high-frequency patterns, auto-categorizes (debug/implement/test/review/refactor/explain/config)
  • Rich reports -- beautiful terminal output with tables and bar charts
  • Multiple formats -- terminal, JSON (for pipelines), Markdown (for docs)
  • Pluggable adapters -- add support for any AI coding tool
  • Prompt search -- find past prompts by keyword across all sessions
  • Zero config -- works out of the box, customize via env vars or TOML
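
The two-layer dedup can be sketched in plain Python. This is an illustrative reimplementation, not reprompt's actual code: a SHA-256 pass over normalized text catches exact duplicates, then TF-IDF cosine similarity (hand-rolled here to stay dependency-free) drops near-duplicates above a threshold:

```python
import hashlib
import math
from collections import Counter

def _tfidf(docs):
    # One TF-IDF weight dict per document (smooth idf: log((1+n)/(1+df)) + 1).
    tokenized = [d.lower().split() for d in docs]
    df = Counter(t for toks in tokenized for t in set(toks))
    n = len(docs)
    vecs = []
    for toks in tokenized:
        tf = Counter(toks)
        vecs.append({t: (c / len(toks)) * (math.log((1 + n) / (1 + df[t])) + 1)
                     for t, c in tf.items()})
    return vecs

def _cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def dedup(prompts, threshold=0.85):
    # Layer 1: exact dedup on a SHA-256 of the normalized text.
    seen = {}
    for p in prompts:
        h = hashlib.sha256(p.strip().lower().encode()).hexdigest()
        seen.setdefault(h, p)  # keep the first occurrence
    unique = list(seen.values())
    if len(unique) < 2:
        return unique
    # Layer 2: drop prompts whose similarity to an already-kept prompt
    # meets or exceeds the threshold.
    vecs = _tfidf(unique)
    kept = []
    for i in range(len(unique)):
        if all(_cosine(vecs[i], vecs[j]) < threshold for j in kept):
            kept.append(i)
    return [unique[i] for i in kept]
```

reprompt's real pipeline stores results in a database and supports other embedding backends, but the exact-then-semantic ordering is the point: the cheap hash pass shrinks the input before the quadratic similarity pass.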

How reprompt Compares

Feature                 | reprompt                        | prompt-manager | agent-sessions | cclog
Multi-tool support      | ✅ Claude, OpenClaw, + adapters | ✅ Multiple    | ✅ Multiple    | ❌ Claude only
Exact dedup (SHA-256)   | ✅                              |                |                |
Semantic dedup (TF-IDF) | ✅                              |                |                |
Hot terms analysis      | ✅ TF-IDF                       |                |                |
K-means clustering      | ✅                              |                |                |
Pattern library         | ✅ Auto-categorized             |                |                |
CLI interface           | ✅ TUI                          |                | ❌ macOS app   |
JSON/Markdown export    | ✅                              |                |                |
Pluggable adapters      | ✅                              |                |                |
Zero config             | ✅                              |                |                |

Supported AI Tools

Tool                | Status         | Session Path
Claude Code         | Supported      | ~/.claude/projects/
OpenClaw / OpenCode | Supported      | ~/.openclaw/ + ~/.opencode/sessions/
Codex CLI           | Planned (v0.4) | ~/.codex/
Aider               | Planned        | ~/.aider/
Gemini CLI          | Planned        | ~/.gemini/
Continue.dev        | Via MCP        | MCP protocol
Zed                 | Via MCP        | MCP protocol
Cursor              | Planned        | --

Usage

# Scan all detected AI tools
reprompt scan

# Scan specific source
reprompt scan --source claude-code

# Scan custom path
reprompt scan --path ~/custom/sessions

# Rich terminal report
reprompt report

# JSON output (for CI/pipelines)
reprompt report --format json

# Search your prompt history
reprompt search "authentication"
reprompt search "debug" --limit 5

# View your prompt library
reprompt library

# Filter by category
reprompt library --category debug

# Export prompt library as Markdown
reprompt library prompts.md

# Database stats
reprompt status

# Auto-scan after sessions
reprompt install-hook

# Cleanup old data
reprompt purge --older-than 90d
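
Conceptually, `reprompt search` is a keyword filter over your prompt history. The sketch below is illustrative only and assumes a hypothetical in-memory record with `text` and `timestamp` fields; reprompt's real store is a database:

```python
def search_prompts(prompts, keyword, limit=10):
    """Case-insensitive substring search over prompt records, newest first.

    `prompts` is a list of dicts with "text" and "timestamp" keys
    (a hypothetical schema for illustration).
    """
    keyword = keyword.lower()
    hits = [p for p in prompts if keyword in p["text"].lower()]
    hits.sort(key=lambda p: p["timestamp"], reverse=True)
    return hits[:limit]
```
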

MCP Server (Claude Code / Continue.dev / Zed)

reprompt includes an MCP server that exposes your prompt analytics as tools for AI coding assistants.

pip install reprompt-cli[mcp]

# Start the server
reprompt mcp-serve

Register in Claude Code — add to .mcp.json at project root:

{
  "mcpServers": {
    "reprompt": {
      "type": "stdio",
      "command": "reprompt",
      "args": ["mcp-serve"]
    }
  }
}

Available tools: search_prompts, get_prompt_library, get_best_prompts, get_trends, get_status, scan_sessions

Available resources: reprompt://status, reprompt://library

Configuration

Zero config by default. Customize with environment variables or TOML:

# Environment variables (prefix: REPROMPT_)
REPROMPT_EMBEDDING_BACKEND=ollama reprompt scan
REPROMPT_DB_PATH=~/custom/reprompt.db reprompt status

# ~/.config/reprompt/config.toml
[embedding]
backend = "tfidf"  # tfidf | ollama | local | openai

[storage]
db_path = "~/.local/share/reprompt/reprompt.db"

[dedup]
semantic_threshold = 0.85

[library]
min_frequency = 3

Optional Backends

pip install reprompt-cli[ollama]   # Ollama API embeddings
pip install reprompt-cli[local]    # sentence-transformers (CPU)
pip install reprompt-cli[openai]   # OpenAI API embeddings

Adding an Adapter

Create a new adapter by subclassing BaseAdapter:

from pathlib import Path

from reprompt.adapters.base import BaseAdapter
from reprompt.core.models import Prompt

class MyToolAdapter(BaseAdapter):
    name = "my-tool"
    default_session_path = "~/.my-tool/sessions"

    def parse_session(self, path):
        # Parse one session file and return a list[Prompt]
        ...

    def detect_installed(self):
        # The adapter is active when its session directory exists
        return Path(self.default_session_path).expanduser().exists()

Troubleshooting

NumPy conflict in Anaconda environments

If you see an error like:

A module that was compiled using NumPy 1.x cannot be run in NumPy 2.x

This happens when Anaconda's base environment has packages compiled against NumPy 1.x but a newer NumPy 2.x is installed. This is not a reprompt bug — it's an environment conflict.

Fix: Install reprompt in an isolated environment using pipx:

pip3 install --user pipx
pipx install reprompt-cli
reprompt scan

pipx creates a dedicated virtualenv for reprompt, avoiding conflicts with your system Python or Anaconda.

Roadmap

  • Prompt version control — track how your prompts evolve across iterations, with semantic diffing and per-version effectiveness scoring
  • More adapters — Codex CLI, Aider, Gemini CLI, Cursor
  • Team analytics — aggregate insights across team members (opt-in, anonymized)
  • Prompt recommendations — suggest better prompts based on your history and outcomes

Contributing

See CONTRIBUTING.md for development setup and guidelines.

License

MIT

