
reprompt


Discover, analyze, and evolve your best prompts from AI coding sessions.

repomix packs your code for AI. reprompt extracts insights from AI.

Every developer's AI session history contains reusable prompt patterns -- scattered across hundreds of session files. reprompt extracts them, deduplicates, analyzes frequency, and builds a personal prompt library that evolves over time.

Quick Start

pipx install reprompt-cli
reprompt scan
reprompt report
reprompt library

Terminal Report

reprompt -- AI Session Analytics
========================================

 Overview
 Total prompts:        1,247
 Unique (deduped):       832
 Sessions scanned:       156
 Sources: claude-code, openclaw

 Top Prompt Patterns
 #  | Pattern                  | Count | Category
 1  | fix the failing test...  |    42 | debug
 2  | add unit tests for...    |    38 | test
 3  | refactor X to use...     |    27 | refactor

Features

  • Auto-detection -- finds Claude Code and OpenClaw sessions automatically
  • Two-layer dedup -- SHA-256 exact + TF-IDF semantic similarity
  • Hot terms analysis -- TF-IDF discovers your most-used technical terms
  • K-means clustering -- groups similar prompts into themes
  • Prompt library -- extracts high-frequency patterns, auto-categorizes (debug/implement/test/review/refactor/explain/config)
  • Rich reports -- beautiful terminal output with tables and bar charts
  • Multiple formats -- terminal, JSON (for pipelines), Markdown (for docs)
  • Pluggable adapters -- add support for any AI coding tool
  • Prompt search -- find past prompts by keyword across all sessions
  • Zero config -- works out of the box, customize via env vars or TOML
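The two-layer dedup above can be sketched in plain Python: layer one drops byte-identical prompts via SHA-256 of the normalized text, layer two compares the survivors with TF-IDF cosine similarity. This is an illustrative sketch under simplified assumptions (whitespace tokenization, smoothed TF-IDF), not reprompt's actual implementation; the 0.85 default mirrors the `semantic_threshold` setting shown under Configuration.

```python
import hashlib
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build smoothed TF-IDF vectors for a list of tokenized documents."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({
            term: (count / len(doc)) * (math.log((1 + n) / (1 + df[term])) + 1)
            for term, count in tf.items()
        })
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse (dict) vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def dedup(prompts, threshold=0.85):
    # Layer 1: exact dedup via SHA-256 of normalized text
    seen, unique = set(), []
    for p in prompts:
        digest = hashlib.sha256(p.strip().lower().encode()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(p)
    # Layer 2: semantic dedup -- drop prompts too similar to one already kept
    vecs = tfidf_vectors([p.lower().split() for p in unique])
    kept, kept_vecs = [], []
    for p, v in zip(unique, vecs):
        if all(cosine(v, kv) < threshold for kv in kept_vecs):
            kept.append(p)
            kept_vecs.append(v)
    return kept
```

Exact dedup catches re-sent prompts regardless of casing; the semantic layer additionally collapses near-rephrasings that hashing can never match.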

How reprompt Compares

Feature                 | reprompt                       | prompt-manager | agent-sessions | cclog
Multi-tool support      | ✅ Claude, OpenClaw, + adapters | ✅ Multiple     | ✅ Multiple     | ❌ Claude only
Exact dedup (SHA-256)   | ✅                              |                |                |
Semantic dedup (TF-IDF) | ✅                              |                |                |
Hot terms analysis      | ✅ TF-IDF                       |                |                |
K-means clustering      | ✅                              |                |                |
Pattern library         | ✅ Auto-categorized             |                |                |
CLI interface           | ✅                              | TUI            | ❌ macOS app    |
JSON/Markdown export    | ✅                              |                |                |
Pluggable adapters      | ✅                              |                |                |
Zero config             | ✅                              |                |                |

Supported AI Tools

Tool                | Status         | Session Path
Claude Code         | Supported      | ~/.claude/projects/
OpenClaw / OpenCode | Supported      | ~/.openclaw/ + ~/.opencode/sessions/
Codex CLI           | Planned (v0.4) | ~/.codex/
Aider               | Planned        | ~/.aider/
Gemini CLI          | Planned        | ~/.gemini/
Continue.dev        | Via MCP        | MCP protocol
Zed                 | Via MCP        | MCP protocol
Cursor              | Planned        | --

Usage

# Scan all detected AI tools
reprompt scan

# Scan specific source
reprompt scan --source claude-code

# Scan custom path
reprompt scan --path ~/custom/sessions

# Rich terminal report
reprompt report

# JSON output (for CI/pipelines)
reprompt report --format json

# Search your prompt history
reprompt search "authentication"
reprompt search "debug" --limit 5

# View your prompt library
reprompt library

# Filter by category
reprompt library --category debug

# Export prompt library as Markdown
reprompt library prompts.md

# Database stats
reprompt status

# Auto-scan after sessions
reprompt install-hook

# Cleanup old data
reprompt purge --older-than 90d
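The JSON output is meant for pipelines; a small sketch of consuming it downstream, using a hypothetical report shape based on the terminal report above (the real schema may differ, so treat the field names here as assumptions):

```python
import json

# Hypothetical shape of `reprompt report --format json` output.
report_json = """
{
  "overview": {"total_prompts": 1247, "unique_prompts": 832, "sessions": 156},
  "top_patterns": [
    {"pattern": "fix the failing test...", "count": 42, "category": "debug"},
    {"pattern": "add unit tests for...", "count": 38, "category": "test"}
  ]
}
"""

report = json.loads(report_json)

# Example pipeline step: measure how much of the top patterns is debugging.
debug_count = sum(p["count"] for p in report["top_patterns"]
                  if p["category"] == "debug")
total = sum(p["count"] for p in report["top_patterns"])
print(f"debug prompts: {debug_count} of {total} top-pattern uses")
```

In CI you would replace the inline string with the actual command output, e.g. `reprompt report --format json | python analyze.py`.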

MCP Server (Claude Code / Continue.dev / Zed)

reprompt includes an MCP server that exposes your prompt analytics as tools for AI coding assistants.

pip install reprompt-cli[mcp]

# Start the server
reprompt mcp-serve

Register in Claude Code by adding this to .mcp.json at the project root:

{
  "mcpServers": {
    "reprompt": {
      "type": "stdio",
      "command": "reprompt",
      "args": ["mcp-serve"]
    }
  }
}

Available tools: search_prompts, get_prompt_library, get_best_prompts, get_trends, get_status, scan_sessions

Available resources: reprompt://status, reprompt://library

Configuration

Zero config by default. Customize with environment variables or TOML:

# Environment variables (prefix: REPROMPT_)
REPROMPT_EMBEDDING_BACKEND=ollama reprompt scan
REPROMPT_DB_PATH=~/custom/reprompt.db reprompt status

# ~/.config/reprompt/config.toml
[embedding]
backend = "tfidf"  # tfidf | ollama | local | openai

[storage]
db_path = "~/.local/share/reprompt/reprompt.db"

[dedup]
semantic_threshold = 0.85

[library]
min_frequency = 3

Optional Backends

pip install reprompt-cli[ollama]   # Ollama API embeddings
pip install reprompt-cli[local]    # sentence-transformers (CPU)
pip install reprompt-cli[openai]   # OpenAI API embeddings

Adding an Adapter

Create a new adapter by subclassing BaseAdapter:

from pathlib import Path

from reprompt.adapters.base import BaseAdapter
from reprompt.core.models import Prompt

class MyToolAdapter(BaseAdapter):
    name = "my-tool"
    default_session_path = "~/.my-tool/sessions"

    def parse_session(self, path):
        # Parse one session file -> list[Prompt]
        ...

    def detect_installed(self):
        # The tool counts as installed if its session directory exists
        return Path(self.default_session_path).expanduser().exists()
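Auto-detection then reduces to asking each registered adapter whether its session directory exists. A self-contained sketch of that pattern (the `DemoAdapter` and `detect_sources` names are stand-ins for illustration, not reprompt's real registry):

```python
from pathlib import Path

class DemoAdapter:
    """Simplified stand-in for a BaseAdapter subclass."""
    name = "demo-tool"
    default_session_path = "~/.demo-tool/sessions"

    def detect_installed(self):
        # Same check as the adapter example above
        return Path(self.default_session_path).expanduser().exists()

def detect_sources(adapters):
    """Return the names of adapters whose session directories exist."""
    return [a.name for a in adapters if a.detect_installed()]

# A scan would then call parse_session() on files from each detected source.
found = detect_sources([DemoAdapter()])
```

This is why new tools only need the two methods shown: detection and parsing are the whole adapter contract the scanner relies on.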

Troubleshooting

NumPy conflict in Anaconda environments

If you see an error like:

A module that was compiled using NumPy 1.x cannot be run in NumPy 2.x

This happens when Anaconda's base environment has packages compiled against NumPy 1.x but a newer NumPy 2.x is installed. This is not a reprompt bug — it's an environment conflict.

Fix: Install reprompt in an isolated environment using pipx:

pip3 install --user pipx
pipx install reprompt-cli
reprompt scan

pipx creates a dedicated virtualenv for reprompt, avoiding conflicts with your system Python or Anaconda.

Contributing

See CONTRIBUTING.md for development setup and guidelines.

License

MIT
