Adaptive Neural Knowledge System + PostToolUse compression hooks. Two-phase token optimization (retrieval + consumption) for Claude Code.
Project description
# 🧠 NeuralMind

Two-phase token optimization for Claude Code – smart retrieval + tool-output compression in one package.

Most tools save tokens on what you fetch OR on what Claude sees back – never both. NeuralMind v0.3.0 does both in one `pip install`, plus learns your project's patterns.
## ⚡ Two-phase optimization

```
┌──────────────────────────────────────────────────────────────┐
│ Phase 1: Retrieval – what to fetch                           │
│   neuralmind wakeup .       → ~365 tokens (vs 50K raw)       │
│   neuralmind query "?"      → ~800 tokens (vs 2,700 raw)     │
│   mcp: neuralmind_skeleton  → graph-backed file view         │
├──────────────────────────────────────────────────────────────┤
│ Phase 2: Consumption – what Claude actually sees             │
│   PostToolUse hooks compress Read/Bash/Grep output           │
│   File reads      → graph skeleton   (~88% reduction)        │
│   Bash output     → errors + summary (~91% reduction)        │
│   Search results  → capped at 25 matches                     │
└──────────────────────────────────────────────────────────────┘
```
Combined effect: 5-10× total reduction vs baseline Claude Code.
## 🎯 The Problem

```
You: "How does authentication work in my codebase?"

❌ Traditional: load entire codebase → 50,000 tokens → $0.15-$3.75/query
✅ NeuralMind:  smart context       →    766 tokens → $0.002-$0.06/query
```
## 💰 Real Savings
| Model | Without NeuralMind | With NeuralMind | Monthly Savings |
|---|---|---|---|
| Claude 3.5 Sonnet | $450/month | $7/month | $443 |
| GPT-4o | $750/month | $12/month | $738 |
| GPT-4.5 | $11,250/month | $180/month | $11,070 |
| Claude Opus | $2,250/month | $36/month | $2,214 |
Based on 100 queries/day. Pricing sources
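The table's monthly figures follow from simple arithmetic. A quick sanity check for the Claude 3.5 Sonnet row (the per-token price here is an assumption of roughly $3 per million input tokens, not stated in the table):

```python
# Back-of-envelope check of the savings table above.
# Assumed: ~$3 per 1M input tokens, 100 queries/day, 30 days/month,
# 50,000 tokens per raw full-codebase query vs ~766 per NeuralMind query.
QUERIES_PER_MONTH = 100 * 30
PRICE_PER_TOKEN = 3 / 1_000_000  # USD, assumed Sonnet input price

raw_tokens = 50_000  # full-codebase context
nm_tokens = 766      # NeuralMind smart context

cost_raw = QUERIES_PER_MONTH * raw_tokens * PRICE_PER_TOKEN
cost_nm = QUERIES_PER_MONTH * nm_tokens * PRICE_PER_TOKEN

print(f"without: ${cost_raw:.0f}/month")  # ~$450
print(f"with:    ${cost_nm:.0f}/month")   # ~$7
```

The other rows scale the same way with each model's input price.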
## 🚀 Quick Start

```bash
# Install
pip install neuralmind graphifyy

# Generate knowledge graph
cd your-project
graphify update .

# Build neural index
neuralmind build .

# (v0.2.0) Install PostToolUse hooks – compresses Read/Bash/Grep
neuralmind install-hooks .

# Query your codebase
neuralmind query . "How does authentication work?"

# (v0.2.0) Skeleton view of a file without loading source
neuralmind skeleton tools/voiceover.py
```
## ✨ What's New in v0.3.2

Complete Learning Loop – Collect → Analyze → Improve 🔁
| Feature | Status | Details |
|---|---|---|
| Memory Collection | ✅ v0.3.0 | Local JSONL storage for queries (project + global) |
| Opt-in Consent | ✅ v0.3.0 | One-time TTY-only prompt, respects env vars |
| Pattern Learning | ✅ v0.3.2 | `neuralmind learn .` analyzes co-occurrence patterns |
| Smart Reranking | ✅ v0.3.2 | Improves context ranking using module relationships |
| Automatic Boosting | ✅ v0.3.2 | Next query applies learned patterns automatically |
How it works:

- Collect (v0.3.0) – after queries, NeuralMind logs which modules mattered
- Learn (v0.3.2) – run `neuralmind learn .` to analyze patterns
- Improve (v0.3.2) – subsequent queries automatically boost related modules
- Repeat – the system gets smarter as you query more

Example: auth and validation modules keep appearing together? Next time you ask about auth, validation gets boosted in the search results automatically.

- 100% local storage, no telemetry, fully under your control
- Zero overhead if patterns are unavailable (graceful fallback)
- Works with any query pattern – learns from actual usage
📚 See the full guide in `docs/brain_like_learning.md` or Setup-Guide for all platforms.
### Quick Learning Setup

```bash
# After a few queries have been logged:
neuralmind learn .

# Example output:
# ✓ Learned 12 co-occurrence patterns
# ✓ Patterns saved to .neuralmind/learned_patterns.json
# ✓ Next query will apply learned patterns for improved retrieval
#
# Top co-occurrence patterns:
#   community_0|community_1: 5 times
#   community_1|community_2: 4 times
```
On your next `neuralmind query`, the reranker automatically boosts modules that co-occur with your context, improving relevance without any extra cost.
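In rough terms, co-occurrence learning and boosting could be sketched as below. This is an illustrative reimplementation, not NeuralMind's internals; all function names and the boost weight are hypothetical:

```python
from collections import Counter
from itertools import combinations

def learn_patterns(query_logs):
    """Count module pairs that appear together across logged queries.
    query_logs: list of module-name lists, one per logged query."""
    counts = Counter()
    for modules in query_logs:
        for a, b in combinations(sorted(set(modules)), 2):
            counts[(a, b)] += 1
    return counts

def boost(scores, context_modules, patterns, weight=0.1):
    """Additively boost scores of modules that co-occur with context."""
    boosted = dict(scores)
    for (a, b), n in patterns.items():
        if a in context_modules and b in boosted:
            boosted[b] += weight * n
        if b in context_modules and a in boosted:
            boosted[a] += weight * n
    return boosted

logs = [["auth", "validation"], ["auth", "validation", "db"], ["db", "cache"]]
patterns = learn_patterns(logs)
scores = boost({"validation": 0.5, "cache": 0.5}, {"auth"}, patterns)
# "validation" now outranks "cache" because it co-occurred with "auth"
```

The key property: boosting is a cheap additive reweighting on top of existing search scores, which is why a missing patterns file can fall back to the unboosted ranking with zero overhead.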
## 🔌 Compatibility

NeuralMind has multiple components with different compatibility. Use what fits your workflow:

| Component | Works With | Notes |
|---|---|---|
| CLI (`build`, `query`, `search`, `wakeup`, `skeleton`) | ✅ Any environment | Pure Python – IDE-agnostic |
| MCP Server (`neuralmind-mcp`) | ✅ Claude Code, Claude Desktop, Cursor (with MCP), Cline, Continue, any MCP client | Exposes wakeup, query, search, skeleton, stats as MCP tools |
| PostToolUse Hooks (`install-hooks`) | ✅ Claude Code only | Uses Claude Code's hook system to compress Read/Bash/Grep output |
| Git post-commit hook (`init-hook`) | ✅ Any git workflow | Auto-rebuilds index on commit |
| Context export (copy-paste) | ✅ ChatGPT, Gemini, Antigravity, any LLM | `neuralmind wakeup . \| pbcopy` |
### Quick-start by tool

**Claude Code – full two-phase optimization**

```bash
pip install neuralmind
neuralmind build .
neuralmind install-hooks .   # enables PostToolUse compression
```

**Cursor / Cline / Continue – MCP server only**

Add to your MCP config:

```json
{
  "mcpServers": {
    "neuralmind": {
      "command": "neuralmind-mcp"
    }
  }
}
```

Then call the tools (`neuralmind_wakeup`, `neuralmind_query`, `neuralmind_search`, `neuralmind_skeleton`), passing `project_path` as a parameter. Make sure to run `neuralmind build .` in your project first.

**Antigravity / Windsurf / ChatGPT / Gemini – CLI + copy-paste**

```bash
neuralmind wakeup .                        # ~600 tokens of project context
neuralmind query . "your question here"    # query-specific context
neuralmind skeleton src/auth/handlers.py   # compact file view
```

Pipe any of these into your chat interface.
## ✨ Key Features
| Feature | Description |
|---|---|
| 4-Layer Context | Progressive disclosure – only loads what's relevant |
| Semantic Search | Finds code by meaning, not just keywords |
| Query-Aware | Different questions get different context |
| CLI Tool | Simple commands: build, query, wakeup, search |
| MCP Server | Works with Claude Code, Claude Desktop, Cursor, Cline, Continue, and any MCP client |
| Auto-Updates | Git hooks and scheduled maintenance |
## 📊 Benchmarks
| Project | Nodes | Avg Token Reduction |
|---|---|---|
| cmmc20 (React/Node) | 241 | 65.6x |
| mempalace (Python) | 1,626 | 46.0x |
## 🧠 How It Works

```
┌──────────────────────────────────────────────────────────────┐
│ Layer 0: Project Identity     (~100 tokens) - ALWAYS LOADED  │
├──────────────────────────────────────────────────────────────┤
│ Layer 1: Architecture Summary (~300 tokens) - ALWAYS LOADED  │
├──────────────────────────────────────────────────────────────┤
│ Layer 2: Relevant Modules     (~300 tokens) - QUERY-SPECIFIC │
├──────────────────────────────────────────────────────────────┤
│ Layer 3: Semantic Search      (~300 tokens) - QUERY-SPECIFIC │
└──────────────────────────────────────────────────────────────┘
```

Total: ~800-1,100 tokens vs 50,000+ for the full codebase.
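The layered design above amounts to a budgeted assembly loop: always include the cheap identity and architecture layers, then spend the remaining budget on query-specific ones. A minimal sketch (layer names and token costs taken from the diagram; the function and its budget default are illustrative, not the actual implementation):

```python
# Illustrative sketch of 4-layer progressive disclosure.
LAYERS = [
    ("identity", 100, True),      # always loaded
    ("architecture", 300, True),  # always loaded
    ("modules", 300, False),      # query-specific
    ("search", 300, False),       # query-specific
]

def assemble(budget=1_100, query=None):
    """Greedily add layers while they fit in the token budget."""
    context, spent = [], 0
    for name, cost, always in LAYERS:
        if (always or query) and spent + cost <= budget:
            context.append(name)
            spent += cost
    return context, spent

print(assemble(query="How does auth work?"))
# (['identity', 'architecture', 'modules', 'search'], 1000)
print(assemble(query=None))
# (['identity', 'architecture'], 400)
```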
## 📋 Use Cases

- Daily Development – get context for AI coding questions
- New Developer Onboarding – generate project overviews
- Code Review – understand related code quickly
- Documentation – AI-assisted docs from actual code
- CI/CD Integration – auto-update context files
- IDE Integration – MCP server for Claude/Cursor
📚 See full use cases and examples in USAGE.md.
## 🖥️ CLI Commands

| Command | Purpose |
|---|---|
| `neuralmind build .` | Build/rebuild the neural index |
| `neuralmind query . "..."` | Query with natural language |
| `neuralmind wakeup .` | Get a project overview |
| `neuralmind search . "..."` | Direct semantic search |
| `neuralmind benchmark .` | Measure token reduction |
| `neuralmind stats .` | Show index statistics |
| `neuralmind learn .` | Run the opt-in continual-learning scaffold (MVP) |
| `neuralmind skeleton <file>` | (v0.2.0) Compact graph-backed file view |
| `neuralmind install-hooks .` | (v0.2.0) Install PostToolUse compression hooks (project) |
| `neuralmind install-hooks --global` | (v0.2.0) Install hooks globally for all projects |
| `neuralmind install-hooks --uninstall` | (v0.2.0) Remove hooks (preserves other tools' hooks) |
| `neuralmind init-hook .` | Install a git post-commit hook (auto-rebuild on commit) |
## 🪝 PostToolUse Compression (v0.2.0)
NeuralMind ships with Claude Code hooks that compress tool outputs before the model sees them:
| Tool | Compression | Typical savings |
|---|---|---|
| Read | Replaces raw source with graph skeleton (functions, rationales, call graph) | ~88% |
| Bash | Keeps errors + last 3 lines + summary; drops routine output | ~91% |
| Grep | Caps at 25 matches + "N more hidden" pointer | varies |
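The Bash rule in the table ("errors + last 3 lines + summary") might be approximated as follows. This is a hedged sketch of the idea, not the shipped hook; the function name and error heuristic are illustrative:

```python
import re

def compress_bash_output(output: str, max_error_lines: int = 20) -> str:
    """Sketch of PostToolUse-style compression for Bash output:
    keep error-looking lines plus the last 3 lines, drop the rest,
    and prepend a one-line summary."""
    lines = output.splitlines()
    error_re = re.compile(r"error|fail|traceback|warning", re.IGNORECASE)
    errors = [l for l in lines if error_re.search(l)][:max_error_lines]
    tail = lines[-3:]
    kept = list(dict.fromkeys(errors + tail))  # dedupe, keep order
    summary = f"[{len(lines)} lines -> {len(kept)} kept]"
    return "\n".join([summary] + kept)

long_output = "\n".join(["ok"] * 100 + ["ERROR: disk full", "done"])
print(compress_bash_output(long_output))
# Prints the summary line followed by the error line and the tail,
# instead of all 102 lines.
```

A 100-line build log collapses to a handful of lines while the error that actually matters survives, which is where the ~91% typical savings come from.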
Install per-project (recommended):

```bash
cd my-project
neuralmind install-hooks .   # writes .claude/settings.json
```

Install globally (all projects):

```bash
neuralmind install-hooks --global   # writes ~/.claude/settings.json
```

Bypass temporarily (for debugging):

```bash
NEURALMIND_BYPASS=1 claude-code ...
```

Uninstall cleanly (preserves other hooks):

```bash
neuralmind install-hooks --uninstall            # project
neuralmind install-hooks --uninstall --global   # global
```

The hook installer is idempotent and non-destructive – existing hooks from other tools (Prettier, Black, etc.) are preserved.
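An idempotent, non-destructive install boils down to a merge that only appends when absent. A minimal sketch of the idea (the JSON keys here are illustrative, not Claude Code's exact settings schema, and `merge_hooks` is a hypothetical helper):

```python
import json
from pathlib import Path

def merge_hooks(settings_path: Path, new_hook: dict) -> None:
    """Append our hook only if it is not already present, leaving
    hooks installed by other tools untouched."""
    settings = {}
    if settings_path.exists():
        settings = json.loads(settings_path.read_text())
    hooks = settings.setdefault("hooks", {}).setdefault("PostToolUse", [])
    if new_hook not in hooks:        # idempotent: re-running is a no-op
        hooks.append(new_hook)       # non-destructive: others preserved
    settings_path.write_text(json.dumps(settings, indent=2))
```

Running it twice leaves exactly one copy of the new hook, and any pre-existing entries stay in place.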
### Coming from Pith?

NeuralMind v0.2.0 provides full Pith-parity compression plus graph-backed retrieval – both in one package. Migration:

```bash
# Remove Pith's global hooks, then:
pip install neuralmind
neuralmind install-hooks --global
```

Unlike Pith's regex-based skeletonization, NeuralMind uses the semantic graph you've already built, so skeletons include rationales, call graphs, and cross-file edges that regex can't extract.
## ⏰ Scheduling Updates

### Git Hook (Recommended)

```bash
#!/bin/bash
# .git/hooks/post-commit
graphify update . --quiet
neuralmind build . --quiet
```

### Cron Job

```
# Daily at 6 AM
0 6 * * * cd /project && graphify update . && neuralmind build .
```

### CI/CD

```yaml
- run: pip install neuralmind graphifyy
- run: graphify update . && neuralmind build .
- run: neuralmind wakeup . > AI_CONTEXT.md
```
📚 See the full scheduling guide in USAGE.md.
## 🔌 MCP Server Integration

For Claude Desktop or Cursor:

```json
{
  "mcpServers": {
    "neuralmind": {
      "command": "neuralmind-mcp",
      "args": ["/path/to/project"]
    }
  }
}
```
Exposed MCP tools (v0.2.0):
| Tool | Purpose |
|---|---|
| `mcp__neuralmind__wakeup` | Minimal project overview (~365 tokens) |
| `mcp__neuralmind__query` | Natural-language code query (~800 tokens) |
| `mcp__neuralmind__search` | Direct semantic search with scores |
| `mcp__neuralmind__skeleton` | (v0.2.0) Compact file view (functions, rationales, calls) |
| `mcp__neuralmind__stats` | Index health |
| `mcp__neuralmind__benchmark` | Measure token reduction |
| `mcp__neuralmind__build` | Rebuild the index |
Project-scoped auto-registration: drop a `.mcp.json` at your project root and Claude Code loads it on open:

```json
{
  "mcpServers": {
    "neuralmind": {
      "command": "neuralmind-mcp",
      "args": ["."]
    }
  }
}
```
## 📚 Documentation

- USAGE.md – complete usage guide with examples
- Brain-like Continual Learning – opt-in memory + learning scaffolding
- Wiki – full documentation
- API Reference – Python API docs
## 🤝 Contributing

See CONTRIBUTING.md for guidelines.

## 📄 License

MIT License – see LICENSE for details.

⭐ Star this repo if NeuralMind saves you money!
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file neuralmind-0.3.3.2.tar.gz.
File metadata
- Download URL: neuralmind-0.3.3.2.tar.gz
- Upload date:
- Size: 41.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `d832926c7faf78e492721da973aa3572a2f8ae580d29287aba1c5022f1459eb0` |
| MD5 | `c813e29e3dc453c7a872c9965fbb6b8e` |
| BLAKE2b-256 | `109ed67e352bbed5cd7440d03733c15cda7d54156b74649a468a720b54c85c7e` |
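To verify a downloaded sdist against the SHA256 digest above, a generic stdlib check works (the filename is the one published on this page):

```python
import hashlib
from pathlib import Path

# SHA256 digest published for neuralmind-0.3.3.2.tar.gz (see table above).
EXPECTED = "d832926c7faf78e492721da973aa3572a2f8ae580d29287aba1c5022f1459eb0"

def sha256_of(path: Path) -> str:
    """Stream the file in chunks and return its hex SHA256 digest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# After downloading:
# assert sha256_of(Path("neuralmind-0.3.3.2.tar.gz")) == EXPECTED
```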
Provenance
The following attestation bundles were made for neuralmind-0.3.3.2.tar.gz:

Publisher: release.yml on dfrostar/neuralmind

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: neuralmind-0.3.3.2.tar.gz
- Subject digest: d832926c7faf78e492721da973aa3572a2f8ae580d29287aba1c5022f1459eb0
- Sigstore transparency entry: 1343171839
- Permalink: dfrostar/neuralmind@2a0fd4e96ac0ae87eb19b31edd98f4d67de38ab2
- Branch / Tag: refs/tags/v0.3.3.2
- Owner: https://github.com/dfrostar
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@2a0fd4e96ac0ae87eb19b31edd98f4d67de38ab2
- Trigger Event: push
File details
Details for the file neuralmind-0.3.3.2-py3-none-any.whl.
File metadata
- Download URL: neuralmind-0.3.3.2-py3-none-any.whl
- Upload date:
- Size: 44.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c9b0a830df9cb761b3b23108f8381c8dbdb59a22354355f636d107ed2d013829` |
| MD5 | `acc8371661b60d6f96795ffb448badc1` |
| BLAKE2b-256 | `00c159cbf0b89c470b11c75de4d375b17ce1aac91b802c530c2e452b0746b9a1` |
|
Provenance
The following attestation bundles were made for neuralmind-0.3.3.2-py3-none-any.whl:

Publisher: release.yml on dfrostar/neuralmind

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: neuralmind-0.3.3.2-py3-none-any.whl
- Subject digest: c9b0a830df9cb761b3b23108f8381c8dbdb59a22354355f636d107ed2d013829
- Sigstore transparency entry: 1343171848
- Permalink: dfrostar/neuralmind@2a0fd4e96ac0ae87eb19b31edd98f4d67de38ab2
- Branch / Tag: refs/tags/v0.3.3.2
- Owner: https://github.com/dfrostar
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@2a0fd4e96ac0ae87eb19b31edd98f4d67de38ab2
- Trigger Event: push