
Local-first CLI code intelligence tool with LangChain-powered RAG


CodeSage

Local-first code intelligence CLI powered by Ollama. Search, analyze, and chat with your codebase using natural language.

Works with Claude Desktop, Cursor, and Windsurf via MCP.

Install

# Recommended: pipx (isolated environment)
pipx install pycodesage

# Or pip
pip install pycodesage
Detailed installation
# macOS
brew install pipx
pipx ensurepath

# Linux/Windows
python3 -m pip install --user pipx
python3 -m pipx ensurepath

# Install with specific Python version
pipx install --python python3.11 pycodesage

# Add optional features
pipx inject pycodesage "pycodesage[multi-language]"  # JS, TS, Go, Rust
pipx inject pycodesage "pycodesage[mcp]"             # MCP server

Requirements

Ollama must be installed and running, with the required models pulled:

ollama serve                   # start the Ollama server (skip if it is already running)
ollama pull qwen2.5-coder:7b   # chat and analysis model
ollama pull qwen3-embedding    # embedding model
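Before indexing, it can help to confirm that the Ollama server is actually reachable. A minimal sketch in Python (the /api/tags endpoint and the default port 11434 are standard Ollama; adjust base_url if your setup differs):

```python
import urllib.request
import urllib.error

def ollama_ready(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        # /api/tags lists locally pulled models; any 200 response means the server is up.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("Ollama running:", ollama_ready())
```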

Usage

cd your-project
codesage init      # Initialize project
codesage index     # Build code index
codesage chat      # Interactive chat mode

Commands

Command      Description
init         Initialize project (detects languages, creates config)
index        Build or update the code index
chat         Interactive chat with code intelligence
mcp serve    Start MCP server for AI IDE integration
mcp setup    Show MCP configuration for your IDE
mcp test     Test MCP server functionality

Chat Commands

Inside codesage chat, use these slash commands:

Search & Analysis

Command             Description
/search <query>     Semantic code search
/deep <query>       Deep multi-agent analysis
/similar <element>  Find similar code
/patterns [query]   Show learned patterns

Planning & Review

Command            Description
/plan <task>       Generate implementation plan
/review [file]     Review code changes
/security [path]   Security analysis
/impact <element>  Impact/blast radius analysis

Session

Command          Description
/mode <mode>     Switch mode (brainstorm / implement / review)
/context         Show or modify context settings
/stats           Show index statistics
/export [file]   Export conversation
/clear           Clear chat history
/help            Show all commands
/exit or Ctrl+D  Exit chat
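All of the chat commands above share the same `/name [arguments]` shape. A hypothetical parser (an illustration, not CodeSage's actual implementation) shows how such input splits into a command name and its argument string:

```python
def parse_slash_command(line):
    """Split '/cmd arg text' into ('cmd', 'arg text'); return None for plain chat input."""
    line = line.strip()
    if not line.startswith("/"):
        return None  # ordinary chat message, not a slash command
    name, _, rest = line[1:].partition(" ")
    return name.lower(), rest.strip()

print(parse_slash_command("/search auth middleware"))  # ('search', 'auth middleware')
print(parse_slash_command("hello"))                    # None
```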

MCP Setup

CodeSage works as an MCP server for AI IDEs. Run codesage mcp setup to get the configuration, or add this to your MCP client config:

{
  "mcpServers": {
    "codesage": {
      "command": "codesage",
      "args": ["mcp", "serve", "--global"]
    }
  }
}
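Applying the snippet above by hand means merging it into whatever servers your client already has configured. A small sketch of that merge, assuming the standard Claude Desktop config location on macOS (other clients and platforms keep their config elsewhere):

```python
import json
from pathlib import Path

# Claude Desktop config path on macOS; adjust for your OS or MCP client.
CONFIG_PATH = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

def add_codesage_server(config: dict) -> dict:
    """Merge the CodeSage MCP entry into an existing client config dict."""
    servers = config.setdefault("mcpServers", {})
    servers["codesage"] = {"command": "codesage", "args": ["mcp", "serve", "--global"]}
    return config

existing = json.loads(CONFIG_PATH.read_text()) if CONFIG_PATH.exists() else {}
print(json.dumps(add_codesage_server(existing), indent=2))
```

Using setdefault preserves any servers already registered in the file instead of overwriting them.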
MCP tools available
Tool                   Description
list_projects          List all indexed projects (global mode)
get_developer_profile  Your coding patterns and conventions
search_code            Semantic code search with confidence scoring
get_file_context       File content with definitions and security analysis
review_code            Code review with static + LLM analysis
analyze_security       Security vulnerability scanning
get_stats              Index statistics and storage metrics
explain_concept        Understand how a concept is implemented
suggest_approach       Implementation guidance for a coding task
trace_flow             Trace callers/callees through the dependency graph
find_examples          Find usage examples of a pattern or function
recommend_pattern      Pattern recommendations from learned memory
Client-specific setup

Claude Desktop: Add config above to claude_desktop_config.json

Cursor: Settings → Features → MCP Servers → Add config

Windsurf: Settings → MCP → Add Server. Command: codesage, Args: mcp serve --global

Configuration

Stored in .codesage/config.yaml (created by codesage init):

project_name: my-project
languages:
  - python
  - typescript

llm:
  provider: ollama
  model: qwen2.5-coder:7b
  embedding_model: qwen3-embedding

exclude_dirs:
  - node_modules
  - venv
  - .git
All configuration options
# LLM settings
llm:
  provider: ollama          # ollama, openai, anthropic
  model: qwen2.5-coder:7b
  embedding_model: qwen3-embedding
  base_url: http://localhost:11434
  temperature: 0.3

# Storage
storage:
  vector_backend: lancedb
  use_graph: true

# Security scanning
security:
  enabled: true
  severity_threshold: medium

# Developer memory
memory:
  enabled: true
  learn_on_index: true

# Performance tuning
performance:
  embedding_batch_size: 200
  embedding_cache_size: 1000
  cache_enabled: true
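To make the effect of exclude_dirs concrete, a hypothetical sketch of the kind of path filtering an indexer might apply (an illustration only, not CodeSage's actual code):

```python
from pathlib import Path

# Mirrors the exclude_dirs list in .codesage/config.yaml.
EXCLUDE_DIRS = {"node_modules", "venv", ".git"}

def is_excluded(path: str) -> bool:
    """True if any component of the path matches an excluded directory name."""
    return any(part in EXCLUDE_DIRS for part in Path(path).parts)

print(is_excluded("src/app/main.py"))               # False
print(is_excluded("node_modules/lodash/index.js"))  # True
```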

Language Support

  • Python (built-in)
  • JavaScript, TypeScript, Go, Rust (with pycodesage[multi-language])

Development

git clone https://github.com/keshavashiya/codesage.git
cd codesage
pip install -e ".[dev]"
pytest tests/ -v

License

MIT
