CodeSage

Local-first code intelligence CLI powered by Ollama, with LangChain-powered RAG. Search, analyze, and chat with your codebase using natural language.

Works with Claude Desktop, Cursor, and Windsurf via MCP.

Install

# Recommended: pipx (isolated environment)
pipx install pycodesage

# Or pip
pip install pycodesage

Detailed installation

# macOS
brew install pipx
pipx ensurepath

# Linux/Windows
python3 -m pip install --user pipx
python3 -m pipx ensurepath

# Install with specific Python version
pipx install --python python3.11 pycodesage

# Add optional features
pipx inject pycodesage "pycodesage[multi-language]"  # JS, TS, Go, Rust
pipx inject pycodesage "pycodesage[mcp]"             # MCP server

Requirements

Ollama must be running:

ollama pull qwen2.5-coder:7b
ollama pull qwen3-embedding
ollama serve
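
Before indexing, it can help to confirm that Ollama is actually listening. The stdlib-only sketch below is not part of CodeSage; it simply probes the default endpoint, which matches the base_url shown later in the configuration section.

```python
import urllib.request
import urllib.error

def ollama_running(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url.

    Ollama's root endpoint returns a short "Ollama is running" banner,
    so any successful response means the server is up.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        # Connection refused / timeout: Ollama is not reachable.
        return False

if __name__ == "__main__":
    print("Ollama reachable:", ollama_running())
```

If this prints False, start the daemon with ollama serve before running codesage index.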

Usage

cd your-project
codesage init      # Initialize project
codesage index     # Build code index
codesage chat      # Interactive chat mode

Commands

Command    Description
init       Initialize project (detects languages, creates config)
index      Build or update the code index
chat       Interactive chat with code intelligence
mcp serve  Start MCP server for AI IDE integration
mcp setup  Show MCP configuration for your IDE
mcp test   Test MCP server functionality

Chat Commands

Inside codesage chat, use these slash commands:

Search & Analysis

Command             Description
/search <query>     Semantic code search
/deep <query>       Deep multi-agent analysis
/similar <element>  Find similar code
/patterns [query]   Show learned patterns
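
Under the hood, commands like /search rank code chunks by embedding similarity rather than keyword match. The toy sketch below is illustrative only (CodeSage uses Ollama embeddings and a LanceDB vector store, not this bag-of-words stand-in), but it shows the ranking idea:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in "embedding": term counts. Real systems use a neural model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query: str, chunks: list[str]) -> list[tuple[float, str]]:
    """Rank chunks by similarity to the query, best first."""
    q = embed(query)
    return sorted(((cosine(q, embed(c)), c) for c in chunks), reverse=True)

chunks = [
    "def parse_config(path): load yaml settings",
    "def send_email(to, body): smtp client",
]
results = search("load configuration file", chunks)
```

A real pipeline swaps embed() for a model call and the linear scan for an indexed nearest-neighbor lookup, but the scoring step is the same shape.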

Planning & Review

Command            Description
/plan <task>       Generate implementation plan
/review [file]     Review code changes
/security [path]   Security analysis
/impact <element>  Impact/blast radius analysis

Session

Command          Description
/mode <mode>     Switch mode (brainstorm / implement / review)
/context         Show/modify context settings
/stats           Show index statistics
/export [file]   Export conversation
/clear           Clear chat history
/help            Show all commands
/exit or Ctrl+D  Exit chat

MCP Setup

CodeSage works as an MCP server for AI IDEs. Run codesage mcp setup to get the configuration, or add this to your MCP client config:

{
  "mcpServers": {
    "codesage": {
      "command": "codesage",
      "args": ["mcp", "serve", "--global"]
    }
  }
}

MCP tools available

Tool                   Description
list_projects          List all indexed projects (global mode)
get_developer_profile  Your coding patterns and conventions
search_code            Semantic code search with confidence scoring
get_file_context       File content with definitions and security analysis
review_code            Code review with static + LLM analysis
analyze_security       Security vulnerability scanning
get_stats              Index statistics and storage metrics
explain_concept        Understand how a concept is implemented
suggest_approach       Implementation guidance for a coding task
trace_flow             Trace callers/callees through the dependency graph
find_examples          Find usage examples of a pattern or function
recommend_pattern      Pattern recommendations from learned memory

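
Your IDE's MCP client invokes these tools for you over JSON-RPC 2.0, so you normally never build requests by hand. As a rough sketch of what travels over the wire (the "query" argument name is an assumption for illustration; the actual argument schema is advertised by the server), a search_code call looks like:

```python
import json

# A JSON-RPC 2.0 request using the Model Context Protocol's tools/call
# method. The "query" argument name is illustrative, not CodeSage's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_code",
        "arguments": {"query": "where is auth handled?"},
    },
}
payload = json.dumps(request)
```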
Client-specific setup

Claude Desktop: Add config above to claude_desktop_config.json

Cursor: Settings → Features → MCP Servers → Add config

Windsurf: Settings → MCP → Add Server. Command: codesage, Args: mcp serve --global

Configuration

Stored in .codesage/config.yaml (created by codesage init):

project_name: my-project
languages:
  - python
  - typescript

llm:
  provider: ollama
  model: qwen2.5-coder:7b
  embedding_model: qwen3-embedding

exclude_dirs:
  - node_modules
  - venv
  - .git

All configuration options

# LLM settings
llm:
  provider: ollama          # ollama, openai, anthropic
  model: qwen2.5-coder:7b
  embedding_model: qwen3-embedding
  base_url: http://localhost:11434
  temperature: 0.3

# Storage
storage:
  vector_backend: lancedb
  use_graph: true

# Security scanning
security:
  enabled: true
  severity_threshold: medium

# Developer memory
memory:
  enabled: true
  learn_on_index: true

# Performance tuning
performance:
  embedding_batch_size: 200
  embedding_cache_size: 1000
  cache_enabled: true
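
embedding_batch_size controls how many code chunks are sent to the embedding model per request; larger batches cut round-trips at the cost of memory. A minimal sketch of the batching idea (illustrative, not CodeSage's internals):

```python
from typing import Iterator

def batched(items: list[str], batch_size: int = 200) -> Iterator[list[str]]:
    """Split items into consecutive batches of at most batch_size."""
    for i in range(0, len(items), batch_size):
        yield items[i : i + batch_size]

# 450 chunks with the default batch size of 200 -> batches of 200, 200, 50.
chunks = [f"chunk-{i}" for i in range(450)]
batches = list(batched(chunks, 200))
```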

Language Support

  • Python (built-in)
  • JavaScript, TypeScript, Go, Rust (with pycodesage[multi-language])

Development

git clone https://github.com/keshavashiya/codesage.git
cd codesage
pip install -e ".[dev]"
pytest tests/ -v

License

MIT

Project details

Release files (version 0.3.1), uploaded via twine/6.2.0 (CPython 3.11.14):

File                               Size      SHA256
pycodesage-0.3.1.tar.gz            197.8 kB  795e98464a91288343b39150a3e68162a5ea6f26d7d0b4c4303ebe1e5ddcc00a
pycodesage-0.3.1-py3-none-any.whl  237.6 kB  2040f4ce7f4090cb4a9d9f2c6014d5b1414df27f069bb9510171cc905fbe21d2
