Ultra-fast intelligent code analysis tool for LLM context generation with Tree-sitter

🔍 CodeLens - Supercharge Your LLM Coding Experience

CodeLens is your AI coding assistant's best friend - an intelligent code analysis tool that transforms your codebase into LLM-optimized context. Stop wasting tokens on irrelevant files or struggling to explain your project structure. CodeLens does the heavy lifting, so you can focus on building great software.

🚀 Why CodeLens?

  • Save time and tokens: Automatically extract the most relevant code context for your LLM
  • Get better answers: Provide AI with structured insights about your codebase architecture
  • Seamless LLM integration: One-click sharing with Claude, ChatGPT, Gemini and other LLMs
  • Work smarter: Identify core files, entry points, and dependencies automatically
  • Maintain with ease: Track TODOs, complexity hotspots, and technical debt

✨ Features

  • Multi-language analysis: Deep insights for Python, JavaScript/TypeScript, and SQL codebases
  • Direct LLM integration: Send analysis directly to Claude, ChatGPT, or Gemini with one click
  • Smart code extraction: Identifies core files, entry points, and critical dependencies
  • Interactive selection: Choose exactly which files and directories to analyze
  • Token optimization: Splits large files into perfectly-sized chunks for LLM context windows
  • Complexity metrics: Highlights complex functions and classes that need attention
  • Maintenance tracking: Collects TODOs, FIXMEs, and technical debt indicators
  • SQL database analysis: Examines stored procedures, views, and functions
  • Pre-commit integration: Automatically runs tests before committing

📦 Installation

pip install llm-code-lens

That's it! No complex configuration needed.


🎮 Usage

Quick Start

Simply run:

llmcl

This launches the interactive interface where you can navigate your project, select files, and configure analysis options with just a few keystrokes.

Workflow

  1. Select files: Navigate your project and choose what to analyze
  2. Configure options: Set output format, LLM provider, and other settings
  3. Run analysis: CodeLens examines your code and generates insights
  4. Send to LLM: With one click, send everything to your preferred AI assistant

LLM Integration (Enhanced in v0.5.15!)

CodeLens now integrates directly with popular LLM providers:

llmcl --open-in-llm claude

Or select your provider in the interactive menu:

  • Claude: Optimized format for Anthropic's Claude
  • ChatGPT: Perfect context for OpenAI's models
  • Gemini: Formatted for Google's Gemini
  • None: Skip browser opening

When analysis completes, CodeLens:

  1. Opens your chosen LLM in your default browser
  2. Copies the complete analysis to your clipboard
  3. Provides a system prompt optimized for code understanding

Just paste and start asking questions about your code!
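Conceptually, the browser-opening step is a provider-to-URL lookup plus a call to the system browser. A minimal sketch of that idea (the URLs and function names here are illustrative assumptions, not CodeLens internals):

```python
import webbrowser

# Assumed provider landing pages -- illustrative only, not taken from CodeLens source.
PROVIDER_URLS = {
    "claude": "https://claude.ai/",
    "chatgpt": "https://chat.openai.com/",
    "gemini": "https://gemini.google.com/",
}

def provider_url(provider: str) -> str:
    """Look up the landing page for a provider name (case-insensitive)."""
    return PROVIDER_URLS[provider.lower()]

def open_in_llm(provider: str) -> None:
    """Open the chosen provider in the default browser."""
    webbrowser.open(provider_url(provider))
```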

New in v0.5.15: Enhanced AI Workflow

CodeLens now provides an optimized workflow for AI code assistance:

  1. System Context: Automatically includes your OS, Python version, and architecture
  2. Smart Editing: The AI asks to see current file contents before suggesting changes, so edits match the code you actually have
  3. Effortless Updates: All suggestions use search-and-replace format for quick implementation
  4. Progress Tracking: Real-time progress bars show exactly what's happening during analysis

The enhanced system prompt helps AI assistants provide more accurate, contextual suggestions tailored to your specific development environment.

Interactive Interface

The intuitive terminal interface lets you:

  • Watch progress in real time with animated progress bars and per-file indicators
  • Navigate with arrow keys (↑↓←→)
  • Select files and directories with Space (includes all sub-elements)
  • Switch sections with Tab
  • Configure options with function keys (F1-F6)
  • Confirm with Enter
  • Cancel with Escape or Q

All your settings persist between runs, so you can quickly analyze the same files again.

Selection States

  • [+] - Included file or directory
  • [*] - Explicitly selected file or directory
  • [-] - Excluded file or directory

Command Line Options

For CI/CD pipelines or scripting:

llmcl --format json --full --open-in-llm claude

Full options list:

  • --output/-o: Output directory (default: .codelens)
  • --format/-f: Output format (txt or json)
  • --full: Export complete file contents
  • --debug: Enable detailed logging
  • --sql-server: SQL Server connection string
  • --sql-database: Database to analyze
  • --open-in-llm: LLM provider to open results in

📊 What You Get

CodeLens generates a comprehensive analysis in the .codelens directory:

1. Project Overview

  • Total files, lines of code, and complexity metrics
  • Visual project tree structure showing your codebase hierarchy
  • Language distribution and project structure
  • Entry points and core files identification
  • System environment context (OS, Python version, architecture)

2. Smart Insights

  • Architectural patterns detected
  • Potential code smells and improvement areas
  • Dependency relationships and import graphs

3. File-by-File Analysis

  • Function and class inventories with complexity scores
  • Documentation coverage and quality assessment
  • TODOs and technical debt indicators

4. Full Content Export (Optional)

  • Complete file contents split into token-optimized chunks
  • Perfect for providing full context to your LLM
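The chunking idea above can be sketched as a simple token-budget splitter. This is a toy illustration of the technique, not CodeLens's actual implementation — real token counts depend on the target model's tokenizer:

```python
def chunk_text(text: str, max_tokens: int = 100) -> list[str]:
    """Split text into chunks of at most max_tokens whitespace-delimited tokens."""
    words = text.split()
    return [
        " ".join(words[i:i + max_tokens])
        for i in range(0, len(words), max_tokens)
    ]

chunks = chunk_text("word " * 250, max_tokens=100)
print(len(chunks))  # 3 chunks: 100 + 100 + 50 tokens
```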

5. SQL Analysis (If Configured)

  • Stored procedures, views, and functions inventory
  • Object dependencies and relationships
  • Parameter analysis and complexity metrics

🛠️ Configuration

Pre-commit Integration

Set up pre-commit hooks to ensure quality:

python scripts/install-hooks.py

This automatically runs tests before each commit.

SQL Server Configuration

Three ways to configure SQL analysis:

  1. Environment Variables:

    export MSSQL_SERVER=your_server
    export MSSQL_DATABASE=your_database
    
  2. Command Line:

    llmcl --sql-server "server_name" --sql-database "database_name"
    
  3. Configuration File: Create sql-config.json and use:

    llmcl --sql-config sql-config.json
    
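A minimal sql-config.json might look like the following. The key names are assumptions based on the environment variables above — check the project documentation for the authoritative schema:

```json
{
  "server": "your_server",
  "database": "your_database"
}
```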

💡 Use Cases

For Developers

  • Onboarding to new projects: Quickly understand unfamiliar codebases
  • Refactoring planning: Identify complex areas that need attention
  • Technical debt management: Track TODOs and maintenance needs
  • Architecture discussions: Generate insights about code structure

For LLM Interactions

  • Bug fixing: Provide perfect context for debugging issues
  • Feature development: Help LLMs understand where and how to add features
  • Code reviews: Get AI assistance with reviewing complex changes
  • Documentation: Generate comprehensive docs from code analysis

For Teams

  • Knowledge sharing: Create shareable insights about project structure
  • Consistent context: Ensure everyone provides similar context to LLMs
  • Codebase health: Track metrics over time to measure improvement
  • SQL analysis: Understand database objects without direct access

🧩 SQL Server Integration

CodeLens provides deep analysis of SQL Server databases:

Prerequisites

  • Microsoft ODBC Driver for SQL Server
  • Appropriate database permissions

What You Get

  • Complete inventory of stored procedures, views, and functions
  • Parameter analysis and usage patterns
  • Complexity metrics and dependency mapping
  • Full object definitions with the --full flag

Security Best Practices

  • Use environment variables for credentials
  • Consider integrated security when possible
  • Apply least-privilege principles for analysis

🔧 Development

Setting up the Environment

git clone https://github.com/SikamikanikoBG/codelens.git
cd codelens
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -e ".[dev]"

Running Tests

pytest

🤝 Contributing

We welcome contributions! To get started:

  1. Fork the repository
  2. Create a feature branch
  3. Add tests for new functionality
  4. Submit a pull request with a clear description

📄 License

This project is licensed under the MIT License. See the LICENSE file for details.


🆘 Support

For issues or feature requests, please visit our GitHub Issues.


🌟 Star Us on GitHub!

If CodeLens has helped you, please consider giving us a star on GitHub. It helps others discover the tool and supports its continued development.
