


LLM Context


Reduce friction when providing context to LLMs. Share relevant project files instantly through smart selection and rule-based filtering.

The Problem

Getting project context into LLM chats is tedious:

  • Manually copying/pasting files takes forever
  • Hard to identify which files are relevant
  • Including too much hits context limits, too little misses important details
  • AI requests for additional files require manual fetching
  • The whole process has to be repeated for every conversation

The Solution

lc-select # Smart file selection
lc-context # Instant formatted context
# Paste and work - AI can access additional files seamlessly

Result: From "I need to share my project" to productive AI collaboration in seconds.

Note: This project was developed in collaboration with several Claude Sonnets (3.5, 3.6, 3.7 and 4.0), as well as Groks (3 and 4), using LLM Context itself to share code during development. All code in the repository is heavily human-curated (by me 😇, @restlessronin).

Installation

uv tool install "llm-context>=0.5.0"

Quick Start

Basic Usage

# One-time setup
cd your-project
lc-init
# Daily usage
lc-select
lc-context

MCP Integration (Recommended)

Add the following to your MCP client's server configuration (e.g., Claude Desktop's claude_desktop_config.json):

{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}

With MCP, AI can access additional files directly during conversations.

Project Customization

# Create project-specific filters
cat > .llm-context/rules/flt-repo-base.md << 'EOF'
---
compose:
  filters: [lc/flt-base]
gitignores:
  full-files: ["*.md", "/tests", "/node_modules"]
---
EOF
# Customize main development rule
cat > .llm-context/rules/prm-code.md << 'EOF'
---
instructions: [lc/ins-developer, lc/sty-python]
compose:
  filters: [flt-repo-base]
  excerpters: [lc/exc-base]
---
Additional project-specific guidelines and context.
EOF

Core Commands

| Command | Purpose |
|---------|---------|
| `lc-init` | Initialize project configuration |
| `lc-select` | Select files based on the current rule |
| `lc-context` | Generate context and copy it to the clipboard |
| `lc-context -nt` | Generate context for non-MCP environments |
| `lc-set-rule <name>` | Switch between rules |
| `lc-missing` | Handle file and context requests (non-MCP) |

Rule System

Rules use a systematic five-category structure:

  • Prompt Rules (prm-): Generate project contexts (e.g., lc/prm-developer, lc/prm-rule-create)
  • Filter Rules (flt-): Control file inclusion (e.g., lc/flt-base, lc/flt-no-files)
  • Instruction Rules (ins-): Provide guidelines (e.g., lc/ins-developer, lc/ins-rule-framework)
  • Style Rules (sty-): Enforce coding standards (e.g., lc/sty-python, lc/sty-code)
  • Excerpt Rules (exc-): Configure extractions for context reduction (e.g., lc/exc-base)
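Rule composition can be pictured as recursively collecting settings from the rules a rule composes. The sketch below is illustrative only: the rule names mirror the examples in this README, but the data structures, function names, and merge semantics are assumptions, not LLM Context internals.

```python
# Illustrative sketch of rule composition (NOT the project's actual code):
# a rule's "compose" list names other rules whose exclusion patterns are
# combined, in order, with the rule's own patterns.

RULES = {
    "lc/flt-base": {"gitignores": [".git/", "*.lock"]},
    "flt-repo-base": {
        "compose": ["lc/flt-base"],
        "gitignores": ["*.md", "/tests", "/node_modules"],
    },
}

def resolve(name: str) -> list[str]:
    """Collect gitignore patterns from a rule and every rule it composes."""
    rule = RULES[name]
    patterns: list[str] = []
    for parent in rule.get("compose", []):
        patterns.extend(resolve(parent))   # parents contribute first
    patterns.extend(rule.get("gitignores", []))
    return patterns

print(resolve("flt-repo-base"))
# → ['.git/', '*.lock', '*.md', '/tests', '/node_modules']
```

The point is only that composed rules layer their settings rather than replacing them, so a project filter can extend `lc/flt-base` without restating its defaults.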

Example Rule

---
description: "Debug API authentication issues"
compose:
  filters: [lc/flt-no-files]
  excerpters: [lc/exc-base]
also-include:
  full-files: ["/src/auth/**", "/tests/auth/**"]
---
Focus on authentication system and related tests.

Workflow Patterns

Daily Development

lc-set-rule lc/prm-developer
lc-select
lc-context
# AI can review changes, access additional files as needed

Focused Tasks

# Let AI help create minimal context
lc-set-rule lc/prm-rule-create
lc-context -nt
# Work with AI to create task-specific rule using tmp-prm- prefix

MCP Benefits

  • Code review: AI examines your changes for completeness/correctness
  • Additional files: AI accesses initially excluded files when needed
  • Change tracking: See what's been modified during conversations
  • Zero friction: No manual file operations during development discussions

Key Features

  • Smart File Selection: Rules automatically include/exclude appropriate files
  • Instant Context Generation: Formatted context copied to clipboard in seconds
  • MCP Integration: AI can access additional files without manual intervention
  • Systematic Rule Organization: Five-category system for clear rule composition
  • AI-Assisted Rule Creation: Let AI help create minimal context for specific tasks
  • Code Excerpting: Extracts significant content to reduce context size while preserving structure
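The excerpting idea — keep signatures and structure, drop bodies — can be illustrated with Python's `ast` module. This is a minimal sketch of the concept, not LLM Context's actual excerpter:

```python
import ast

def outline(source: str) -> list[str]:
    """Return top-level function/class signatures, dropping their bodies.
    Minimal illustration of code outlining/excerpting."""
    lines = []
    for node in ast.parse(source).body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            args = ", ".join(a.arg for a in node.args.args)
            lines.append(f"def {node.name}({args}): ...")
        elif isinstance(node, ast.ClassDef):
            lines.append(f"class {node.name}: ...")
    return lines

src = """
def login(user, password):
    # many lines of implementation
    return "token"

class Session:
    def refresh(self):
        pass
"""
print(outline(src))
# → ['def login(user, password): ...', 'class Session: ...']
```

An outline like this lets an LLM see a file's shape for a fraction of the tokens, then request the full file via MCP only when it actually needs the bodies.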


License

Apache License, Version 2.0. See LICENSE for details.


