
Generate optimized LLM context from Python library APIs — CLI, skill, and MCP server


libcontext


Make your AI coding assistant aware of any Python library's API — on demand, not always-on.

libcontext inspects any installed Python package via static AST analysis (no code execution) and generates compact Markdown API references. It integrates with Claude Code (via a /lib skill) and VS Code Copilot (via an MCP server) to provide progressive disclosure — only loading API context when you actually need it, avoiding context window pollution.

Why This Exists

When you ask an AI assistant how to use a library, the quality of the output depends entirely on what the model knows about that library's API. For many real-world scenarios, the model is working blind:

  • Internal / private libraries — Zero training data exists. The model has never seen the API.
  • Niche open-source packages — Sparse or outdated training data leads to hallucinated methods and wrong signatures.
  • New versions of any library — Training data has a cutoff. The model knows v2, you're using v3.

Dumping entire API references into always-on instruction files (like copilot-instructions.md or CLAUDE.md) wastes context window on every interaction, even when you're not using that library. Research (ReadMe.LLM, UC Berkeley 2025) shows that excessive context triggers hallucinations and degrades output quality.

libcontext solves this with progressive disclosure: overview first, then drill into specific modules only when needed.

When libcontext makes the biggest difference

| Scenario | Impact | Why |
|---|---|---|
| Internal / private libraries | Critical | Zero training data exists for proprietary code |
| Niche open-source packages | High | Sparse training data leads to hallucinated methods |
| New versions of any library | High | Training cutoff — the LLM knows v2, you're using v3 |
| Popular, stable libraries | Low | The LLM already has good knowledge from training data |

Quick Start

# Install globally with uv (recommended — available in all projects)
uv tool install libcontext

# Install the /lib skill into your Claude Code project
libctx install --skills

# Now in Claude Code, just type:
#   /lib requests
# Claude will progressively discover the API for you

For VS Code with MCP support:

uv tool install "libcontext[mcp]"
libctx install --mcp --target vscode

How It Works

Progressive Disclosure (Skill / MCP)

Instead of dumping everything upfront, libcontext follows a progressive workflow:

Step 1: Overview          Step 2: Drill down          Step 3: Search
libctx inspect requests   libctx inspect requests     libctx inspect requests
  --overview                --module requests.api       --search Session

  Module list with          Full signatures,            Find specific
  class/function names      docstrings, parameters      classes or methods
  (no signatures)           for one module              across all modules

The /lib skill (Claude Code) and MCP server (VS Code / Cursor) automate this workflow — the AI assistant decides what to inspect based on the task at hand.

Direct CLI Usage

# Full API reference to stdout
libctx inspect requests

# Compact overview — module names with class/function names
libctx inspect requests --overview -q

# Detailed API for a single module
libctx inspect requests --module requests.api -q

# Search for a specific class or function
libctx inspect requests --search Session -q

# Write to a file with marker injection
libctx inspect requests -o .github/copilot-instructions.md

# Multiple libraries at once
libctx inspect requests httpx pydantic -o context.md

# JSON output (programmatic consumption)
libctx inspect requests --format json
libctx inspect requests --search Session --format json

# Filter search by type
libctx inspect requests --search Session --type class -q

# Compare two API snapshots
libctx inspect requests --format json > old.json
# ... upgrade requests ...
libctx inspect requests --format json > new.json
libctx diff old.json new.json

# Bypass disk cache
libctx inspect requests --no-cache

# Clear all cached API data
libctx cache clear
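
The marker injection used by `-o` can be pictured as replacing a delimited region of the target file while leaving everything around it untouched. A minimal sketch, assuming hypothetical marker strings (libcontext's actual markers may differ):

```python
import re

# Hypothetical marker strings -- libcontext's real markers may differ.
START, END = "<!-- libctx:start -->", "<!-- libctx:end -->"

def inject(existing: str, payload: str) -> str:
    """Replace the region between the markers, or append one if absent."""
    block = f"{START}\n{payload}\n{END}"
    pattern = re.compile(re.escape(START) + r".*?" + re.escape(END), re.DOTALL)
    if pattern.search(existing):
        return pattern.sub(lambda _: block, existing)
    return existing.rstrip() + "\n\n" + block + "\n"

doc = "# Instructions\n\n<!-- libctx:start -->\nold API reference\n<!-- libctx:end -->\n"
print(inject(doc, "new API reference"))
```

With this scheme, re-running the command only rewrites its own region, so hand-written instructions surrounding the markers survive regeneration.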

AST Analysis

  1. Parsing — Reads .py and .pyi source files using Python's ast module. No code is ever executed.
  2. Stub merging — Discovers colocated and standalone stub packages; merges signatures from stubs with docstrings from sources.
  3. Extraction — Classes, functions, methods, parameters, type annotations, decorators, type aliases, and docstrings.
  4. Compact rendering — Structured Markdown (or JSON) optimised for LLM context windows.
  5. Disk cache — Results are cached on disk and revalidated via (version, mtime, file_count) to avoid re-parsing unchanged packages.
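
The parsing and extraction steps can be sketched with the standard-library `ast` module alone. This toy version (not libcontext's actual inspector) pulls function names, parameters, and first docstring lines without ever importing the code:

```python
import ast

SOURCE = '''
def get(url, params=None, timeout=10.0):
    """Send a GET request."""
'''

def extract_signatures(source: str) -> list[str]:
    """Collect 'name(params)' plus the first docstring line per function."""
    tree = ast.parse(source)  # parse only -- nothing is ever executed
    entries = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            params = ", ".join(a.arg for a in node.args.args)
            doc = (ast.get_docstring(node) or "").splitlines()
            summary = doc[0] if doc else ""
            entries.append(f"{node.name}({params}): {summary}")
    return entries

print(extract_signatures(SOURCE))  # ['get(url, params, timeout): Send a GET request.']
```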

Installation

Global install (recommended)

Install once, use across all projects — no per-project dependency needed:

uv tool install libcontext              # CLI only
uv tool install "libcontext[mcp]"       # with MCP server (requires Python 3.10+)

Update later with uv tool upgrade libcontext.

One-off usage without installing: uvx --from libcontext libctx inspect requests

Per-project install

If you prefer to add libcontext as a project dependency:

uv add libcontext           # basic
uv add libcontext[mcp]      # with MCP server

Or with pip:

pip install libcontext
pip install libcontext[mcp]

Development install

git clone https://github.com/Syclaw/libcontext.git
cd libcontext
uv sync --all-extras

Integration Setup

The install command configures your project for AI-assisted library discovery:

# Claude Code — install the /lib skill
libctx install --skills

# Claude Code — install MCP server config
libctx install --mcp

# VS Code / Cursor — install MCP server config
libctx install --mcp --target vscode

# GitHub Copilot — install the skill
libctx install --skills --target github

# Everything at once
libctx install --all --target all

| Flag | What it installs |
|---|---|
| --skills | /lib skill for on-demand API discovery |
| --mcp | MCP server configuration for tool-based access |
| --all | Both skills and MCP |

| Target | Skills location | MCP location |
|---|---|---|
| claude (default) | .claude/skills/lib/SKILL.md | .mcp.json |
| github | .github/skills/lib/SKILL.md | |
| vscode | | .vscode/mcp.json |

Using the /lib Skill (Claude Code)

After libctx install --skills, type /lib <package> in Claude Code:

/lib requests              → overview, then drill into modules
/lib requests requests.api → jump straight to a specific module

Claude will automatically run libctx commands to discover the API progressively.

Using the MCP Server

After libctx install --mcp, the MCP server provides tools:

  • get_package_overview — structural overview of a package
  • get_module_api — detailed API for a single module
  • search_api — search by name or docstring (with optional kind filter and format for JSON output)
  • get_api_json — full package or single-module API as structured JSON
  • diff_api — compare two API snapshots and report changes with breaking change detection
  • refresh_cache — clear both in-memory and disk caches
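
The diff performed by diff_api (and libctx diff) can be understood as a comparison over the extracted symbols: removals and signature changes break existing callers, while additions do not. A simplified sketch, not the real implementation:

```python
def diff_symbols(old: dict[str, str], new: dict[str, str]) -> dict[str, list[str]]:
    """Compare {name: signature} maps from two API snapshots."""
    removed = set(old) - set(new)
    changed = {n for n in set(old) & set(new) if old[n] != new[n]}
    return {
        "breaking": sorted(removed | changed),  # callers of these will break
        "added": sorted(set(new) - set(old)),   # safe for existing callers
    }

old = {"get": "get(url, params=None)", "post": "post(url, data=None)"}
new = {"get": "get(url, *, params=None)", "put": "put(url, data=None)"}
print(diff_symbols(old, new))  # {'breaking': ['get', 'post'], 'added': ['put']}
```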

Python API

from libcontext import collect_package, render_package

# Full API reference
pkg = collect_package("requests")
print(render_package(pkg))

from libcontext import collect_package, render_package_overview, render_module, search_package

pkg = collect_package("requests")

# Overview — module names with class/function names
print(render_package_overview(pkg))

# Single module — full signatures and docstrings
for mod in pkg.non_empty_modules:
    if mod.name == "requests.api":
        print(render_module(mod))

# Search — find specific classes or functions
print(search_package(pkg, "Session"))

# Search with type filter
print(search_package(pkg, "Session", kind="class"))

import dataclasses, json
from libcontext import collect_package, diff_packages, render_diff
from libcontext.models import PackageInfo, _serialize_envelope

# JSON serialization (roundtrip-safe)
pkg = collect_package("requests")
data = _serialize_envelope(dataclasses.asdict(pkg))
print(json.dumps(data, indent=2))

# Reconstruct from JSON
pkg_restored = PackageInfo.from_dict(data["data"])

# API diff between versions
old_pkg = PackageInfo.from_dict(old_data)
new_pkg = PackageInfo.from_dict(new_data)
result = diff_packages(old_pkg, new_pkg)
print(render_diff(result))

Configuration (Optional)

Library authors can customise what libcontext exposes by adding a [tool.libcontext] section to their pyproject.toml. The library does not need to depend on libcontext.

[tool.libcontext]
include_modules = ["mylib.core", "mylib.models"]
exclude_modules = ["mylib._internal", "mylib.tests"]
include_private = false
max_readme_lines = 150
extra_context = """
This library uses the Repository pattern for data access.
All async operations use httpx internally.
"""

Architecture

| Module | Role |
|---|---|
| models.py | Dataclasses for packages, modules, classes, functions, and diff results |
| inspector.py | Static AST analysis — signatures, docstrings, decorators, type aliases |
| collector.py | Package discovery, module collection, stub merging, and disk cache integration |
| config.py | Reads [tool.libcontext] from pyproject.toml |
| renderer.py | LLM-optimised Markdown generation (full, overview, module, search, diff) |
| diff.py | API diff between two package versions with breaking change detection |
| cache.py | Persistent disk cache with mtime/file-count invalidation and LRU eviction |
| cli.py | CLI entry point — inspect, install, diff, and cache subcommands |
| mcp_server.py | MCP server for VS Code / Cursor integration (optional) |

Development

uv sync --all-extras
uv run pytest --cov=libcontext
uv run ruff check src/ tests/
uv run ruff format src/ tests/
uv run mypy src/libcontext

See CONTRIBUTING.md for detailed contribution guidelines.

Dependencies

See DEPENDENCIES.md for the full list of dependencies and their licenses.

License

MIT — see LICENSE for details.

Download files

Source Distribution

libcontext-0.3.0.tar.gz (89.5 kB)

Built Distribution

libcontext-0.3.0-py3-none-any.whl (48.6 kB)

File details

Details for the file libcontext-0.3.0.tar.gz.

File metadata

  • Download URL: libcontext-0.3.0.tar.gz
  • Size: 89.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for libcontext-0.3.0.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | 25dbb823dc5882e5aa0b30992205707ffee03cd555cb7cea9a483abf280c9cea |
| MD5 | b67fad69d33f6faf4b848e71a638d71b |
| BLAKE2b-256 | f9a8c96e881d7ca6075d9c8e739b2b09f39d612cfe721dc210c3de50643006b3 |


Provenance

The following attestation bundles were made for libcontext-0.3.0.tar.gz:

Publisher: release.yml on Syclaw/libcontext

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file libcontext-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: libcontext-0.3.0-py3-none-any.whl
  • Size: 48.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for libcontext-0.3.0-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | bd590863ae184d25dc041c44a3edcd6645b12fd56ff33dabfd7b3bd785fb7f8e |
| MD5 | 07eb2778bac70cdde89a30a1970d6a78 |
| BLAKE2b-256 | 5afb519ada914fc945bbc7e5ceb9210ffc86ddb898fda618a06a45629a898777 |


Provenance

The following attestation bundles were made for libcontext-0.3.0-py3-none-any.whl:

Publisher: release.yml on Syclaw/libcontext

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
