A tool to help with copying and pasting code context into LLM chats

Project description

LLM Code Context

LLM Code Context is a Python-based tool designed to streamline the process of sharing code context with Large Language Models (LLMs) using a standard Chat UI. It allows developers to easily select, format, and copy relevant code snippets and project structure information, enhancing the quality of interactions with AI assistants in coding tasks.

This project was developed with significant input from Claude 3 Opus and Claude 3.5 Sonnet. All code that makes it into the repo is human-curated (by me 😇, @restlessronin).

Features

  • File Selection: Offers a command-line interface for selecting files from your project.
  • Intelligent Ignoring: Respects .gitignore rules and additional custom ignore patterns to exclude irrelevant files (a sketch of this filtering appears after this list).
  • Folder Structure Visualization: Generates a textual representation of your project's folder structure.
  • Clipboard Integration: Automatically copies the generated context to your clipboard for easy pasting.
  • Optional Technical Summary: Allows inclusion of a markdown file summarizing the project's technical aspects.
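
To make the ignore behavior concrete, here is a minimal sketch of gitignore-style filtering built on the pathspec library. The pathspec_ignorer.py module in this repo suggests pathspec is involved, but this sketch is illustrative only and is not the tool's actual code:

    # Illustrative sketch of gitignore-style file filtering; not the tool's actual code.
    import os
    import pathspec

    def load_ignore_spec(root):
        """Build a PathSpec from the project's .gitignore plus any extra patterns."""
        patterns = []
        gitignore = os.path.join(root, ".gitignore")
        if os.path.exists(gitignore):
            with open(gitignore) as f:
                patterns = f.read().splitlines()
        # Custom patterns (e.g. from a config file) could be appended here.
        return pathspec.PathSpec.from_lines("gitwildmatch", patterns)

    def select_files(root="."):
        """Return project-relative paths not matched by the ignore rules."""
        spec = load_ignore_spec(root)
        selected = []
        for dirpath, _dirs, filenames in os.walk(root):
            for name in filenames:
                rel = os.path.relpath(os.path.join(dirpath, name), root)
                if not spec.match_file(rel):
                    selected.append(rel)
        return sorted(selected)

    if __name__ == "__main__":
        print("\n".join(select_files()))

A real selector would also prune ignored directories while walking; that detail is omitted here for brevity.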

Installation

Using pipx (Recommended)

pipx is a tool to help you install and run end-user CLI applications written in Python.

  1. If you haven't installed pipx yet, follow the installation instructions in the pipx documentation.
  2. Once pipx is installed, you can install LLM Code Context:
    pipx install llm-code-context
    

This will install LLM Code Context in an isolated environment and make its commands available in your shell.

Usage

LLM Code Context offers several command-line tools, each designed for a specific task. All commands should be run from the root directory of your project, where the primary .gitignore file is located.

Here are the main commands:

# Select all files which are not gitignored
lcc-select
# Generate full context (including folder structure and summary), using selected files
lcc-gencontext
# Generate full text contents from a list of paths in the clipboard
lcc-genfiles

Typical workflow

Let's say that you are collaborating with an LLM on a code repo. Use a system or custom prompt similar to the custom-prompt.md file in this repository's .llm-code-context folder.

Provide context for your chat.

  1. Navigate to your project's root directory in the terminal.
  2. Edit the project configuration file .llm-code-context/config.json and add to the "gitignores" key any files that belong in git but may not be useful as code context (e.g., "LICENSE" and "poetry.lock", maybe even "README.md"); an example configuration is sketched after this list.
  3. Run lcc-select to choose the files you want to include in your context. You can look at .llm-code-context/scratch.json to see what files are currently selected. If you prefer, you can edit the scratch file directly, before the next step.
  4. Run lcc-gencontext to generate and copy the full text of all selected files, the folder structure diagram and the technical summary of the project (if available).
  5. Paste the context into the first message of your conversation with the LLM, or equivalently into a Claude project file.
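
As an illustration of step 2, a .llm-code-context/config.json might look roughly like the following. Only the "gitignores" and "summary_file" keys are mentioned in this README; treating "gitignores" as a list of patterns, and the overall shape of the file, is an assumption:

    {
      "gitignores": ["LICENSE", "poetry.lock", "README.md"],
      "summary_file": "tech-summary.md"
    }

Check the repository's own .llm-code-context/config.json for the authoritative format.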

Respond to LLM requests for files

  1. The LLM will request a list of files in a markdown block quote.
  2. Select the block and copy it to the clipboard.
  3. Run lcc-genfiles to copy the text contents of the requested files into the clipboard, replacing the clipboard's previous contents (the file list), as sketched after this list.
  4. Paste the file content list into the next user message in the chat.
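
Conceptually, lcc-genfiles performs a clipboard round trip: read a pasted list of paths, then write back the concatenated file contents. The sketch below illustrates that idea with the pyperclip library; it assumes one bare path per line and is not the tool's actual implementation (the real command parses a markdown block quote):

    # Illustrative clipboard round trip; not the actual lcc-genfiles implementation.
    import pyperclip

    def gen_files_from_clipboard():
        # Assume one bare file path per line in the clipboard.
        paths = [line.strip() for line in pyperclip.paste().splitlines() if line.strip()]
        sections = []
        for path in paths:
            with open(path, encoding="utf-8") as f:
                sections.append(f"{path}\n{f.read()}")
        # Replace the clipboard's contents (the file list) with the file texts.
        pyperclip.copy("\n\n".join(sections))

    if __name__ == "__main__":
        gen_files_from_clipboard()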

Technical Summary

LLM Code Context supports an optional technical summary feature, although its utility is currently unclear. This feature allows you to include a markdown file that provides project-specific information that may not be easily inferred from the code alone. To use this feature:

  1. Create a markdown file in your .llm-code-context folder (e.g., .llm-code-context/tech-summary.md).
  2. In your .llm-code-context/config.json file, set the summary_file key to the name of your summary file:
    {
      "summary_file": "tech-summary.md"
    }
    

If the key is missing or null, no summary will be included in the context.

The summary can include information like architectural decisions, non-obvious performance considerations, or future plans. For example:

  • "We chose a microservices architecture to allow for independent scaling of components."
  • "The process_data() function uses custom caching to optimize repeated calls with similar inputs."
  • "The authentication system is slated for an overhaul in Q3 to implement OAuth2."

When you run lcc-gencontext, this summary will be included after the folder structure diagram in the generated context.
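
The repository ships Jinja2 templates (full-context.j2 and sel-file-contents.j2), which suggests the context is assembled by template rendering. The sketch below is a rough illustration of that assembly step; the template text and variable names are invented, not the project's actual templates:

    # Hypothetical sketch of assembling the full context with Jinja2.
    # The template text and variable names are invented for illustration;
    # see the project's own full-context.j2 for the real template.
    from jinja2 import Template

    FULL_CONTEXT = Template(
        "# Folder structure\n{{ diagram }}\n\n"
        "{% if summary %}# Technical summary\n{{ summary }}\n\n{% endif %}"
        "# Selected files\n"
        "{% for f in files %}## {{ f.path }}\n{{ f.content }}\n\n{% endfor %}"
    )

    def render_context(diagram, summary, files):
        """files: a list of dicts with "path" and "content" keys."""
        return FULL_CONTEXT.render(diagram=diagram, summary=summary, files=files)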

For an example of a technical summary, you can refer to the tech-summary.md file for this repository.

Project Structure

└── llm-code-context.py
    ├── .gitignore
    ├── .llm-code-context
    │   ├── .gitignore
    │   ├── config.json
    │   ├── custom-prompt.md
    │   ├── tech-summary.md
    │   └── templates
    │       ├── full-context.j2
    │       └── sel-file-contents.j2
    ├── LICENSE
    ├── MANIFEST.in
    ├── README.md
    ├── poetry.lock
    ├── pyproject.toml
    ├── src
    │   └── llm_code_context
    │       ├── __init__.py
    │       ├── config_manager.py
    │       ├── context_generator.py
    │       ├── file_selector.py
    │       ├── folder_structure_diagram.py
    │       ├── git_ignorer.py
    │       ├── path_converter.py
    │       ├── pathspec_ignorer.py
    │       ├── template.py
    │       └── templates
    │           ├── full-context.j2
    │           └── sel-file-contents.j2
    └── tests
        ├── test_path_converter.py
        └── test_pathspec_ignorer.py
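
A diagram like the one above is the kind of output the Folder Structure Visualization feature produces. For illustration, here is a minimal way to print such a tree with simple recursion; it is not the project's folder_structure_diagram.py and applies no ignore rules:

    # Minimal illustration of printing a folder tree with box-drawing characters.
    # Not the project's folder_structure_diagram.py.
    import os

    def print_tree(root, prefix=""):
        entries = sorted(os.listdir(root))
        for i, name in enumerate(entries):
            last = i == len(entries) - 1
            print(prefix + ("└── " if last else "├── ") + name)
            path = os.path.join(root, name)
            if os.path.isdir(path):
                print_tree(path, prefix + ("    " if last else "│   "))

    if __name__ == "__main__":
        print(os.path.basename(os.path.abspath(".")))
        print_tree(".")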

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.

