
llm-fragments-folder

An LLM plugin that loads entire folder contents as fragments, turning any directory into a chat-ready knowledge base.

Installation

llm install llm-fragments-folder

Or install from source:

cd llm-fragments-folder
pip install -e .

Usage

This plugin provides two fragment loaders: folder: for general document collections, and project: for software projects.

folder: - Load documents from a directory

# Chat against all docs in a folder
llm chat -f folder:./docs

# Ask a question about files in the current directory
llm -f folder:. "What are these documents about?"

# Combine with a specific model
llm -f folder:~/notes -m claude-sonnet-4-5 "Find all action items"

# Use with system fragments for custom instructions
llm -f folder:./research --sf "You are a research assistant" "Summarize the key findings"

# Only load specific file types
llm -f "folder:./docs?ext=md,txt" "Summarize the docs"
llm -f "folder:.?ext=json,yaml" "Explain these configs"

project: - Load a software project (respects .gitignore)

# Explain a codebase
llm chat -f project:.

# Ask about a specific project
llm -f project:./my-app "What framework does this use?"

# Code review
llm -f project:. "Review this code for security issues"

# Architecture overview
llm -f project:~/repos/my-api -m claude-sonnet-4-5 "Describe the architecture"

# Only Python files
llm -f "project:.?ext=py" "Review this code"

The project: loader:

  • Uses git ls-files when inside a git repo (most accurate)
  • Falls back to parsing .gitignore patterns if git is not available
  • Prepends a file tree summary as the first fragment
  • Automatically skips node_modules, __pycache__, .git, venv, dist, build, etc.
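The listing strategy above can be sketched in Python. This is an illustrative reimplementation, not the plugin's actual code; the function name and the abridged skip list are assumptions:

```python
import subprocess
from pathlib import Path

# Abridged version of the always-skipped directories
SKIP_DIRS = {".git", "node_modules", "__pycache__", ".venv", "venv", "dist", "build"}

def list_project_files(root):
    """Prefer git's own view of tracked files; fall back to a filtered walk."""
    root = Path(root)
    try:
        out = subprocess.run(
            ["git", "ls-files"], cwd=root, capture_output=True, text=True, check=True
        )
        return [root / line for line in out.stdout.splitlines() if line]
    except (OSError, subprocess.CalledProcessError):
        # git missing or not a repo: walk the tree, skipping known junk dirs
        return [
            p for p in root.rglob("*")
            if p.is_file() and not (set(p.parts) & SKIP_DIRS)
        ]
```

Delegating to git ls-files is what makes the loader "most accurate" inside a repo: git already applies every .gitignore rule, including nested and negated patterns that are hard to reimplement.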

Combining with other fragments

Fragments compose naturally with each other and with LLM's other features:

# Folder + URL context
llm -f folder:./docs -f https://example.com/api-spec "Compare our docs to the spec"

# Folder + system prompt
llm -f folder:./meeting-notes --system "Extract action items with owners and dates" ""

# Project + GitHub issue
llm install llm-fragments-github
llm -f project:. -f issue:user/repo/42 "Implement this feature"

What gets loaded

Text file detection is based on file extension and filename. Supported types include:

  • Documents: .md, .qmd, .txt, .rst, .adoc, .tex, .org
  • Code: .py, .js, .ts, .go, .rs, .java, .rb, .c, .cpp, and many more
  • Config: .json, .yaml, .yml, .toml, .ini, .env, .cfg
  • Web: .html, .css, .scss, .svg, .xml
  • Data: .csv, .tsv, .sql, .graphql
  • Dotfiles: .bashrc, .zshrc, .vimrc, .gitconfig, .tmux.conf, .profile, .npmrc, etc.
  • Special files: Makefile, Dockerfile, LICENSE, etc.
  • Shebang scripts: extensionless files starting with #!
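The detection rules above amount to roughly the following logic. This is a minimal sketch with abridged lists, not the plugin's actual tables or function names:

```python
from pathlib import Path

TEXT_EXTENSIONS = {".md", ".txt", ".py", ".js", ".json", ".yaml", ".html", ".csv"}  # abridged
SPECIAL_NAMES = {"Makefile", "Dockerfile", "LICENSE"}  # abridged
DOTFILE_NAMES = {".bashrc", ".zshrc", ".vimrc", ".gitconfig"}  # abridged

def looks_like_text(path: Path) -> bool:
    """Apply the extension, filename, and shebang rules described above."""
    if path.suffix.lower() in TEXT_EXTENSIONS:
        return True
    if path.name in SPECIAL_NAMES or path.name in DOTFILE_NAMES:
        return True
    # Extensionless files: peek at the first two bytes for a shebang
    if not path.suffix:
        try:
            with open(path, "rb") as f:
                return f.read(2) == b"#!"
        except OSError:
            return False
    return False
```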

Always skipped directories: .git, node_modules, __pycache__, .venv, venv, dist, build, .idea, .vscode, .mypy_cache, .pytest_cache, etc.

Filtering by extension

Use ?ext= to load only specific file types. This bypasses the default text file detection and only includes files matching the given extensions:

llm -f "folder:./src?ext=py,js,ts" "Review this code"
llm -f "project:.?ext=md,txt" "Summarize the documentation"
llm -f "folder:.?ext=csv,json" "Analyze this data"

Extensions can be specified with or without the leading dot (md and .md both work). Multiple extensions are comma-separated.
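The normalization described above can be sketched as a small parser. This is a hypothetical helper (the real plugin's internals may differ); it splits the loader argument on ? and normalizes every extension to a leading dot so md and .md are equivalent:

```python
def parse_ext_filter(argument: str):
    """Split './src?ext=py,js' into (path, extension set), or (path, None)
    when no ?ext= filter is present."""
    path, _, query = argument.partition("?")
    exts = None
    if query.startswith("ext="):
        exts = {
            e if e.startswith(".") else "." + e
            for e in query[len("ext="):].split(",")
            if e
        }
    return path, exts
```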

Dotfiles

Use ?ext=dotfiles to grab all dotfiles (.bashrc, .gitconfig, .vimrc, etc.) from a folder:

# Load all dotfiles
llm -f "folder:~?ext=dotfiles" "Explain my shell config"

# Combine dotfiles with other extensions
llm -f "folder:~?ext=dotfiles,md" "Summarize my config and docs"

# Target a specific dotfile by name
llm -f "folder:~?ext=.bashrc,.zshrc" "Compare these shell configs"

Safety limits: Files larger than 1MB are skipped. Maximum 500 files per loader call.
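Enforcing those limits is straightforward; a sketch follows, assuming 1MB means 1,000,000 bytes (the exact threshold the plugin uses is not documented here):

```python
MAX_FILE_BYTES = 1_000_000  # assumed interpretation of "1MB"
MAX_FILES = 500

def apply_limits(paths):
    """Skip oversized files and cap the total number of files per call."""
    kept = []
    for p in paths:
        if p.stat().st_size > MAX_FILE_BYTES:
            continue  # too large: skip, don't fail
        kept.append(p)
        if len(kept) >= MAX_FILES:
            break
    return kept
```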

How it works

Each file becomes a separate LLM fragment, wrapped with a filename header:

--- path/to/file.py ---
<file contents>

This means LLM's fragment deduplication works at the file level. If you reference the same folder across multiple prompts, files that haven't changed won't be stored again in the log database.
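Producing one header-wrapped fragment per file might look like this minimal sketch (function name and error handling are illustrative assumptions):

```python
from pathlib import Path

def wrap_fragment(path: Path, root: Path) -> str:
    """Render one file as a fragment with the filename header shown above."""
    rel = path.relative_to(root)
    body = path.read_text(encoding="utf-8", errors="replace")
    return f"--- {rel} ---\n{body}"
```

Because each file is its own fragment, LLM can hash and deduplicate them individually: an unchanged file hashes to the same fragment it stored last time.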

Development

# Clone and install for development
git clone https://github.com/michael-borck/llm-fragments-folder.git
cd llm-fragments-folder
uv sync

# Run tests
uv run pytest

# Lint and format
uv run ruff check .
uv run ruff format .

# Type checking
uv run mypy llm_fragments_folder.py

License

Apache 2.0
