
chunktuner


Auto chunking tuner and MCP server for RAG pipelines.

Give it your documents. It tries multiple chunking strategies, measures which one lets an AI answer questions most accurately, and tells you the winner.

[Diagram] chunktuner project flow: documents → strategies → evaluation → recommended configuration


What it does

In a RAG pipeline, the way you split documents into chunks directly affects retrieval quality. chunktuner automates finding the optimal chunking strategy for your specific corpus, embedding model, and use case.

It benchmarks strategies like fixed-token windows, recursive character splitting, semantic splitting, PDF structural chunking, and AST-based code chunking — then scores each one against real retrieval metrics (token recall, MRR, NDCG) and optional generation metrics (RAGAS faithfulness, answer relevancy).
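As a generic illustration of the retrieval metrics named above (not chunktuner's internal implementation), the binary-relevance versions of MRR, NDCG, and token recall can be sketched as:

```python
import math

def mrr(ranked_ids, relevant_ids):
    """Reciprocal rank of the first relevant chunk in a ranked list (0 if none)."""
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant_ids:
            return 1.0 / rank
    return 0.0

def ndcg(ranked_ids, relevant_ids, k=10):
    """Binary-relevance NDCG@k: discounted gain normalized by the ideal ranking."""
    dcg = sum(1.0 / math.log2(rank + 1)
              for rank, doc_id in enumerate(ranked_ids[:k], start=1)
              if doc_id in relevant_ids)
    ideal = sum(1.0 / math.log2(rank + 1)
                for rank in range(1, min(len(relevant_ids), k) + 1))
    return dcg / ideal if ideal else 0.0

def token_recall(retrieved_text, answer_text):
    """Fraction of gold-answer tokens that appear in the retrieved chunks."""
    answer = set(answer_text.lower().split())
    retrieved = set(retrieved_text.lower().split())
    return len(answer & retrieved) / len(answer) if answer else 0.0
```

Higher is better for all three; a chunking strategy that fragments answers across chunk boundaries tends to lose token recall first.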


Interfaces

  • Python library — programmatic integration into your pipeline
  • CLI (chunk-tune) — human-driven tuning from the terminal
  • MCP server — use directly from Claude Desktop or any MCP host

Quickstart

# Install
uv tool install chunktuner

# Initialize workspace
chunk-tune init --provider openai

# See cost estimate before running anything
chunk-tune estimate ./my_docs --use-case rag_qa

# Get a recommendation
chunk-tune recommend ./my_docs --use-case rag_qa

Python API:

from pathlib import Path
from chunktuner import FileIngestor, LiteLLMEmbeddingFunction, AutoTuner
from chunktuner import default_registry, Evaluator, ScoreCalculator

# Load the corpus and pick an embedding model
docs = FileIngestor().ingest_dir(Path("./my_docs"))
embedding_fn = LiteLLMEmbeddingFunction("text-embedding-3-small")

# Evaluate every registered strategy and score it for the rag_qa use case
tuner = AutoTuner(
    strategies=default_registry,
    evaluator=Evaluator(embedding_fn),
    scorer=ScoreCalculator(use_case="rag_qa"),
)
result = tuner.recommend(docs, use_case="rag_qa")
print(result.best.config)  # the winning strategy and its parameters
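ScoreCalculator's actual formula is internal to chunktuner; purely as an illustration of how a use case can drive the recommendation, here is a hypothetical weighted composite over per-strategy metric scores (the weights and metric names below are made up):

```python
# Hypothetical weights; chunktuner's real scoring may differ entirely.
USE_CASE_WEIGHTS = {
    "rag_qa": {"token_recall": 0.4, "mrr": 0.3, "ndcg": 0.3},
}

def composite_score(metrics, use_case="rag_qa"):
    """Weighted sum of normalized [0, 1] metric values for one strategy."""
    weights = USE_CASE_WEIGHTS[use_case]
    return sum(weights[name] * metrics.get(name, 0.0) for name in weights)

def pick_best(results, use_case="rag_qa"):
    """Return the (strategy_name, metrics) pair with the highest composite score."""
    return max(results.items(), key=lambda kv: composite_score(kv[1], use_case))
```

Under a scheme like this, changing the use case simply reweights the same underlying measurements.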

Supported strategies

Strategy             Best for
fixed_tokens         Baseline; uniform token windows
recursive_character  General prose and documentation
semantic             Theme-heavy articles
markdown_semantic    Structured Markdown docs
pdf_structural       PDFs with layout regions and tables
structural_semantic  PDF/DOCX with mixed layout and text
late_chunking        Long docs with dense cross-references
agentic              High-value narrative documents
code_ast             Code repos (Python, JavaScript)
code_window          Code baseline (sliding window)
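The fixed_tokens baseline above is the simplest of these: slide a fixed-size window over the token stream with some overlap. A minimal sketch, using whitespace tokens for illustration (a real implementation would use the embedding model's tokenizer):

```python
def fixed_token_chunks(text, window=256, overlap=32):
    """Split text into overlapping fixed-size token windows.

    Illustrative only: tokens here are whitespace-separated words,
    not model tokens.
    """
    if overlap >= window:
        raise ValueError("overlap must be smaller than window")
    tokens = text.split()
    step = window - overlap
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(" ".join(tokens[start:start + window]))
        if start + window >= len(tokens):
            break
    return chunks
```

The overlap keeps sentences that straddle a boundary retrievable from at least one chunk, at the cost of some duplicated tokens in the index.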

MCP server (Claude Desktop)

The MCP server is a Python FastMCP app (chunk-tune-mcp) that communicates over stdio; no Node.js build step is required. See docs/mcp_setup.md.

Add to your .mcp.json:

{
  "mcpServers": {
    "chunktuner": {
      "command": "uvx",
      "args": ["--from", "chunktuner[mcp]", "chunk-tune-mcp"],
      "env": {
        "CHUNK_TUNER_BASE_DIR": "/path/to/your/corpus"
      }
    }
  }
}

Tools available: list_strategies, preview_chunks, evaluate_chunking, recommend_config.


CLI reference

chunk-tune init       Bootstrap workspace config
chunk-tune analyze    Quick structural scan (no API cost)
chunk-tune estimate   Dry-run cost/token estimate
chunk-tune evaluate   Full evaluation across strategies
chunk-tune recommend  Evaluation + best config recommendation
chunk-tune compare    Side-by-side comparison of specific strategies
chunk-tune preview    Inspect how a strategy splits a document
chunk-tune cache      Manage embedding and chunk cache

Installation options

uv add chunktuner                    # library
uv tool install chunktuner           # global CLI
uvx chunktuner                       # ephemeral, no install

# With optional extras
uv add "chunktuner[docling]"         # PDF/DOCX support
uv add "chunktuner[ragas]"           # generation metrics
uv add "chunktuner[semantic]"        # semantic chunking
uv add "chunktuner[code]"            # AST code chunking
uv add "chunktuner[all]"             # everything

Contributing

See CONTRIBUTING.md.


👨🏻‍💻 Author

Shantanu Deshmukh

Full-stack developer experienced in building end-to-end AI applications.

LinkedIn / Twitter / AngelList

Download files

Source distribution
chunktuner-0.1.1.tar.gz (570.6 kB)
  SHA256       018581e93cf686aadd29f18900bdfd44f2340e9f62951b69c83ceec088dbcd28
  MD5          8bbb24603d88f7db21773258ac3628ac
  BLAKE2b-256  baac490a5c3eda6e36a6fca005fb2d7a9867a81247f810f63fba17f4ab956381

Built distribution
chunktuner-0.1.1-py3-none-any.whl (71.4 kB)
  SHA256       4f8373cb5c558d11f1cce7746c67a8a6cae687da453fed837a9d687f9b247aeb
  MD5          847157173cb7260c0811826a066f2d64
  BLAKE2b-256  2113d9c0c6a541816c26abea9439b6ae288f631a844b75adee5115d95dfb7ae0

Both files were uploaded via Trusted Publishing using uv 0.11.8.
