
Chunking components for the Sayou Data Platform


sayou-chunking


sayou-chunking is a context-aware text splitting library for Python. It transforms raw text documents into Knowledge Graph-ready nodes, focusing on preserving semantic structure, hierarchy, and context.

This library is the "Assembler Preparation" component of the Sayou Data Platform. It sits between data cleaning (sayou-refinery) and knowledge graph construction (sayou-assembler), ensuring that RAG pipelines operate on logical, structured units of information rather than fragmented text.

Philosophy

sayou-chunking is built on a simple premise: how you split determines how you retrieve. Naive chunking destroys context (e.g., splitting a table in half). We prioritize Structure-First Splitting:

  1. Atomic Protection: Never split atomic blocks like Tables or Code Snippets.
  2. Hierarchical Binding: Headers are parents; contents are children. We maintain parent_id linkages for KG construction.
  3. Composite Strategies: Combine multiple splitting strategies (e.g., Structure for context, Recursive for retrieval units).
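The hierarchical-binding idea can be sketched in a few lines. This is an illustration only (the helper and ID scheme below are hypothetical, not sayou-chunking's internals): headers become parents, and subsequent content chunks point back to them via parent_id.

```python
# Sketch of hierarchical binding: headers become parents, following
# content chunks carry a parent_id back-reference for KG construction.
# Hypothetical helper, not sayou-chunking's actual implementation.

def bind_hierarchy(blocks):
    """blocks: list of dicts like {"text": str, "is_header": bool}."""
    chunks = []
    current_parent = None
    for i, block in enumerate(blocks):
        chunk = {"chunk_content": block["text"], "metadata": {"part_index": i}}
        if block["is_header"]:
            chunk["metadata"]["chunk_id"] = f"doc_h_{i}"
            current_parent = chunk["metadata"]["chunk_id"]
        else:
            chunk["metadata"]["chunk_id"] = f"doc_part_{i}"
            if current_parent:
                chunk["metadata"]["parent_id"] = current_parent
        chunks.append(chunk)
    return chunks

blocks = [
    {"text": "# Introduction", "is_header": True},
    {"text": "Some body text.", "is_header": False},
]
result = bind_hierarchy(blocks)
# result[1]["metadata"]["parent_id"] == "doc_h_0"
```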

🚀 Key Features

  • 4-Tier Architecture: Highly extensible design (Engine -> Interface -> Template -> Plugin -> Composite).
  • Atomic Protection: Built-in TextSegmenter engine prevents breaking Markdown tables and code blocks.
  • KG-Ready Output: Automatically generates parent_id, doc_level, and semantic_type metadata.
  • Smart Plugins:
    • MarkdownPlugin: Anchors chunks to Headers (#) and classifies content (Table, List, H1...).
    • ParentDocument: Implements "Small-to-Big" retrieval strategy using composite splitters.
  • Semantic Awareness: Detects topic shifts to create logically grouped chunks (Tier 2 Template).
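To make "Atomic Protection" concrete, here is a minimal sketch of how a segmenter can emit fenced code blocks and table runs as single indivisible segments. This is illustrative only; the actual TextSegmenter engine is more sophisticated.

```python
# Sketch of atomic protection: fenced code blocks and Markdown table runs
# are emitted as single segments so a later splitter never cuts them in half.
# Illustrative only, not the real TextSegmenter engine.

def segment_markdown(text):
    segments = []
    lines = text.splitlines()
    i = 0
    while i < len(lines):
        line = lines[i]
        if line.startswith("```"):           # fenced code block: keep whole
            j = i + 1
            while j < len(lines) and not lines[j].startswith("```"):
                j += 1
            segments.append(("code", "\n".join(lines[i:j + 1])))
            i = j + 1
        elif line.lstrip().startswith("|"):  # table run: keep whole
            j = i
            while j < len(lines) and lines[j].lstrip().startswith("|"):
                j += 1
            segments.append(("table", "\n".join(lines[i:j])))
            i = j
        else:
            segments.append(("text", line))
            i += 1
    return segments
```

A two-row table comes back as one `("table", ...)` segment rather than two fragments, which is the property that keeps downstream chunks coherent.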

📦 Installation

pip install sayou-chunking

⚡ Quickstart

The ChunkingPipeline orchestrates the splitting process. You can register any combination of Tier 2 Templates or Tier 3 Plugins.

Here is a complete example demonstrating how to process a Markdown file generated by sayou-refinery.

import os
import json
from typing import List

# 1. Import Core Pipeline & Interface
from sayou.chunking.pipeline import ChunkingPipeline
from sayou.chunking.interfaces.base_splitter import BaseSplitter

# 2. Import Splitters (Templates & Plugins)
from sayou.chunking.splitter.fixed_length import FixedLengthSplitter
from sayou.chunking.splitter.recursive import RecursiveSplitter
from sayou.chunking.splitter.structure import StructureSplitter
from sayou.chunking.splitter.semantic import SemanticSplitter
from sayou.chunking.splitter.parent_document import ParentDocumentSplitter
from sayou.chunking.plugins.markdown_plugin import MarkdownPlugin

def run_chunking_demo():
    # Setup File Paths
    refinery_output_md = os.path.join(".", "test.md")

    with open(refinery_output_md, "r", encoding="utf-8") as f:
        markdown_content = f.read()

    source_metadata = {
        "source_file": refinery_output_md, 
        "id": "doc_refinery_output"
    }

    # 3. Register Splitters
    default_splitters: List[BaseSplitter] = [
        FixedLengthSplitter(),
        ParentDocumentSplitter(),
        RecursiveSplitter(),
        SemanticSplitter(),
        StructureSplitter(),
        MarkdownPlugin(),
    ]

    # 4. Initialize Pipeline
    pipeline = ChunkingPipeline(splitters=default_splitters)
    pipeline.initialize()

    # 5. Create Split Request
    # We use 'markdown' type to leverage structure-aware splitting
    split_request = {
        "type": "markdown",
        "content": markdown_content,
        "metadata": source_metadata,
        "chunk_size": 1000,
        "chunk_overlap": 50
    }

    print("--- [Sayou Chunking Demo] ---")
    print(f"Splitting using '{split_request['type']}'...")

    try:
        # 6. Run Splitting
        chunks = pipeline.split(split_request)
        print(f"✅ Successfully split content into {len(chunks)} chunks.\n")

        # 7. Save Output
        output_dir = os.path.join(os.path.dirname(__file__), "output")
        os.makedirs(output_dir, exist_ok=True)
        
        output_path = os.path.join(output_dir, "chunks_output.json")
        with open(output_path, "w", encoding="utf-8") as f:
            json.dump(chunks, f, indent=2, ensure_ascii=False)
        
        print(f"Full output saved to {output_path}")

    except Exception as e:
        print(f"❌ Chunking failed: {e}")

if __name__ == "__main__":
    run_chunking_demo()

Example JSON Output (KG-Ready)

Notice how semantic_type is identified and parent_id links the content to its header.

[
  {
    "chunk_content": "# 1. Introduction",
    "metadata": {
      "chunk_id": "doc_123_h_0",
      "semantic_type": "h1",
      "is_header": true,
      "part_index": 0
    }
  },
  {
    "chunk_content": "| Feature | Status |\n|---|---|\n| Protect | Done |...",
    "metadata": {
      "chunk_id": "doc_123_part_1",
      "semantic_type": "table",
      "parent_id": "doc_123_h_0",
      "section_title": "1. Introduction",
      "part_index": 1
    }
  }
]
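Downstream (e.g., in sayou-assembler), the parent_id field can be turned directly into graph edges. A minimal sketch, assuming chunks shaped like the JSON above (the edge-triple format here is an assumption, not sayou-assembler's API):

```python
def chunks_to_edges(chunks):
    """Derive (child_id, "CHILD_OF", parent_id) triples from chunk metadata."""
    edges = []
    for chunk in chunks:
        meta = chunk["metadata"]
        if "parent_id" in meta:
            edges.append((meta["chunk_id"], "CHILD_OF", meta["parent_id"]))
    return edges

chunks = [
    {"chunk_content": "# 1. Introduction",
     "metadata": {"chunk_id": "doc_123_h_0", "semantic_type": "h1"}},
    {"chunk_content": "| Feature | Status |...",
     "metadata": {"chunk_id": "doc_123_part_1", "parent_id": "doc_123_h_0"}},
]
# chunks_to_edges(chunks) -> [("doc_123_part_1", "CHILD_OF", "doc_123_h_0")]
```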

🗺️ Roadmap (v0.1.0+)

The current release establishes the structural foundation. Planned next:

  • HTML Plugin: Applying the "Parent-Child" strategy to HTML DOM trees.
  • Real Semantic Engine: Integrating OpenAI/HuggingFace embeddings into SemanticSplitter.
  • Tokenizer Support: Switching chunk_size calculation from characters to tokens (e.g., tiktoken).
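Token-based sizing can be sketched as a pluggable length function. The `length_fn` hook below is hypothetical (not sayou-chunking's API); it shows where a tokenizer such as tiktoken would slot in, while characters remain the default.

```python
# Sketch of pluggable length measurement: chunk_size is measured by
# length_fn, which defaults to characters (len) but can be token-based.
# The length_fn hook is an assumption, not sayou-chunking's actual API.

def greedy_pack(sentences, chunk_size, length_fn=len):
    """Pack sentences into chunks whose measured length stays within chunk_size."""
    chunks, current = [], ""
    for sentence in sentences:
        candidate = (current + " " + sentence).strip()
        if current and length_fn(candidate) > chunk_size:
            chunks.append(current)
            current = sentence
        else:
            current = candidate
    if current:
        chunks.append(current)
    return chunks

# Character-based (default):
packed = greedy_pack(["aaaa", "bbbb", "cc"], chunk_size=9)
# packed == ["aaaa bbbb", "cc"]

# Token-based would pass, e.g.:
#   enc = tiktoken.get_encoding("cl100k_base")
#   greedy_pack(sentences, 512, length_fn=lambda s: len(enc.encode(s)))
```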

🤝 Contributing

We welcome contributions, whether it's a new Tier 3 Plugin for a specific format or an optimization of the Tier 1 Engine. Please check our contributing guidelines.

📜 License

Apache 2.0 License © 2025 Sayouzone
