Chunking components for the Sayou Data Platform

Project description

sayou-chunking

The Intelligent Text Splitter for Sayou Fabric.

sayou-chunking splits large texts into smaller, semantically meaningful units called Chunks. This is a critical step for RAG (Retrieval-Augmented Generation) systems, as it directly impacts retrieval accuracy.

It goes beyond simple character splitting by offering structure-aware, semantic, and hierarchical chunking strategies.

💡 Core Philosophy

"Context is King."

Blindly cutting text at 500 characters breaks sentences and loses meaning. sayou-chunking aims to preserve context by:

  1. Structure Awareness: Respects document headers, tables, and code blocks (especially in Markdown).
  2. Semantic Coherence: Groups sentences that belong to the same topic using similarity metrics.
  3. Hierarchy: Maintains parent-child relationships so retrieval can match small, precise chunks while the LLM receives the larger parent chunk for context.
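The difference between a blind fixed-length cut and a structure-aware split can be illustrated with a small standalone sketch (plain Python, no sayou-chunking required):

```python
text = (
    "Chunking is the process of breaking text down. "
    "Done naively, a hard cut lands mid-sentence.\n\n"
    "A structure-aware splitter cuts at paragraph boundaries instead."
)

# Blind fixed-length split: may cut words and sentences in half.
naive_chunks = [text[i:i + 50] for i in range(0, len(text), 50)]

# Structure-aware split: break at blank lines (paragraph boundaries) first.
structured_chunks = [p.strip() for p in text.split("\n\n") if p.strip()]

print(naive_chunks[0])       # ends mid-word
print(structured_chunks[0])  # a complete paragraph
```

The first naive chunk ends in the middle of a word, while every structure-aware chunk is a whole paragraph. Real splitters add size limits and overlap on top of this idea.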

📦 Installation

pip install sayou-chunking

⚡ Quick Start

The ChunkingPipeline provides a unified interface for various splitting strategies.

from sayou.chunking.pipeline import ChunkingPipeline

def run_demo():
    # 1. Initialize Pipeline
    pipeline = ChunkingPipeline()
    pipeline.initialize()

    # 2. Prepare Input (e.g., from Refinery)
    text_content = """
    # Section 1: Introduction
    Chunking is the process of breaking text down.
    
    ## Benefits
    - Better Retrieval
    - Context Preservation
    """
    
    request = {
        "content": text_content,
        "metadata": {"source": "doc.md"},
        "config": {"chunk_size": 50}
    }

    # 3. Run with Strategy ('markdown', 'recursive', 'semantic', etc.)
    chunks = pipeline.run(request, strategy="markdown")

    # 4. Result
    for i, chunk in enumerate(chunks):
        print(f"[{i}] Type: {chunk.metadata.get('semantic_type')}")
        print(f"    Content: {chunk.content}")

if __name__ == "__main__":
    run_demo()

🔑 Key Components

Splitter

  • RecursiveSplitter: The standard strategy. Splits by paragraph -> line -> sentence -> word to keep related text together.
  • MarkdownSplitter: Aware of Markdown syntax. Splits by headers (#) first, protecting tables and code blocks.
  • FixedLengthSplitter: Hard split by character count. Useful when strict token limits are required.
  • StructureSplitter: Splits based on user-defined regex patterns (e.g., "Article \d+").
  • SemanticSplitter: Uses cosine similarity between sentences to find topic breakpoints.
  • ParentDocumentSplitter: Creates large "Parent" chunks for context and small "Child" chunks for retrieval, linking them together.
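To make the ParentDocumentSplitter idea concrete, here is a minimal, library-free sketch of the parent/child linking pattern. The `Chunk` class and the `chunk_id`/`parent_id` metadata keys are illustrative assumptions, not sayou-chunking's actual schema:

```python
from dataclasses import dataclass, field
import uuid

@dataclass
class Chunk:
    content: str
    metadata: dict = field(default_factory=dict)

def parent_child_split(text, parent_size=120, child_size=40):
    """Create large parent chunks, then small child chunks that
    point back to their parent via metadata['parent_id']."""
    parents, children = [], []
    for i in range(0, len(text), parent_size):
        pid = str(uuid.uuid4())
        parent = Chunk(text[i:i + parent_size], {"chunk_id": pid})
        parents.append(parent)
        for j in range(0, len(parent.content), child_size):
            children.append(
                Chunk(parent.content[j:j + child_size], {"parent_id": pid})
            )
    return parents, children

parents, children = parent_child_split("x" * 300)

# Retrieval matches a small child; the linked parent supplies context.
by_id = {p.metadata["chunk_id"]: p for p in parents}
hit = children[0]
context = by_id[hit.metadata["parent_id"]]
```

Indexing the small children keeps retrieval precise, while the `parent_id` lookup lets you hand the LLM the surrounding parent chunk instead of an isolated fragment.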

🤝 Contributing

We welcome contributions, for example new strategies (such as a CodeSplitter for Python/JS) or integrations with other embedding models for semantic splitting.
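As a starting point for a CodeSplitter contribution, one simple heuristic is to split Python source at top-level `def`/`class` boundaries, keeping each definition and its body together. This is only an illustrative sketch, not part of the package:

```python
import re

def split_python_source(source: str) -> list[str]:
    """Split Python source at top-level def/class definitions,
    keeping each definition (with its indented body) as one chunk."""
    chunks, current = [], []
    for line in source.splitlines(keepends=True):
        # A new top-level definition starts a new chunk.
        if re.match(r"^(def|class)\s", line) and current:
            chunks.append("".join(current))
            current = []
        current.append(line)
    if current:
        chunks.append("".join(current))
    return chunks

sample = (
    "import os\n"
    "\n"
    "def first():\n"
    "    return 1\n"
    "\n"
    "class Second:\n"
    "    pass\n"
)
chunks = split_python_source(sample)
```

A production version would also need to handle decorators, async definitions, and oversized functions; an AST-based approach would be more robust than regexes.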

📜 License

Apache 2.0 License © 2025 Sayouzone

Download files

Download the file for your platform.

Source Distribution

sayou_chunking-0.3.1.tar.gz (22.5 kB)

Uploaded Source

Built Distribution

sayou_chunking-0.3.1-py3-none-any.whl (24.7 kB)

Uploaded Python 3

File details

Details for the file sayou_chunking-0.3.1.tar.gz.

File metadata

  • Download URL: sayou_chunking-0.3.1.tar.gz
  • Upload date:
  • Size: 22.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for sayou_chunking-0.3.1.tar.gz
Algorithm Hash digest
SHA256 2e1c5a0b385978171d2a16a469c376b084e90b6241bca1102348e185bd845717
MD5 aeb8af2e24a3bfdd7575900b282f8f78
BLAKE2b-256 508fde23228f2a1032c0facc78d73f220d961d6fbd46e41418904aafd64355bd


File details

Details for the file sayou_chunking-0.3.1-py3-none-any.whl.

File metadata

  • Download URL: sayou_chunking-0.3.1-py3-none-any.whl
  • Upload date:
  • Size: 24.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for sayou_chunking-0.3.1-py3-none-any.whl
Algorithm Hash digest
SHA256 a37dcba467dd5ae109f37d208d0bdb7859ed2f650f8e10f830edc414794444da
MD5 33deba6648eda7db93781faddaac8d8f
BLAKE2b-256 fad13095b2f6bf612822681ed67e8749728673759e9706acd8c73b0ad8797797

