Chunking components for the Sayou Data Platform

Project description

sayou-chunking

The Intelligent Text Splitter for Sayou Fabric.

sayou-chunking splits large texts into smaller, semantically meaningful units called Chunks. This is a critical step for RAG (Retrieval-Augmented Generation) systems, as it directly impacts retrieval accuracy.

It goes beyond simple character splitting by offering structure-aware, semantic, and hierarchical chunking strategies.

💡 Core Philosophy

"Context is King."

Blindly cutting text at 500 characters breaks sentences and loses meaning. sayou-chunking aims to preserve context by:

  1. Structure Awareness: Respects document headers, tables, and code blocks (especially in Markdown).
  2. Semantic Coherence: Groups sentences that belong to the same topic using similarity metrics.
  3. Hierarchy: Maintains Parent-Child relationships to retrieve small precise chunks while providing large context to the LLM.
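The parent–child idea in point 3 can be sketched in plain Python. This is a toy illustration of the concept, not the library's API; the dict shape and names are invented for the example:

```python
# Toy illustration of parent/child chunking: retrieval matches small,
# precise child chunks, but the LLM receives the larger parent context.
parent = ("Chunking is the process of breaking text down. "
          "Smaller chunks retrieve precisely; larger ones carry context.")

# Split the parent into small child chunks, each remembering its parent.
children = [
    {"content": sentence.strip() + ".", "parent": parent}
    for sentence in parent.split(".") if sentence.strip()
]

# Retrieval matches a child, but the parent is what reaches the LLM.
hit = next(c for c in children if "retrieve" in c["content"])
print(hit["content"])   # the small, precise match
print(hit["parent"])    # the full surrounding context
```

The `ParentDocumentSplitter` strategy described below automates exactly this linkage at document scale.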

📦 Installation

pip install sayou-chunking

⚡ Quick Start

The ChunkingPipeline provides a unified interface for various splitting strategies.

from sayou.chunking.pipeline import ChunkingPipeline

def run_demo():
    # 1. Initialize Pipeline
    pipeline = ChunkingPipeline()
    pipeline.initialize()

    # 2. Prepare Input (e.g., from Refinery)
    text_content = """
    # Section 1: Introduction
    Chunking is the process of breaking text down.
    
    ## Benefits
    - Better Retrieval
    - Context Preservation
    """
    
    request = {
        "content": text_content,
        "metadata": {"source": "doc.md"},
        "config": {"chunk_size": 50}
    }

    # 3. Run with Strategy ('markdown', 'recursive', 'semantic', etc.)
    chunks = pipeline.run(request, strategy="markdown")

    # 4. Result
    for i, chunk in enumerate(chunks):
        print(f"[{i}] Type: {chunk.metadata.get('semantic_type')}")
        print(f"    Content: {chunk.content}")

if __name__ == "__main__":
    run_demo()

🔑 Key Components

Splitter

  • RecursiveSplitter: The standard strategy. Splits by paragraph -> line -> sentence -> word to keep related text together.
  • MarkdownSplitter: Aware of Markdown syntax. Splits by headers (#) first, protecting tables and code blocks.
  • FixedLengthSplitter: Hard split by character count. Useful when strict token limits are required.
  • StructureSplitter: Splits based on user-defined regex patterns (e.g., "Article \d+").
  • SemanticSplitter: Uses cosine similarity between sentences to find topic breakpoints.
  • ParentDocumentSplitter: Creates large "Parent" chunks for context and small "Child" chunks for retrieval, linking them together.
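As a rough sketch of the semantic-splitting idea (not the library's implementation), a topic breakpoint can be placed wherever the cosine similarity between adjacent sentence embeddings drops below a threshold. The embeddings below are hand-made toys standing in for a real embedding model:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy sentence embeddings: the first two sentences share a topic,
# the last two share a different one.
sentences = ["Chunking splits text.", "Chunks aid retrieval.",
             "Pricing starts at $10.", "Discounts apply yearly."]
embeddings = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]

# A breakpoint falls between sentences whose similarity dips below the threshold.
threshold = 0.7
breakpoints = [i + 1 for i in range(len(embeddings) - 1)
               if cosine(embeddings[i], embeddings[i + 1]) < threshold]
print(breakpoints)  # → [2]: the boundary between the two topics
```

A production splitter would compute real embeddings and may pick the threshold adaptively (e.g., from a percentile of the observed similarity drops), but the breakpoint logic is the same.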

🤝 Contributing

We welcome contributions, such as new strategies (e.g., a CodeSplitter for Python/JS) or integrations with other embedding models for semantic splitting.

📜 License

Apache 2.0 License © 2025 Sayouzone

Project details


Download files

Download the file for your platform.

Source Distribution

sayou_chunking-0.3.2.tar.gz (22.7 kB)

Uploaded Source

Built Distribution

sayou_chunking-0.3.2-py3-none-any.whl (24.9 kB)

Uploaded Python 3

File details

Details for the file sayou_chunking-0.3.2.tar.gz.

File metadata

  • Download URL: sayou_chunking-0.3.2.tar.gz
  • Upload date:
  • Size: 22.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for sayou_chunking-0.3.2.tar.gz

  • SHA256: 2564a6dcb745ab84efff600cca094018f1622821c11b236478f36f4ba776214e
  • MD5: 7cc9012944c32a440b7f15ba40dd8217
  • BLAKE2b-256: ffc05de4973b6119a8bc0c02812e35a5b65c16762c70dfde51785c0880572fcf


File details

Details for the file sayou_chunking-0.3.2-py3-none-any.whl.

File metadata

  • Download URL: sayou_chunking-0.3.2-py3-none-any.whl
  • Upload date:
  • Size: 24.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for sayou_chunking-0.3.2-py3-none-any.whl

  • SHA256: 600332fcd12087284c3baf94757b63a6ea1d195b95dc64c0fbe288c66b1209f0
  • MD5: 9340ec5c3b8fc852495ff5503b2722cd
  • BLAKE2b-256: 4a52605ee119cc451afd38dc68da375610159a1cca45b26e06ee18370fd0eec6
