Chunking components for the Sayou Data Platform

Project description

sayou-chunking

The Intelligent Text Splitter for Sayou Fabric.

sayou-chunking splits large texts into smaller, semantically meaningful units called Chunks. This is a critical step for RAG (Retrieval-Augmented Generation) systems, as it directly impacts retrieval accuracy.

It goes beyond simple character splitting by offering structure-aware, semantic, and hierarchical chunking strategies.

💡 Core Philosophy

"Context is King."

Blindly cutting text at 500 characters breaks sentences and loses meaning. sayou-chunking aims to preserve context by:

  1. Structure Awareness: Respects document headers, tables, and code blocks (especially in Markdown).
  2. Semantic Coherence: Groups sentences that belong to the same topic using similarity metrics.
  3. Hierarchy: Maintains parent-child relationships, so retrieval can match small, precise chunks while the LLM still receives the larger parent context.
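The parent-child idea above can be sketched in a few lines of plain Python. This is an illustrative toy, not the sayou-chunking API: the `Chunk` dataclass and `build_hierarchy` helper here are hypothetical names, and fixed-size slicing stands in for the library's real splitting strategies.

```python
from dataclasses import dataclass, field
import uuid

@dataclass
class Chunk:
    content: str
    metadata: dict = field(default_factory=dict)

def build_hierarchy(text: str, parent_size: int = 200, child_size: int = 50):
    """Split text into large parent chunks, then split each parent into
    small child chunks that carry a back-reference to their parent."""
    parents, children = [], []
    for i in range(0, len(text), parent_size):
        parent = Chunk(text[i:i + parent_size], {"id": str(uuid.uuid4())})
        parents.append(parent)
        for j in range(0, len(parent.content), child_size):
            children.append(Chunk(parent.content[j:j + child_size],
                                  {"parent_id": parent.metadata["id"]}))
    return parents, children
```

At query time, the small children are embedded and searched; the `parent_id` link is then followed so the LLM sees the full parent passage rather than a 50-character fragment.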

📦 Installation

pip install sayou-chunking

⚡ Quick Start

The ChunkingPipeline provides a unified interface for various splitting strategies.

from sayou.chunking.pipeline import ChunkingPipeline

def run_demo():
    # 1. Initialize Pipeline
    pipeline = ChunkingPipeline()
    pipeline.initialize()

    # 2. Prepare Input (e.g., from Refinery)
    text_content = """
    # Section 1: Introduction
    Chunking is the process of breaking text down.
    
    ## Benefits
    - Better Retrieval
    - Context Preservation
    """
    
    request = {
        "content": text_content,
        "metadata": {"source": "doc.md"},
        "config": {"chunk_size": 50}
    }

    # 3. Run with Strategy ('markdown', 'recursive', 'semantic', etc.)
    chunks = pipeline.run(request, strategy="markdown")

    # 4. Result
    for i, chunk in enumerate(chunks):
        print(f"[{i}] Type: {chunk.metadata.get('semantic_type')}")
        print(f"    Content: {chunk.content}")

if __name__ == "__main__":
    run_demo()

🔑 Key Components

Splitter

  • RecursiveSplitter: The standard strategy. Splits by paragraph -> line -> sentence -> word to keep related text together.
  • MarkdownSplitter: Aware of Markdown syntax. Splits by headers (#) first, protecting tables and code blocks.
  • FixedLengthSplitter: Hard split by character count. Useful when strict token limits are required.
  • StructureSplitter: Splits based on user-defined regex patterns (e.g., "Article \d+").
  • SemanticSplitter: Uses cosine similarity between sentences to find topic breakpoints.
  • ParentDocumentSplitter: Creates large "Parent" chunks for context and small "Child" chunks for retrieval, linking them together.
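To make the recursive strategy concrete, here is a minimal, self-contained sketch of paragraph → line → sentence → word splitting. It is an assumption-laden illustration of the technique, not the actual RecursiveSplitter implementation: function name, separator list, and fallback behavior are all hypothetical.

```python
def recursive_split(text, chunk_size, separators=("\n\n", "\n", ". ", " ")):
    """Split on the coarsest separator available, recursing with finer
    separators until every piece fits within chunk_size."""
    if len(text) <= chunk_size:
        return [text] if text.strip() else []
    for depth, sep in enumerate(separators):
        if sep in text:
            chunks = []
            for part in text.split(sep):
                chunks.extend(recursive_split(part, chunk_size, separators[depth + 1:]))
            return chunks
    # No separator left: hard-split by character count as a last resort.
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
```

The key design choice is the ordered separator list: coarse boundaries (paragraphs) are tried first, so related text stays together, and character-level splitting only happens when nothing else fits.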

🤝 Contributing

We welcome contributions of new strategies (e.g., a CodeSplitter for Python/JS) and integrations with additional embedding models for semantic splitting.

📜 License

Apache 2.0 License © 2025 Sayouzone

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sayou_chunking-0.2.2.tar.gz (20.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

sayou_chunking-0.2.2-py3-none-any.whl (22.7 kB)

Uploaded Python 3

File details

Details for the file sayou_chunking-0.2.2.tar.gz.

File metadata

  • Download URL: sayou_chunking-0.2.2.tar.gz
  • Size: 20.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for sayou_chunking-0.2.2.tar.gz

  • SHA256: 183c6baf56e0f0de58d4393e503c155dca5079d31f2ca478783765562e88900f
  • MD5: 979f3300ea515ad073791947ea13f54a
  • BLAKE2b-256: a361c79cbb1f05fd787f513c42f2cbff7eedc28306af3795daa8684d7a1fb52d


File details

Details for the file sayou_chunking-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: sayou_chunking-0.2.2-py3-none-any.whl
  • Size: 22.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for sayou_chunking-0.2.2-py3-none-any.whl

  • SHA256: d173333b299e10f150d685733344c49b73b22e54e73dd187cb59a44e43ca5101
  • MD5: 6b182a70a0c0fe8a590047d3935abb7d
  • BLAKE2b-256: ad076ecdd52fe1a443f66d3f9e2b69e092c9c4ad2dfca47aadab3bc313d6fdae

