
Connector components for the Sayou Data Platform

Project description

sayou-connector


The Universal Data Ingestion Engine for Sayou Fabric.

sayou-connector provides a unified interface to fetch data from diverse sources—Local Files, Web URLs, and Databases—normalizing everything into a standard format called SayouPacket.

It separates the logic of Navigation (Generator) from Retrieval (Fetcher), enabling complex recursive crawling and pagination strategies out of the box.

💡 Core Philosophy

"Navigate First, Fetch Later."

Data collection is not just about downloading; it's about discovery. We decouple the responsibility into two roles:

  1. Generator (Navigator): The "Brain". It decides what to fetch next (e.g., calculates DB offsets, finds next page links) and yields a Task.
  2. Fetcher (Driver): The "Muscle". It executes the actual retrieval (e.g., HTTP GET, SQL Query) and returns a Packet.

This separation enables the Feedback Loop, where the result of a fetch (e.g., found links) feeds back into the Generator to discover more targets.
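Conceptually, the loop behaves like the sketch below. This is an illustrative model only, not the package's actual API: the Task and Packet dataclasses and the fetch / extract_links callables are simplified stand-ins.

from collections import deque
from dataclasses import dataclass

@dataclass
class Task:
    uri: str
    depth: int = 0

@dataclass
class Packet:
    task: Task
    data: object = None
    success: bool = True

def feedback_loop(seed, fetch, extract_links, max_depth=1):
    """Toy Generator/Fetcher loop: the frontier is Generator state,
    'fetch' plays the Fetcher, and discovered links feed back in."""
    frontier = deque([Task(seed, depth=0)])  # Generator: decides what to fetch next
    seen = {seed}
    while frontier:
        task = frontier.popleft()
        packet = Packet(task, data=fetch(task.uri))  # Fetcher: performs the retrieval
        yield packet
        if task.depth < max_depth:  # Feedback: fetch results create new tasks
            for link in extract_links(packet.data):
                if link not in seen:
                    seen.add(link)
                    frontier.append(Task(link, depth=task.depth + 1))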

📦 Installation

pip install sayou-connector

⚡ Quick Start

The ConnectorPipeline manages the feedback loop between Generators and Fetchers.

from sayou.connector.pipeline import ConnectorPipeline

def run_demo():
    # 1. Initialize Pipeline
    pipeline = ConnectorPipeline()
    pipeline.initialize()

    # 2. Run (Example: Web Crawling)
    print("Starting Web Crawl...")
    
    # Returns an iterator of 'SayouPacket' objects
    packets = pipeline.run(
        source="https://news.daum.net/tech",
        strategy="requests",
        link_pattern=r"https://v\.daum\.net/v/\d+",
        max_depth=1
    )

    # 3. Process Results (Stream)
    for packet in packets:
        if packet.success:
            print(f"[Fetched] {packet.task.uri}")
            # packet.data contains the extracted content (dict, bytes, etc.)
            print(f"   Data: {str(packet.data)[:50]}...")
        else:
            print(f"[Error] {packet.error}")

if __name__ == "__main__":
    run_demo()

🔑 Key Concepts

Generators

  • FileGenerator: Recursively scans directories to find files matching extensions or patterns.
  • SqlGenerator: Generates paginated SQL queries (LIMIT/OFFSET) to fetch large tables in batches (see the pagination sketch after this list).
  • WebCrawlGenerator: Manages a URL frontier queue for BFS/DFS web crawling with depth control.
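The LIMIT/OFFSET strategy behind SqlGenerator can be pictured with the minimal sketch below. It is not the library's API; execute, table, and batch_size are illustrative stand-ins for whatever the real generator parameterizes.

def fetch_all_rows(execute, table, batch_size=1000):
    """Illustrative LIMIT/OFFSET pagination. 'execute' is a stand-in callable
    that runs a SQL string and returns a list of rows."""
    offset = 0
    while True:
        # Only interpolate trusted table names; values should be bound as parameters.
        rows = execute(f"SELECT * FROM {table} LIMIT {batch_size} OFFSET {offset}")
        if not rows:  # an empty page means the table is exhausted
            break
        yield from rows
        offset += batch_size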

Fetchers

  • FileFetcher: Reads binary or text content from the local file system.
  • SqliteFetcher: Executes SQL queries against SQLite databases securely.
  • SimpleWebFetcher: Fetches HTML pages and extracts data/links using BeautifulSoup (see the sketch after this list).
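For intuition, a SimpleWebFetcher-style retrieval step roughly amounts to the sketch below. It is an assumption-laden simplification, not the package's implementation: download one page with requests and collect the links that match a pattern with BeautifulSoup.

import re
import requests
from bs4 import BeautifulSoup

def fetch_page(url, link_pattern=r".*", timeout=10):
    """Illustrative web fetch: return the page HTML plus matching outbound links."""
    response = requests.get(url, timeout=timeout)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    links = [a["href"] for a in soup.find_all("a", href=True)
             if re.match(link_pattern, a["href"])]
    return {"html": response.text, "links": links}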

🤝 Contributing

We welcome contributions for new Fetchers (e.g., S3Fetcher, KafkaFetcher) or Generators (e.g., SitemapGenerator)!
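As a starting point, the core retrieval a hypothetical S3Fetcher would wrap might look like the sketch below; how it plugs into the package's Fetcher interface is an assumption to verify against the source.

import boto3

def fetch_s3_object(bucket, key):
    """Read one S3 object's bytes; a contributed S3Fetcher would wrap something like this."""
    s3 = boto3.client("s3")
    response = s3.get_object(Bucket=bucket, Key=key)
    return response["Body"].read()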

📜 License

Apache 2.0 License © 2025 Sayouzone

Download files

Download the file for your platform.

Source Distribution

  • sayou_connector-0.3.21.tar.gz (41.2 kB, Source)

Built Distribution

  • sayou_connector-0.3.21-py3-none-any.whl (60.1 kB, Python 3)

File details

Details for the file sayou_connector-0.3.21.tar.gz.

File metadata

  • Download URL: sayou_connector-0.3.21.tar.gz
  • Upload date:
  • Size: 41.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for sayou_connector-0.3.21.tar.gz

  • SHA256: b322992bf14c6f3d94dcde486e39066aae0dee5c35bef8be597fde14a7431be4
  • MD5: 2918e14f815ff0636a6d99f561867b5c
  • BLAKE2b-256: 67277cb1e08c72fdd9a2e71280c97bc3582e7de083ed9dad9fb119495c2f9da1


File details

Details for the file sayou_connector-0.3.21-py3-none-any.whl.

File hashes

Hashes for sayou_connector-0.3.21-py3-none-any.whl

  • SHA256: 780c0115fa3dc38ea9be7330fa68d5c21b4b5f7d566675b301007199de280955
  • MD5: e049029bbcb1b72ac13f32070264d169
  • BLAKE2b-256: 8102d6ba7416b43efe74d93d20a24882b5b870fb4cd87e62ccdc7fa0917c69ca

