
Library for building blockchain pipelines


Tiders


Tiders is an open-source framework that simplifies getting data out of blockchains and into your favorite tools. Whether you are building a DeFi dashboard, tracking NFT transfers, or running complex analytics, Tiders handles the heavy lifting of fetching, cleaning, transforming and storing blockchain data.

Tiders is modular. A Tiders pipeline is built from four components:

| Component | Description |
|---|---|
| Provider | Data source (HyperSync, SQD, or RPC) |
| Query | What data to fetch (block range, transactions, logs, filters, field selection) |
| Steps | Transformations to apply (decode, cast, encode, custom) |
| Writer | Output destination |
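In YAML (CLI mode), a pipeline spelling out these four components might look like the sketch below. Every key name and value here is an illustrative assumption, not the exact schema; consult the documentation for the real configuration keys:

```yaml
# Illustrative pipeline config -- key names are assumptions, not the exact schema.
provider:
  kind: hypersync          # data source: hypersync, sqd, or rpc
query:
  from_block: 17000000     # block range to fetch
  logs:
    - address: "0x0000000000000000000000000000000000000000"  # placeholder contract
steps:
  - kind: evm_decode_events   # built-in decode step
writer:
  kind: parquet            # output destination
  path: ./data/
```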

Why Tiders?

Most indexers lock you into a specific platform or database. Tiders is built to be modular, meaning you can swap parts in and out without breaking your setup:

  • Swap Providers: Don't like your current data source? Switch between HyperSync, SQD, or a standard RPC node by changing one line of code.
  • Plug-and-Play data transformations: Need to decode smart contract events or change data types? Use our built-in Rust-powered steps or write your own custom logic.
  • Write Anywhere: Send your data to a local DuckDB file for prototyping, or a production-grade ClickHouse or PostgreSQL instance when you're ready to scale.
  • Modular, Reusable Pipelines: Protocols often reuse the same data structures, so you don't need to write modules from scratch every time. Since Tiders pipelines are regular Python objects, you can build functions around them, reuse them across pipelines, or expose input parameters to customize as needed.
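As an illustration of this reuse pattern, the sketch below builds pipelines from a parameterized factory function. Plain dicts stand in for real Tiders objects, so none of the keys reflect the actual API, and the contract addresses are placeholders:

```python
# Illustration of reusable, parameterized pipelines.
# Plain dicts stand in for real Tiders objects -- keys are NOT the actual API.

def transfer_pipeline(contract: str, from_block: int, writer: dict) -> dict:
    """Build a transfer-indexing pipeline for any contract and destination."""
    return {
        "provider": {"kind": "hypersync"},
        "query": {"from_block": from_block, "logs": [{"address": contract}]},
        "steps": [{"kind": "evm_decode_events"}],
        "writer": writer,
    }

# One factory, many pipelines (placeholder addresses):
reth = transfer_pipeline("0xRETH_PLACEHOLDER", 17_000_000,
                         {"kind": "parquet", "path": "reth/"})
usdc = transfer_pipeline("0xUSDC_PLACEHOLDER", 6_000_000,
                         {"kind": "duckdb", "path": "usdc.db"})
```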

Getting Started

See the Getting Started section of the docs.

Two ways to use Tiders

| Mode | How | When to use |
|---|---|---|
| Python SDK | Write a Python script, `import tiders` | Full control, custom logic, complex pipelines |
| CLI (no-code) | Write a YAML config, run `tiders start` | Quick setup, no Python required, standard pipelines |

Both modes share the same pipeline engine.

You can also use tiders codegen to generate a Python script from a YAML config — a quick way to move from no-code to full Python control.

Installation

For both CLI and SDK

pip install tiders

Features

  • Continuous Ingestion: Keep your datasets live and fresh. Tiders can poll the chain head to ensure your data is always up to date.
  • Switch Providers: Move between HyperSync, SQD, or standard RPC nodes with a single config change.
  • No Vendor Lock-in: Use the best data providers in the industry without being tied to their specific platforms or database formats.
  • Custom Logic: Easily extend and customize your pipeline code in Python for complete flexibility.
  • Advanced Analytics: Works seamlessly with industry-standard tools like Polars, Pandas, DataFusion, and PyArrow as the data is fetched.
  • Multiple Outputs: Send the same data to a local file and a production database simultaneously.
  • Rust-Powered Speed: Core tasks like decoding and transforming data are handled in Rust, giving you massive performance without needing to learn a low-level language.
  • Parallel Execution: Tiders doesn't wait around. While it's writing the last batch of data to your database, it’s already fetching and processing the next one in the background.
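The parallel-execution idea can be sketched in plain Python: while one batch is being written, the next fetch is already running. The stand-alone toy below uses only the standard library and is not the actual Tiders engine:

```python
# Toy illustration of pipelined fetch/write -- not the actual Tiders engine.
from concurrent.futures import ThreadPoolExecutor

def fetch(batch_no: int) -> list[int]:
    """Pretend to fetch a batch of 10 blocks from a provider."""
    return list(range(batch_no * 10, batch_no * 10 + 10))

def write(batch: list[int], sink: list[int]) -> None:
    """Pretend to write a batch to the destination."""
    sink.extend(batch)

sink: list[int] = []
with ThreadPoolExecutor(max_workers=2) as pool:
    next_fetch = pool.submit(fetch, 0)
    for batch_no in range(1, 4):
        batch = next_fetch.result()
        next_fetch = pool.submit(fetch, batch_no)  # fetch ahead...
        write(batch, sink)                         # ...while writing the current batch
    write(next_fetch.result(), sink)

print(len(sink))  # 4 batches of 10 items -> 40
```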

Data Providers

Connect to the best data sources in the industry without vendor lock-in. Tiders decouples the provider from the destination, giving you a consistent way to fetch data.

| Provider | Ethereum (EVM) | Solana (SVM) |
|---|---|---|
| HyperSync | | |
| SQD | | |
| RPC | | |

Tiders can support new providers. If your project has custom APIs to fetch blockchain data, especially ones that support server-side filtering, you can create a client for it, similar to the Tiders RPC client. Get in touch with us.

Transformations

Leverage the tools you already know. Tiders automatically converts each data batch into your engine's native format, enabling seamless custom transformations on every incoming increment immediately before it is written.

| Engine | Data format in your function | Best for |
|---|---|---|
| Polars | Dict[str, pl.DataFrame] | Fast columnar operations, expressive API |
| Pandas | Dict[str, pd.DataFrame] | Familiar API, complex row-level operations |
| DataFusion | Dict[str, datafusion.DataFrame] | SQL-based transformations, lazy evaluation |
| PyArrow | Dict[str, pa.Table] | Zero-copy, direct Arrow manipulation |
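A custom step is just a Python function that receives the batch in your engine's format and returns it transformed. The sketch below uses plain lists of row-dicts in place of DataFrames so it runs without any engine installed; with Polars, for example, the argument would instead be a Dict[str, pl.DataFrame]. The table name and registration details are assumptions:

```python
# Illustrative custom step. In a real pipeline the values would be
# pl.DataFrame / pd.DataFrame / pa.Table etc.; plain lists of row-dicts
# are used here so the sketch runs stand-alone.

def keep_large_transfers(batch: dict[str, list[dict]]) -> dict[str, list[dict]]:
    """Drop transfer rows below a value threshold before they are written."""
    transfers = batch.get("transfers", [])
    batch["transfers"] = [row for row in transfers if row["value"] >= 1_000]
    return batch

batch = {
    "transfers": [
        {"tx": "0xaa", "value": 5_000},
        {"tx": "0xbb", "value": 10},
    ],
}
out = keep_large_transfers(batch)
print(len(out["transfers"]))  # 1
```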

Supported Output Formats

Whether you're writing to a local file or a production-grade data lake, Tiders handles the schema mapping and batch loading for your destination of choice.

| Destination | Type | Description |
|---|---|---|
| DuckDB | Database | Embedded analytical database, great for local exploration and prototyping |
| ClickHouse | Database | Column-oriented database optimized for real-time analytical queries |
| PostgreSQL | Database | General-purpose relational database with broad ecosystem support |
| Apache Iceberg | Table Format | Open table format for large-scale analytics on data lakes |
| Delta Lake | Table Format | Storage layer with ACID transactions for data lakes |
| Parquet | File | Columnar file format, efficient for analytical workloads |
| CSV | File | Plain-text format, widely compatible and easy to inspect |

Examples

| Example | Chain | Provider | Decoding | Writer |
|---|---|---|---|---|
| rETH Transfer (no code) | Ethereum (EVM) | HyperSync | EVM event decode | Parquet |
| Jupiter Swaps | Solana (SVM) | SQD | SVM instruction decode | DuckDB |
| Uniswap V3 | Ethereum (EVM) | HyperSync / SQD / RPC | EVM event decode (factory + children) | DuckDB / Parquet / Delta Lake / ClickHouse / Iceberg |
  • rETH Transfer — Simplest starting point. Uses a YAML config with no Python code to index a single event from a single contract.
  • Jupiter Swaps — Uses the Python SDK on Solana. Shows instruction decoding and custom Polars steps.
  • Uniswap V3 — Demonstrates the factory + children two-stage indexing pattern, chaining two pipelines where the first discovers contracts and the second indexes their events.
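The factory + children pattern can be sketched as two chained stages: the first scans factory events to discover child contract addresses, the second queries only those addresses. The code below is a self-contained toy with made-up event data, not the Tiders API:

```python
# Toy two-stage "factory + children" indexing -- illustrative only.

FACTORY_EVENTS = [  # pretend output of stage 1's decoded PoolCreated logs
    {"event": "PoolCreated", "pool": "0xpool1"},
    {"event": "PoolCreated", "pool": "0xpool2"},
]

def discover_children(events: list[dict]) -> list[str]:
    """Stage 1: extract child contract addresses from factory events."""
    return [e["pool"] for e in events if e["event"] == "PoolCreated"]

def build_child_query(children: list[str]) -> dict:
    """Stage 2: build a query filtered to the discovered contracts."""
    return {"logs": [{"address": addr} for addr in children]}

children = discover_children(FACTORY_EVENTS)
query = build_child_query(children)
print(len(query["logs"]))  # 2
```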

Browse all examples in the examples/ directory.

Logging

The Python code uses Python's standard logging module, so it can be configured as described in the Python documentation.
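For example, a minimal configuration using only the standard library (the logger name "tiders" is an assumption; check the actual logger names the package uses):

```python
import logging

# Route Python-side logs to stderr with timestamps at INFO level.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)

# The logger name "tiders" is assumed here; adjust to the package's actual loggers.
logging.getLogger("tiders").setLevel(logging.DEBUG)

log = logging.getLogger("tiders")
log.debug("debug messages from tiders.* are now visible")
```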

Set the RUST_LOG environment variable (see the env_logger docs) to see logs from the Rust modules.

To run an example with trace-level logging for the Rust modules:

RUST_LOG=trace uv run examples/path/to/my/example

Development

Tiders is composed of several repositories, three of which are maintained by the project.


This repo uses uv for development. Clone all three projects side by side:

git clone https://github.com/yulesa/tiders.git
git clone https://github.com/yulesa/tiders-core.git
git clone https://github.com/yulesa/tiders-rpc-client.git

Local development with tiders-core

Configure tiders to use your local tiders-core Python package by adding the following to pyproject.toml:

[tool.uv.sources]
tiders-core = { path = "../tiders-core/python", editable = true }

Then sync the environment:

cd tiders
uv sync

For full instructions including building tiders-core and tiders-rpc-client from source, see the Development Setup docs.

The core libraries for ingesting, decoding, validating, and transforming blockchain data are implemented in the tiders-core repository.

Acknowledgements

Tiders is a fork of Cherry and cherry-core, a blockchain data pipeline framework built by the SteelCake team. Cherry laid the architectural foundation that Tiders builds upon, and we're grateful for their work and the open-source spirit that made this continuation possible.

License

Licensed under either of

  • Apache License, Version 2.0
  • MIT License

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
