
Cherry Event Indexer

A flexible blockchain event indexing and data processing pipeline.

Overview

Cherry Event Indexer is a modular system for:

  • Ingesting blockchain events and logs
  • Processing and transforming blockchain data
  • Writing data to various storage backends

Features

  • Modular Pipeline Architecture

    • Configurable data providers
    • Customizable processing steps
    • Pluggable storage backends
  • Built-in Steps

    • EVM block validation
    • Event decoding
    • Custom processing steps
  • Storage Options

    • Local Parquet files
    • AWS S3
    • More coming soon...

Project Structure

cherry/
├── src/
│   ├── config/      # Configuration parsing
│   ├── utils/       # Pipeline and utilities
│   └── writers/     # Storage backends
├── examples/        # Example implementations
├── tests/           # Test suite
└── config.yaml      # Pipeline configuration

Prerequisites

  • Python 3.10 or higher
  • Docker and Docker Compose
  • MinIO (for local S3-compatible storage)

Installation Steps

Clone the repository and go to the project root:

git clone https://github.com/steelcake/cherry.git
cd cherry

Create and activate a virtual environment:

# Create virtual environment (all platforms)
python -m venv .venv

# Activate virtual environment

# For Windows with git bash:
source .venv/Scripts/activate

# For macOS/Linux:
source .venv/bin/activate

Install dependencies:

pip install -r requirements.txt

Set up environment variables:

Create a .env file in the project root and add your Hypersync API token.
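A .env file is a plain list of KEY=value lines. The variable name below is an assumption for illustration; check the project's examples for the exact name it reads:

```
# Hypersync API token (variable name is an assumption -- verify against the examples/ directory)
HYPERSYNC_API_TOKEN=your-token-here
```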

Quick Start

  1. Create a config file (config.yaml) in the project root

  2. Run the script:

python main.py
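The exact config schema is defined by the project; the sketch below is an assumption pieced together from the providers, steps, and writers described on this page, and the key names (provider, writer, their kind values) should be checked against the examples/ directory:

```yaml
# Hypothetical pipeline layout -- field names other than the steps block are assumptions
pipelines:
  my_pipeline:
    provider:
      kind: hypersync            # assumed provider kind
    steps:
      - name: my_get_block_number_stats
        kind: get_block_number_stats
        config:
          input_table: logs
          output_table: block_number_stats
    writer:
      kind: local_parquet        # or an S3 writer; see "Storage Options"
      config:
        path: ./data
```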

Custom Processing Steps

To add a custom processing step, you need to:

  1. Define the step function
  2. Add the step to the context
  3. Add the step to the config

example: get_block_number_stats.py

from typing import Any, Dict

import pyarrow as pa


def get_block_number_stats(
    data: Dict[str, pa.RecordBatch], step_config: Dict[str, Any]
) -> Dict[str, pa.RecordBatch]:
    """Custom processing step that summarizes block numbers."""
    pass

config.yaml

steps:
  - name: my_get_block_number_stats
    kind: get_block_number_stats
    config:
      input_table: logs
      output_table: block_number_stats

Running the Project

Start MinIO server (for local S3 storage):

# Navigate to docker-compose directory
cd docker-compose

# Start MinIO using docker-compose
docker-compose up -d

# Return to project root
cd ..

Default credentials:

Access Key: minioadmin
Secret Key: minioadmin
Console URL: http://localhost:9001

Note: The MinIO service will be automatically configured with the correct ports and volumes as defined in the docker-compose.yml file.

Configure pipelines:

  • Open config.yaml
  • Adjust query, event filters, and batch sizes as needed for your pipeline
  • Configure writer settings (S3/local parquet etc.)
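For the local MinIO setup, the writer section would point at the MinIO endpoint with the default credentials above. The key names below are assumptions; only the endpoint, credentials, and the existence of an S3 writer come from this page:

```yaml
# Hypothetical S3 writer config for local MinIO -- key names are assumptions
writer:
  kind: aws_s3
  config:
    endpoint: http://localhost:9000   # MinIO S3 API (console is on 9001)
    access_key: minioadmin
    secret_key: minioadmin
    bucket: cherry-data               # hypothetical bucket name
```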

Run the indexer:

python main.py

Download files

Source distribution: cherry_indexer-0.1.0.tar.gz (12.4 kB)

Built distribution: cherry_indexer-0.1.0-py3-none-any.whl (15.8 kB)

File details

Details for the file cherry_indexer-0.1.0.tar.gz.

File metadata

  • Download URL: cherry_indexer-0.1.0.tar.gz
  • Size: 12.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.11

File hashes

  • SHA256: bbd7bce6fd1620d09836c5775b86c8e9ae2af422508443d14c36ba19a3732663
  • MD5: b6e9ef921af44329e0c589ff8d9da6c8
  • BLAKE2b-256: fcdf71f95c05cc011fae75f637973beb7b8e66fff0d1ee96e1a3d8cbe5a568af

File details

Details for the file cherry_indexer-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: cherry_indexer-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 15.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.11

File hashes

Hashes for cherry_indexer-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 e23dbd76b774c7b5dbd7f6962c6361314e949f7c2519a18af9f04e346088fe46
MD5 019c1012e7e4984fba205e89673df731
BLAKE2b-256 2f0ed1e41a0980aed6517cab2de36aeaf9b4d5868eec93ca46282925198773ab

See more details on using hashes here.
