cherry-pipelines
This is a collection of blockchain data pipelines built using cherry and ClickHouse materialized views.
All data is stored in ClickHouse.
Python version
This project is meant to be run with Python 3.12.
If you are using uv for development, the correct version should be picked up automatically via the .python-version file in the project root.
The Docker image is configured to use this version of Python as well.
Running a pipeline
Use the main script to run a pipeline:
uv run scripts/main.py
It takes these parameters as environment variables:
- CHERRY_PIPELINE_KIND: "evm" or "svm".
- CHERRY_PIPELINE_NAME: name of the pipeline to run, e.g. "erc20_transfers".
- CHERRY_FROM_BLOCK: the block that indexing should start from. Defaults to 0.
- CHERRY_TO_BLOCK: the block that indexing should stop at. Has no default; if left empty, indexing waits for new blocks when it reaches the tip of the chain.
- CHERRY_EVM_PROVIDER_KIND: which provider to use when indexing EVM chains. Can be hypersync or sqd. Has no default and is required when indexing EVM.
- CHERRY_EVM_CHAIN_ID: the chain_id when indexing an EVM chain. Has no default and is required when indexing EVM.
- CHERRY_PROVIDER_BUFFER_SIZE: buffering between ingestion, processing, and the writer. Increasing this might improve performance but can also cause higher memory usage. Defaults to 2.
- CHERRY_INIT_DB: runs the DB setup script instead of the pipeline script if set to "true".
- CLICKHOUSE_HOST: defaults to 127.0.0.1.
- CLICKHOUSE_PORT: defaults to 8123.
- CLICKHOUSE_USER: defaults to default.
- CLICKHOUSE_PASSWORD: defaults to an empty string.
- RUST_LOG: as explained in the env-logger docs.
- PY_LOG: as explained in the Python logging docs. Defaults to "INFO".
An .env file placed in the project root can be used to define these for development.
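For example, a development .env for running the erc20_transfers pipeline against an EVM chain might look like the following. All values shown are illustrative assumptions (chain ID 1 is Ethereum mainnet; the starting block is arbitrary):

```shell
# .env — example development configuration (illustrative values)
CHERRY_PIPELINE_KIND=evm
CHERRY_PIPELINE_NAME=erc20_transfers
CHERRY_EVM_PROVIDER_KIND=hypersync
CHERRY_EVM_CHAIN_ID=1
CHERRY_FROM_BLOCK=18000000
# CHERRY_TO_BLOCK left unset: index to the tip, then wait for new blocks
CLICKHOUSE_HOST=127.0.0.1
CLICKHOUSE_PORT=8123
CLICKHOUSE_USER=default
CLICKHOUSE_PASSWORD=
PY_LOG=INFO
```

With this file in place, `uv run scripts/main.py` picks the values up without exporting them manually.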
Running with docker
We publish a docker image that runs the main script.
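Assuming the image name is cherry-pipelines (hypothetical — check the registry for the actual published name), a run could be sketched as:

```shell
# Run a pipeline from the published image (image name is hypothetical).
# --env-file reuses the same .env described above; --network host lets the
# container reach a ClickHouse instance on the host at 127.0.0.1:8123.
docker run --rm \
  --env-file .env \
  --network host \
  cherry-pipelines:latest
```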
Dev Setup
Run the docker-compose file to start a ClickHouse instance for development:
docker-compose up -d
Run this to delete the data on disk:
docker-compose down -v
And this to stop the container without deleting the data:
docker-compose down
Development
This repo uses uv for development.
- Format the code with uv run ruff format
- Lint the code with uv run ruff check
- Run type checks with uv run pyright
- Run the tests with uv run pytest
Data Provider
All SVM pipelines use SQD.
All EVM pipelines are configurable using the CHERRY_EVM_PROVIDER_KIND env variable.
License
Licensed under either of
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.
Contribution
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file cherry_pipelines-0.0.17.tar.gz.
File metadata
- Download URL: cherry_pipelines-0.0.17.tar.gz
- Upload date:
- Size: 86.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.7.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 550010b6609c9ffb6044af0e66cda69ef55f0deb49c2692f7b88c83994261249 |
| MD5 | 889f3dd72680dd2186c9b30c767a9316 |
| BLAKE2b-256 | 63566e10579f6fcd7d2fabaf33ebed8998352a3717130af0dc56e7cc3fcb814c |
File details
Details for the file cherry_pipelines-0.0.17-py3-none-any.whl.
File metadata
- Download URL: cherry_pipelines-0.0.17-py3-none-any.whl
- Upload date:
- Size: 28.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.7.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7a6e5e6954a153b1010c63456051e9a13d4752e33b2ebf754f3012a43fd880d1 |
| MD5 | cd58a4ed1f3fb32a128aee4e00c3c0cd |
| BLAKE2b-256 | 5d4901481aedbe7ff888d98962d64b5ee582798dec5582b24a1eaffc133615b5 |