Abstract Block Dumper


This package provides a simplified framework for creating block processing tasks in Django applications. Define tasks with lambda conditions using the @block_task decorator and run them asynchronously with Celery.

Usage

[!IMPORTANT] This package uses ApiVer; make sure to import from abstract_block_dumper.v1.

Versioning

This package uses Semantic Versioning. TL;DR: you are safe to use the compatible-release version specifier ~=MAJOR.MINOR in your pyproject.toml or requirements.txt.

Additionally, this package uses ApiVer to further reduce the risk of breaking changes. This means the public API of this package is explicitly versioned, e.g. abstract_block_dumper.v1, and will not change in a backwards-incompatible way even when abstract_block_dumper.v2 is released.

Internal packages, i.e. those prefixed with abstract_block_dumper._, do not share these guarantees and may change in a backwards-incompatible way at any time, even in patch releases.

Implementation Details

General Workflow:

Register functions -> detect new blocks -> evaluate conditions -> send to Celery -> execute -> track results -> handle retries.

Workflow Steps

  1. Register
  • Functions are automatically discovered when the scheduler starts
  • Functions must live in tasks.py or block_tasks.py inside an installed app
  • Functions marked with the @block_task decorator are stored in an in-memory registry
  2. Detect Blocks
  • The scheduler runs via the block_tasks_v1 management command
  • It polls the blockchain, finds new blocks, and batches them
  3. Plan Tasks
  • For each block, lambda conditions are evaluated against registered functions
  • Tasks are created for matching conditions (with optional multiple argument sets)
  4. Queue
  • Tasks are sent to Celery with queue and timeout settings from celery_kwargs
  5. Execute
  • Celery runs the function with block info, capturing results and errors
  6. Track
  • Task attempts are stored in the TaskAttempt model with retry logic and state tracking
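
The register and plan steps above can be sketched in miniature. This is a simplified illustration, not the package's actual internals; the registry structure and helper names here are assumptions:

```python
# Miniature sketch of the registry and planning steps (illustrative only).
registry = []

def block_task(condition, args=None):
    """Record the function, its condition, and optional argument sets."""
    def decorator(fn):
        registry.append((fn, condition, args or [{}]))
        return fn
    return decorator

@block_task(condition=lambda bn: bn % 10 == 0)
def process_every_10_blocks(block_number):
    return block_number

def plan_tasks(block_number):
    """Evaluate each registered condition; collect (fn, kwargs) for matches."""
    planned = []
    for fn, condition, arg_sets in registry:
        for kwargs in arg_sets:
            if condition(block_number, **kwargs):
                planned.append((fn, kwargs))
    return planned
```

In the real package, planned tasks are then handed to Celery instead of being returned as a list.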

Prerequisites

  • Django
  • Celery
  • Redis (for Celery broker and result backend)
  • PostgreSQL (recommended for production)

Installation

  1. Install the package:

pip install abstract_block_dumper

  2. Add to your Django INSTALLED_APPS:

INSTALLED_APPS = [
    # ... other apps
    'abstract_block_dumper',
]

  3. Run migrations:

python manage.py migrate

Usage

1. Define Block Processing Tasks

Create block processing tasks in tasks.py or block_tasks.py file inside any of your installed Django apps.

2. Use Decorators to Register Tasks

  • Use @block_task with lambda conditions to create custom block processing tasks

3. Start the Block Scheduler

Run the scheduler to start processing blocks:

$ python manage.py block_tasks_v1

This command will:

  • Automatically discover and register all decorated functions
  • Start polling the blockchain for new blocks
  • Schedule tasks based on your lambda conditions

4. Start Celery Workers

In separate terminals, start Celery workers to execute tasks:

$ celery -A your_project worker --loglevel=info

Use the @block_task decorator with lambda conditions to create block processing tasks:

from abstract_block_dumper.v1.decorators import block_task


# Process every block
@block_task(condition=lambda bn: True)
def process_every_block(block_number: int):
    print(f"Processing every block: {block_number}")

# Process every 10 blocks
@block_task(condition=lambda bn: bn % 10 == 0)
def process_every_10_blocks(block_number: int):
    print(f"Processing every 10 blocks: {block_number}")

# Process with multiple netuids
@block_task(
    condition=lambda bn, netuid: bn % 100 == 0,
    args=[{"netuid": 1}, {"netuid": 3}, {"netuid": 22}],
    backfilling_lookback=300,
    celery_kwargs={"queue": "high-priority"}
)
def process_multi_netuid_task(block_number: int, netuid: int):
    print(f"Processing block {block_number} for netuid: {netuid}")

Maintenance Tasks

Cleanup Old Task Attempts

The framework provides a maintenance task to clean up old task records and maintain database performance:

from abstract_block_dumper.v1.tasks import cleanup_old_tasks

# Delete tasks older than 7 days (default)
cleanup_old_tasks.delay()

# Delete tasks older than 30 days
cleanup_old_tasks.delay(days=30)

This task deletes all succeeded or unrecoverable failed tasks older than the specified number of days. It never deletes tasks with PENDING or RUNNING status to ensure ongoing work is preserved.
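
The retention rule can be illustrated with a small sketch in plain Python. This is not the package's implementation; the status names are assumptions based on the description above:

```python
from datetime import datetime, timedelta, timezone

# Assumed status values: terminal states are deletable,
# PENDING/RUNNING tasks are always preserved.
DELETABLE_STATUSES = {"SUCCEEDED", "FAILED"}

def select_deletable(tasks, days=7, now=None):
    """Return tasks in a terminal state that are older than the cutoff."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=days)
    return [t for t in tasks
            if t["status"] in DELETABLE_STATUSES and t["created_at"] < cutoff]
```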

Running the Cleanup Task

Option 1: Manual Execution

# Using Django shell
python manage.py shell -c "from abstract_block_dumper.v1.tasks import cleanup_old_tasks; cleanup_old_tasks.delay()"

Option 2: Cron Job (Recommended - once per day)

# Add to crontab (daily at 2 AM)
0 2 * * * cd /path/to/your/project && python manage.py shell -c "from abstract_block_dumper.v1.tasks import cleanup_old_tasks; cleanup_old_tasks.delay()"

Option 3: Celery Beat (Automated Scheduling)

Add this to your Django settings.py:

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'cleanup-old-tasks': {
        'task': 'abstract_block_dumper.cleanup_old_tasks',
        'schedule': crontab(hour=2, minute=0),  # Daily at 2 AM
        'kwargs': {'days': 7},  # Customize retention period
    },
}

Then start the Celery beat scheduler:

celery -A your_project beat --loglevel=info

Configuration

Required Django Settings

Add these settings to your Django settings.py:

# Celery Configuration
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

# Abstract Block Dumper specific settings
BITTENSOR_NETWORK = 'finney'  # Options: 'finney', 'local', 'testnet', 'mainnet'
BLOCK_DUMPER_START_FROM_BLOCK = 'current'  # Options: None, 'current', or int
BLOCK_DUMPER_POLL_INTERVAL = 1  # seconds between polling for new blocks
BLOCK_TASK_RETRY_BACKOFF = 2  # minutes for retry backoff base
BLOCK_DUMPER_MAX_ATTEMPTS = 3  # maximum retry attempts
BLOCK_TASK_MAX_RETRY_DELAY_MINUTES = 1440  # maximum retry delay (24 hours)

Configuration Options Reference

Core Settings

BITTENSOR_NETWORK (str, default: 'finney') Specifies which Bittensor network to connect to

BLOCK_DUMPER_START_FROM_BLOCK (str|int|None, default: None)

  • Purpose: Determines the starting block for processing when the scheduler first runs
  • Valid Values:
    • None: Resume from the last processed block stored in database
    • 'current': Start from the current blockchain block (skips historical blocks)
    • int: Start from a specific block number (e.g., 1000000)
  • Example: BLOCK_DUMPER_START_FROM_BLOCK = 'current'
  • Performance Impact: Starting from historical blocks may require significant processing time
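
The three options can be summarized as a small resolution function. This illustrates the behavior described above; the function and parameter names are assumptions, not the package's API, and the fallback for a first run with no stored block is also an assumption:

```python
def resolve_start_block(setting, last_processed, current_block):
    """Resolve BLOCK_DUMPER_START_FROM_BLOCK per the rules described above."""
    if setting is None:
        # Resume from the last processed block in the database;
        # fall back to the current head if nothing was processed yet (assumed).
        return last_processed if last_processed is not None else current_block
    if setting == "current":
        return current_block  # skip historical blocks
    return int(setting)  # explicit block number
```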

Scheduler Settings

BLOCK_DUMPER_POLL_INTERVAL (int, default: 1)

  • Purpose: Seconds to wait between checking for new blocks
  • Valid Range: 1 to 3600 (1 second to 1 hour)
  • Example: BLOCK_DUMPER_POLL_INTERVAL = 5
  • Performance Impact:
    • Lower values (1-2s): Near real-time processing, higher CPU/network usage
    • Higher values (10-60s): Reduced load but delayed processing
    • Very low values (<1s) may cause rate limiting
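
A minimal illustration of one poll cycle, under the assumption that the scheduler batches every block between the last seen block and the current head (a sketch, not the package's actual code):

```python
def detect_new_blocks(current_head, last_seen):
    """Return the batch of new block numbers and the updated cursor."""
    if current_head <= last_seen:
        return [], last_seen  # nothing new this poll cycle
    return list(range(last_seen + 1, current_head + 1)), current_head
```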

Retry and Error Handling Settings

BLOCK_DUMPER_MAX_ATTEMPTS (int, default: 3)

  • Purpose: Maximum number of attempts to retry a failed task before giving up
  • Valid Range: 1 to 10
  • Example: BLOCK_DUMPER_MAX_ATTEMPTS = 5
  • Performance Impact: Higher values increase resilience but may delay failure detection

BLOCK_TASK_RETRY_BACKOFF (int, default: 1)

  • Purpose: Base number of minutes for exponential backoff retry delays
  • Valid Range: 1 to 60
  • Example: BLOCK_TASK_RETRY_BACKOFF = 2
  • Calculation: actual delay = backoff ** attempt_count minutes; with a backoff of 2:
    • Attempt 1: 2¹ = 2 minutes
    • Attempt 2: 2² = 4 minutes
    • Attempt 3: 2³ = 8 minutes
  • Performance Impact: Lower values retry faster but may overwhelm failing services

BLOCK_TASK_MAX_RETRY_DELAY_MINUTES (int, default: 1440)

  • Purpose: Maximum delay (in minutes) between retry attempts, caps exponential backoff
  • Valid Range: 1 to 10080 (1 minute to 1 week)
  • Example: BLOCK_TASK_MAX_RETRY_DELAY_MINUTES = 720 # 12 hours max
  • Performance Impact: Prevents extremely long delays while maintaining backoff benefits
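
Combining the backoff and cap settings, the delay before the next retry follows directly from the formula given above (a sketch of the calculation, not the package's code):

```python
def retry_delay_minutes(attempt, backoff=2, max_delay=1440):
    """Exponential backoff (backoff ** attempt) capped at max_delay minutes."""
    return min(backoff ** attempt, max_delay)
```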

Example Project

The repository includes a complete working example in the example_project/ directory that demonstrates:

  • Django application setup with abstract-block-dumper
  • Multiple task types (every-block and every-N-blocks tasks with different configurations)
  • Error handling with a randomly failing task
  • Docker Compose setup with all required services
  • Monitoring with Flower (Celery monitoring tool)

Running the Example

cd example_project
docker-compose up --build

This starts:

  • Django application (http://localhost:8000) - Admin interface (user: admin, password: admin)
  • Celery workers - Execute block processing tasks
  • Block scheduler - Monitors blockchain and schedules tasks
  • Flower monitoring (http://localhost:5555) - Monitor Celery tasks
  • Redis & PostgreSQL - Required services

Development

Pre-requisites:

Ideally, you should run nox -t format lint before every commit to ensure that the code is properly formatted and linted. Before submitting a PR, make sure that tests pass as well; you can do so using:

nox -t check # equivalent to `nox -t format lint test`

If you wish to install dependencies into .venv so your IDE can pick them up, you can do so using:

uv sync --all-extras --dev

Release process

Run nox -s make_release -- X.Y.Z where X.Y.Z is the version you're releasing and follow the printed instructions.
