
DeFiStream Python Client

Official Python client for the DeFiStream API.

Getting an API Key

To use the DeFiStream API, sign up for an account at defistream.dev to obtain an API key.

Installation

pip install defistream

The base installation includes pandas and pyarrow for DataFrame support.

With polars support (in addition to pandas):

pip install defistream[polars]

Quick Start

from defistream import DeFiStream

# Initialize client (reads DEFISTREAM_API_KEY from environment if not provided)
client = DeFiStream()

# Or with explicit API key
client = DeFiStream(api_key="dsk_your_api_key")

# Query ERC20 transfers using builder pattern
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)

print(df.head())

Features

  • Builder pattern: Fluent query API with chainable methods
  • Type-safe: Full type hints and Pydantic models
  • Multiple formats: DataFrame (pandas/polars), CSV, Parquet, JSON
  • Async support: Native async/await with AsyncDeFiStream
  • All protocols: ERC20, AAVE, Uniswap, Lido, Stader, Threshold, Native tokens

Supported Protocols

Protocol      Events
ERC20         transfers
Native Token  transfers
AAVE V3       deposits, withdrawals, borrows, repays, flashloans, liquidations
Uniswap V3    swaps, deposits, withdrawals, collects
Lido          deposits, withdrawal_requests, withdrawals_claimed, l2_deposits, l2_withdrawal_requests
Stader        deposits, withdrawal_requests, withdrawals
Threshold     deposit_requests, deposits, withdrawal_requests, withdrawals
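
Every protocol in the table is queried through the same builder shape. As an illustration, here is a sketch of querying Lido deposits; note that the accessor name client.lido is an assumption, inferred by analogy with the client.aave and client.uniswap accessors used elsewhere in this document:

from defistream import DeFiStream

client = DeFiStream()

# Hypothetical accessor: `client.lido` is assumed to follow the
# naming pattern of `client.aave` / `client.uniswap`.
df = (
    client.lido.deposits()
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)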

Usage Examples

Builder Pattern

The client uses a fluent builder pattern. The query is only executed when you call a terminal method like as_df(), as_file(), or as_dict().

from defistream import DeFiStream

client = DeFiStream()

# Build query step by step
query = client.erc20.transfers("USDT")
query = query.network("ETH")
query = query.block_range(21000000, 21010000)
query = query.min_amount(1000)

# Execute and get DataFrame
df = query.as_df()

# Or chain everything
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .min_amount(1000)
    .as_df()
)

ERC20 Transfers

# Get USDT transfers over 10,000 USDT
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .min_amount(10000)
    .as_df()
)

# Filter by sender
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .sender("0x28c6c06298d514db089934071355e5743bf21d60")
    .as_df()
)

AAVE Events

# Get deposits
df = (
    client.aave.deposits()
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)

# Get liquidations for a specific user
df = (
    client.aave.liquidations()
    .network("ETH")
    .block_range(21000000, 21010000)
    .user("0x...")
    .as_df()
)

Uniswap Swaps

# Get swaps for WETH/USDC pool with 0.05% fee tier
df = (
    client.uniswap.swaps("WETH", "USDC", 500)
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)

# Or build with chain methods
df = (
    client.uniswap.swaps()
    .symbol0("WETH")
    .symbol1("USDC")
    .fee(500)
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)

Native Token Transfers

# Get ETH transfers >= 1 ETH
df = (
    client.native_token.transfers()
    .network("ETH")
    .block_range(21000000, 21010000)
    .min_amount(1.0)
    .as_df()
)

Verbose Mode

By default, responses omit metadata fields to reduce payload size. Use .verbose() to include all fields:

# Default: compact response (no tx_hash, tx_id, log_index, network, name)
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)

# Verbose: includes all metadata fields
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .verbose()
    .as_df()
)

Return as DataFrame

# As pandas DataFrame (default)
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)

# As polars DataFrame
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df("polars")
)

Save to File

Format is automatically determined by file extension:

# Save as Parquet (recommended for large datasets)
(
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21100000)
    .as_file("transfers.parquet")
)

# Save as CSV
(
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21100000)
    .as_file("transfers.csv")
)

# Save as JSON
(
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_file("transfers.json")
)

Return as Dictionary (JSON)

For small queries, you can get results as a list of dictionaries:

transfers = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_dict()
)

for transfer in transfers:
    print(f"{transfer['sender']} -> {transfer['receiver']}: {transfer['amount']}")

Note: as_dict() and as_file("*.json") use the JSON format, which has a 10,000-block limit. For larger block ranges, use as_df() or as_file() with a .parquet or .csv extension; these support up to 1,000,000 blocks.

Async Usage

import asyncio
from defistream import AsyncDeFiStream

async def main():
    async with AsyncDeFiStream() as client:
        df = await (
            client.erc20.transfers("USDT")
            .network("ETH")
            .block_range(21000000, 21010000)
            .as_df()
        )
        print(f"Found {len(df)} transfers")

asyncio.run(main())

Configuration

Environment Variables

export DEFISTREAM_API_KEY=dsk_your_api_key
export DEFISTREAM_BASE_URL=https://api.defistream.dev/v1  # optional
from defistream import DeFiStream

# API key from environment
client = DeFiStream()

# Or explicit
client = DeFiStream(api_key="dsk_...", base_url="https://api.defistream.dev/v1")

Timeout and Retries

client = DeFiStream(
    api_key="dsk_...",
    timeout=60.0,  # seconds
    max_retries=3
)

Error Handling

from defistream import DeFiStream
from defistream.exceptions import (
    DeFiStreamError,
    AuthenticationError,
    QuotaExceededError,
    RateLimitError,
    ValidationError
)

client = DeFiStream()

try:
    df = (
        client.erc20.transfers("USDT")
        .network("ETH")
        .block_range(21000000, 21010000)
        .as_df()
    )
except AuthenticationError:
    print("Invalid API key")
except QuotaExceededError as e:
    print(f"Quota exceeded. Remaining: {e.remaining}")
except RateLimitError as e:
    print(f"Rate limited. Retry after: {e.retry_after}s")
except ValidationError as e:
    print(f"Invalid request: {e.message}")
except DeFiStreamError as e:
    print(f"API error: {e}")

Response Headers

Access rate limit and quota information:

df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)

# Access response metadata
print(f"Rate limit: {client.last_response.rate_limit}")
print(f"Remaining quota: {client.last_response.quota_remaining}")
print(f"Request cost: {client.last_response.request_cost}")

Builder Methods Reference

Common Methods (all protocols)

Method                    Description
.network(net)             Set network (ETH, ARB, BASE, OP, POLYGON, etc.)
.start_block(n)           Set starting block number
.end_block(n)             Set ending block number
.block_range(start, end)  Set both start and end blocks
.start_time(ts)           Set starting time (ISO format or Unix timestamp)
.end_time(ts)             Set ending time (ISO format or Unix timestamp)
.time_range(start, end)   Set both start and end times
.verbose()                Include all metadata fields
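
The block-based examples earlier in this document can equally be expressed with time bounds. A minimal sketch using .time_range() with ISO-format timestamps, assuming it accepts the same argument styles as .start_time()/.end_time() described in the table; the specific timestamps are placeholders:

from defistream import DeFiStream

client = DeFiStream()

# Query by time window instead of block numbers.
# Per the table above, Unix timestamps should work as well.
df = (
    client.erc20.transfers("USDT")
    .network("ETH")
    .time_range("2024-10-01T00:00:00Z", "2024-10-02T00:00:00Z")
    .as_df()
)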

Filter Methods

Method               Protocols          Description
.sender(addr)        ERC20, Native      Filter by sender address
.receiver(addr)      ERC20, Native      Filter by receiver address
.from_address(addr)  ERC20, Native      Alias for sender
.to_address(addr)    ERC20, Native      Alias for receiver
.min_amount(amt)     ERC20, Native      Minimum transfer amount
.token(symbol)       ERC20              Token symbol (USDT, USDC, etc.)
.user(addr)          AAVE               Filter by user
.reserve(addr)       AAVE               Filter by reserve
.liquidator(addr)    AAVE liquidations  Filter by liquidator
.symbol0(sym)        Uniswap            First token symbol
.symbol1(sym)        Uniswap            Second token symbol
.fee(tier)           Uniswap            Fee tier (100, 500, 3000, 10000)
.pool(addr)          Uniswap            Pool address
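
The .pool() filter from the table is an alternative to the symbol0/symbol1/fee addressing shown in the Uniswap Swaps section. A sketch targeting a single pool directly by address (the zero address below is a placeholder, not a real deployment):

from defistream import DeFiStream

client = DeFiStream()

# Target one pool by its contract address instead of symbols + fee tier.
# Placeholder address; substitute a real pool address.
df = (
    client.uniswap.swaps()
    .pool("0x0000000000000000000000000000000000000000")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_df()
)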

Terminal Methods

Method                        Description
.as_df()                      Execute and return a pandas DataFrame
.as_df("polars")              Execute and return a polars DataFrame
.as_file(path)                Execute and save to a file (format from extension)
.as_file(path, format="csv")  Execute and save with an explicit format
.as_dict()                    Execute and return a list of dicts (JSON, 10,000-block limit)
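
When a destination path has no (or a misleading) extension, the format parameter listed above overrides extension-based detection. A sketch, assuming format takes the same names as the supported extensions (the .dat filename is a placeholder):

from defistream import DeFiStream

client = DeFiStream()

# The extension ".dat" says nothing about the format, so pass it explicitly.
(
    client.erc20.transfers("USDT")
    .network("ETH")
    .block_range(21000000, 21010000)
    .as_file("transfers.dat", format="csv")
)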

License

MIT License
