spice 🌶️

Simple python client for extracting data from the Dune Analytics API

Goals of spice:

  • simple, no OOP, entire api is just one function
  • support both sync and async workflows
  • tight integration with polars

To discuss spice, head to the Paradigm Data Tools Telegram channel.

Table of Contents

  1. Installation
  2. Examples
    1. Sync Workflow
    2. Async Workflow
    3. Quality of Life
  3. API Reference
  4. FAQ

Installation

pip install dune_spice

Examples

You can use either the sync workflow or the async workflow. Each workflow has only one function.

See API Reference below for the full list of query function arguments.

Sync Workflow

import spice

# get most recent query results using query id
df = spice.query(21693)

# get most recent query results using query url
df = spice.query('https://dune.com/queries/21693')

# get most recent query results using raw sql
df = spice.query('SELECT * FROM ethereum.blocks LIMIT 5')

# the remaining examples use `query`, which can be a query id, url, or raw sql as above
query = 21693

# perform new query execution and get results
df = spice.query(query, refresh=True)

# get query results for input parameters
df = spice.query(query, parameters={'network': 'ethereum'})

# perform new query execution, but do not wait for result
execution = spice.query(query, poll=False)

# get results of previous execution
df = spice.query(execution)

Async Workflow

The async API is identical to the sync API above; just add the async_ prefix.

df = await spice.async_query(21693)
df = await spice.async_query('https://dune.com/queries/21693')
df = await spice.async_query('SELECT * FROM ethereum.blocks LIMIT 5')
df = await spice.async_query(query, refresh=True)
df = await spice.async_query(query, parameters={'network': 'ethereum'})
execution = await spice.async_query(query, poll=False)
df = await spice.async_query(execution)
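
If you are not already inside an async context, these coroutines can be driven with the standard library's asyncio. A minimal sketch (21693 is just an example query id):

import asyncio
import spice

async def main():
    # run several queries concurrently on one event loop
    return await asyncio.gather(
        spice.async_query(21693),
        spice.async_query('SELECT * FROM ethereum.blocks LIMIT 5'),
    )

dfs = asyncio.run(main())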

Quality of Life

spice contains additional quality-of-life features such as:

  • automatically handle pagination of multi-page results
  • automatically execute queries that have no existing executions, especially when using new parameter values
  • allow type overrides using the dtypes parameter (see the sketch below)
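
For example, the dtypes override can be used to coerce result columns to specific polars types. A minimal sketch, assuming the listed dtypes line up with the columns of the example query (21693 is just an example query id):

import polars as pl
import spice

# force the result columns to these polars dtypes, in column order
# (hypothetical dtypes; they must match the shape of your query's output)
df = spice.query(21693, dtypes=[pl.Datetime, pl.Int64, pl.Float64])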

API Reference

Types

from typing import Any, Literal, Mapping, Sequence, TypedDict
import polars as pl

# query is an int query id, a query url, or raw sql
Query = int | str

# execution performance level
Performance = Literal['medium', 'large']

# execution
class Execution(TypedDict):
    execution_id: str

Functions

These functions are accessed as spice.query() and spice.async_query().

def query(
    query_or_execution: Query | Execution,
    *,
    verbose: bool = True,
    refresh: bool = False,
    max_age: int | float | None = None,
    parameters: Mapping[str, Any] | None = None,
    api_key: str | None = None,
    performance: Performance = 'medium',
    poll: bool = True,
    poll_interval: float = 1.0,
    limit: int | None = None,
    offset: int | None = None,
    sample_count: int | None = None,
    sort_by: str | None = None,
    columns: Sequence[str] | None = None,
    extras: Mapping[str, Any] | None = None,
    dtypes: Sequence[pl.DataType] | None = None,
) -> pl.DataFrame | Execution:
    """get results of query as dataframe

    # Parameters
    - query_or_execution: query or execution to retrieve results of
    - verbose: whether to print verbose info
    - refresh: whether to trigger a new execution instead of using the most recent one
    - max_age: max age of the most recent execution in seconds; if exceeded, trigger a new execution
    - parameters: dict of query parameters
    - api_key: dune api key, otherwise use DUNE_API_KEY env var
    - performance: performance level
    - poll: wait for result as DataFrame, or just return Execution handle
    - poll_interval: polling interval in seconds
    - limit: number of rows to query in result
    - offset: row number to start returning results from
    - sample_count: number of random samples from query to return
    - sort_by: an ORDER BY clause to sort data by
    - columns: columns to retrieve, by default retrieve all columns
    - extras: extra parameters used for fetching execution result
        - examples: ignore_max_datapoints_per_request, allow_partial_results
    - dtypes: dtypes to use in output polars dataframe
    """
    ...
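
As a sketch of how several of these parameters combine (21693 is an example query id and the column names are hypothetical):

df = spice.query(
    21693,
    max_age=3600,                        # re-execute if the latest run is over an hour old
    parameters={'network': 'ethereum'},  # fill in query parameters
    limit=1000,                          # only fetch the first 1000 rows
    sort_by='block_number DESC',         # ORDER BY clause applied to the result
    columns=['block_number', 'time'],    # retrieve only these columns
)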

async def async_query(
    # all the same parameters as query()
    ...
) -> pl.DataFrame | Execution:
    """get results of query as dataframe, asynchronously

    ## Parameters
    [see query()]
    """
    ...

FAQ

How do I set my Dune API key?

spice looks for a Dune api key in the DUNE_API_KEY environment variable.
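
For example, you can export the variable in your shell, set it from Python, or pass the documented api_key argument directly (a minimal sketch; the key value is a placeholder):

import os
import spice

# option 1: rely on the DUNE_API_KEY environment variable
# (assumption: the key is read from the environment at query time)
os.environ['DUNE_API_KEY'] = 'your-api-key'
df = spice.query(21693)

# option 2: pass the key explicitly for a single call
df = spice.query(21693, api_key='your-api-key')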

Which endpoints does this package support?

spice interacts only with Dune's SQL-related API endpoints, documented in the Dune API reference.
