Verdin

A Python SDK for Tinybird

Verdin is a tiny bird, and also a Tinybird SDK for Python.

Install

pip install verdin

Requirements

Python 3.10+

Usage

Run an SQL Query

# the tinybird module exposes all important tinybird concepts
from verdin import tinybird

client = tinybird.Client("p.mytoken")
query = client.sql("select * from my_datasource__v0")

# run the query with `FORMAT JSON` and receive a QueryJsonResult
response: tinybird.QueryJsonResult = query.json()

# print records returned from the pipe
print(response.data)

You can also run, e.g., query.get(format=OutputFormat.CSV) to get the raw response with CSV data.
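When you request CSV output you get the raw response body back, and parsing it is up to you. A minimal sketch, independent of Verdin, that turns such a CSV body into rows using only the standard library (the sample input here is illustrative):

```python
import csv
import io


def parse_csv_rows(body: str) -> list[list[str]]:
    """Parse a raw CSV response body into a list of rows."""
    return list(csv.reader(io.StringIO(body)))


# illustrative CSV body, as a raw query response might return it
rows = parse_csv_rows('"a","b"\n"1","2"\n')
print(rows)  # [['a', 'b'], ['1', '2']]
```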

Query a Pipe

from verdin import tinybird

client = tinybird.Client("p.mytoken")
pipe = client.pipe("my_pipe")

# query the pipe using dynamic parameters
response: tinybird.PipeJsonResponse = pipe.query({"key": "val"})

# print records returned from the pipe
print(response.data)

Append to a data source

from verdin import tinybird

client = tinybird.Client("p.mytoken")

# will access my_datasource__v0
datasource = client.datasource("my_datasource", version=0)

# append rows to the data source
datasource.append([
    ("col1-row1", "col2-row1"),
    ("col1-row2", "col2-row2"),
])

Append to a data source using high-frequency ingest

The DataSource object also gives you access to /v0/events, Tinybird's high-frequency ingestion (HFI) endpoint, for appending data. Use the send_events method and pass JSON-serializable documents to it.

datasource.send_events(records=[
    {"key": "val1"},
    {"key": "val2"},
    ...
])

Queue and batch records into a DataSource

Verdin provides a way to queue and batch data continuously:

from queue import Queue
from threading import Thread

from verdin import tinybird
from verdin.worker import QueuingDatasourceAppender

client = tinybird.Client("p.mytoken")

records = Queue()

appender = QueuingDatasourceAppender(records, client.datasource("my_datasource"))
Thread(target=appender.run).start()

# appender will regularly read batches of data from the queue and append them
# to the datasource. the appender respects rate limiting.

records.put(("col1-row1", "col2-row1"))
records.put(("col1-row2", "col2-row2"))
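The pattern the appender implements, draining a queue and flushing records in batches, can be sketched independently of Verdin. The batch size here is illustrative, and the sketch omits the rate limiting and continuous looping the real appender performs:

```python
from queue import Empty, Queue


def drain_batches(records: Queue, batch_size: int = 2) -> list[list]:
    """Drain the queue into batches of at most batch_size records."""
    batches: list[list] = []
    batch: list = []
    while True:
        try:
            batch.append(records.get_nowait())
        except Empty:
            break  # queue is empty for now
        if len(batch) == batch_size:
            batches.append(batch)
            batch = []
    if batch:
        batches.append(batch)  # flush the final partial batch
    return batches


q = Queue()
for row in [("col1-row1", "col2-row1"), ("col1-row2", "col2-row2"), ("col1-row3", "col2-row3")]:
    q.put(row)

print(drain_batches(q))  # three rows -> one full batch of two, one batch of one
```

Each resulting batch would then be handed to DataSource.append in one call, rather than appending row by row.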

API access

The DataSource and Pipe objects presented so far are high-level abstractions that provide a convenient Python API for the most common use cases. Verdin also provides lower-level access to the Tinybird APIs via client.api. The following APIs are available:

  • /v0/datasources: client.api.datasources
  • /v0/events: client.api.events
  • /v0/pipes: client.api.pipes
  • /v0/sql: client.api.query
  • /v0/tokens: client.api.tokens
  • /v0/variables: client.api.variables

Note that for some APIs (datasources, pipes, tokens), manipulation operations are not implemented, as these are typically performed through tb deployments rather than through the API.

Also note that API clients do not take care of retries or rate limiting. The caller is expected to handle fault tolerance.
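Since the caller owns fault tolerance, a generic retry helper with exponential backoff is one way to wrap these calls. This sketch is not part of Verdin's API; the fallible callable and delay parameters are illustrative:

```python
import time


def call_with_retries(fn, attempts: int = 3, base_delay: float = 0.0):
    """Call fn, retrying with exponential backoff on any exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts, propagate the last error
            time.sleep(base_delay * (2 ** attempt))


# illustrative fallible callable that succeeds on the third attempt
calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient error")
    return "ok"


print(call_with_retries(flaky))  # prints "ok" after two failed attempts
```

In practice, you would likely catch only transient error types and pick a non-zero base delay that respects Tinybird's rate limits.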

Example (Querying a pipe)

You can query a pipe through the pipes API as follows:

from verdin import tinybird

client = tinybird.Client(...)

response = client.api.pipes.query(
    "my_pipe",
    parameters={"my_param": "..."},
    query="SELECT * FROM _ LIMIT 10",
)

for record in response.data:
    # each record is a dictionary
    ...

Example (High-frequency ingest)

You can use the HFI endpoint /v0/events through the events API. As records, you can pass a list of JSON-serializable documents.

from verdin import tinybird

client = tinybird.Client(...)

response = client.api.events.send("my_datasource", records=[
    {"id": "...", "value": "..."},
    ...
])
assert response.quarantined_rows == 0
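The events endpoint accepts newline-delimited JSON, which is why each record must be JSON-serializable. A sketch of that serialization, illustrative rather than Verdin's actual implementation:

```python
import json


def to_ndjson(records: list[dict]) -> str:
    """Serialize records to newline-delimited JSON, one document per line."""
    return "\n".join(json.dumps(record) for record in records)


payload = to_ndjson([{"id": "1", "value": "a"}, {"id": "2", "value": "b"}])
print(payload)
```

A record that fails to match the data source's schema on the server side ends up in quarantine, which is what the quarantined_rows check above guards against.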

Develop

Create the virtual environment, install dependencies, and run tests

make venv
make test

Run the code formatter

make format

Upload the PyPI package using twine

make upload
