
Verdin

Verdin is a tiny bird, and also a Tinybird SDK for Python.

Install

pip install verdin

Requirements

Python 3.10+

Usage

Run an SQL Query

# the tinybird module exposes all important tinybird concepts
from verdin import tinybird

client = tinybird.Client("p.mytoken")
query = client.sql("select * from my_datasource__v0")

# run the query with `FORMAT JSON` and receive a QueryJsonResult
response: tinybird.QueryJsonResult = query.json()

# print records returned from the pipe
print(response.data)

You can also run, e.g., query.get(format=OutputFormat.CSV) to get the raw response with CSV data.
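
A minimal sketch of fetching the raw CSV response (the import path of OutputFormat and the shape of the raw response object are assumptions; check the package for the exact names):

# assumption: OutputFormat is importable from verdin.tinybird
from verdin.tinybird import OutputFormat

# fetch the raw response with CSV data instead of parsed JSON
response = query.get(format=OutputFormat.CSV)

# assumption: the raw response exposes its body as text
print(response.text)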

Query a Pipe

from verdin import tinybird

client = tinybird.Client("p.mytoken")
pipe = client.pipe("my_pipe")

# query the pipe using dynamic parameters
response: tinybird.PipeJsonResponse = pipe.query({"key": "val"})

# print records returned from the pipe
print(response.data)

Append to a data source

from verdin import tinybird

client = tinybird.Client("p.mytoken")

# will access my_datasource__v0
datasource = client.datasource("my_datasource", version=0)

# append rows to the datasource
datasource.append([
    ("col1-row1", "col2-row1"),
    ("col1-row2", "col2-row2"),
])

Append to a data source using high-frequency ingest

The DataSource object also gives you access to /v0/events, Tinybird's high-frequency ingest (HFI) endpoint, for appending data. Use the send_events method and pass JSON-serializable documents to it.

datasource.send_events(records=[
    {"key": "val1"},
    {"key": "val2"},
    ...
])

Queue and batch records into a DataSource

Verdin provides a way to queue and batch data continuously:

from queue import Queue
from threading import Thread

from verdin import tinybird
from verdin.worker import QueuingDatasourceAppender

client = tinybird.Client("p.mytoken")

records = Queue()

appender = QueuingDatasourceAppender(records, client.datasource("my_datasource"))
Thread(target=appender.run).start()

# the appender regularly reads batches of records from the queue and appends
# them to the datasource, respecting Tinybird's rate limits

records.put(("col1-row1", "col2-row1"))
records.put(("col1-row2", "col2-row2"))

API access

The DataSource and Pipe objects presented so far are high-level abstractions that provide a convenient Python API for the most common use cases. Verdin also provides lower-level access to the Tinybird APIs via client.api. The following APIs are available:

  • /v0/datasources: client.api.datasources
  • /v0/events: client.api.events
  • /v0/pipes: client.api.pipes
  • /v0/sql: client.api.query
  • /v0/tokens: client.api.tokens
  • /v0/variables: client.api.variables

Note that for some APIs (datasources, pipes, tokens), manipulation operations are not implemented, since these resources are typically managed through tb deployments rather than through the API.

Also note that the API clients do not handle retries or rate limiting; the caller is expected to take care of fault tolerance.
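
For example, a caller-side retry with exponential backoff might look like this (a minimal sketch; the exception type caught here is an assumption and should be narrowed to the errors you actually observe):

import time

from verdin import tinybird

client = tinybird.Client("p.mytoken")

def send_with_retry(records, attempts=3):
    # retry the HFI call with exponential backoff; re-raise on final failure
    for attempt in range(attempts):
        try:
            return client.api.events.send("my_datasource", records=records)
        except Exception:  # assumption: replace with the client's error types
            if attempt == attempts - 1:
                raise
            time.sleep(2 ** attempt)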

Example (Querying a pipe)

You can query a pipe through the pipes API as follows:

from verdin import tinybird

client = tinybird.Client(...)

response = client.api.pipes.query(
    "my_pipe",
    parameters={"my_param": "..."},
    query="SELECT * FROM _ LIMIT 10",
)

for record in response.data:
    # each record is a dictionary
    ...

Example (High-frequency ingest)

You can use the HFI endpoint /v0/events through the events API. Pass a list of JSON-serializable documents as records.

from verdin import tinybird

client = tinybird.Client(...)

response = client.api.events.send("my_datasource", records=[
    {"id": "...", "value": "..."},
    ...
])
assert response.quarantined_rows == 0
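
In practice you may prefer to surface quarantined rows instead of asserting, since Tinybird typically diverts rows that fail validation into a quarantine data source rather than failing the whole request:

# inspect the HFI response instead of asserting
if response.quarantined_rows:
    # these rows were diverted to quarantine, not ingested
    print(f"{response.quarantined_rows} rows were quarantined")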

Develop

Create the virtual environment, install dependencies, and run tests

make venv
make test

Run the code formatter

make format

Upload the PyPI package using twine

make upload
