clarityio

Retrieve air quality data from the Clarity.io API.

This package wraps the API for Clarity air quality sensors. It makes calls to v2 of the API, the newest version as of July 2024.

Development status

This package is in alpha: it is under active development and subject to major changes.

Implemented endpoints

  • Recent measurements: POST {baseUrl}/v2/recent-datasource-measurements-query
  • Per-Org Datasources summary: GET {baseUrl}/v2/datasources
  • Per-Datasource details: GET {baseUrl}/v2/datasources/:datasourceId

Not yet implemented

  • Continuations
  • Historical measurements
  • All other endpoints

Installation

Install from PyPI:

pip install clarityio

Usage

Initialize API connection

Find your API key and org in your Clarity.io user profile. Log in at https://dashboard.clarity.io, then click the person icon in the top-right corner.

Use these values to initialize a connection:

import clarityio
import pandas as pd
api_connection = clarityio.ClarityAPIConnection(api_key='YOUR_API_KEY', org='YOUR_ORG')

Retrieve recent measurements

See the API docs for valid arguments, e.g., to retrieve daily data instead of hourly.

The default value of format is json-long, which returns the data in long format (one row per combination of metric and timestamp). Here is such a call:

# the required value for 'org' is passed automatically from the connection object
request_body = {
    'allDatasources': True,
    'outputFrequency': 'hour',
    'format': 'json-long',
    'startTime': '2024-07-22T00:00:00Z'
}
response = api_connection.get_recent_measurements(data=request_body)
df = pd.DataFrame(response['data'])
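
If you already have a json-long response and want one row per timestamp, you can also reshape locally with pandas rather than re-requesting in wide format. A minimal sketch, assuming the long-format columns are named 'time', 'metric', and 'value' (these names are illustrative; inspect response['data'] for the actual field names):

```python
import pandas as pd

# Sample data standing in for a json-long response
# (actual column names may differ; check the API response first)
long_df = pd.DataFrame({
    'time': ['2024-07-22T00:00:00Z', '2024-07-22T00:00:00Z',
             '2024-07-22T01:00:00Z', '2024-07-22T01:00:00Z'],
    'metric': ['pm2_5ConcMass1HourMean', 'relHumid',
               'pm2_5ConcMass1HourMean', 'relHumid'],
    'value': [12.1, 45.0, 13.4, 47.2],
})

# Pivot to one row per timestamp, one column per metric
wide_df = long_df.pivot(index='time', columns='metric', values='value').reset_index()
```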

To get the data in wide format, with one row per timestamp and each metric in its own column, use the csv-wide format option and convert to a pandas dataframe:

from io import StringIO

request_body = {
    'allDatasources': True,
    'outputFrequency': 'hour',
    'format': 'csv-wide',
    'metricSelect': 'only pm2_5ConcMass24HourRollingMean'  # refer to API documentation for metric selection
}
response_wide = api_connection.get_recent_measurements(data=request_body)
df_wide = pd.read_csv(StringIO(response_wide))
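
In either format, timestamps arrive as ISO-8601 strings; parsing them into a timezone-aware index makes resampling and plotting easier. A sketch assuming the timestamp column is named 'time' (check the actual response for the exact column name):

```python
import pandas as pd

# Sample frame standing in for a parsed API response
df_wide = pd.DataFrame({
    'time': ['2024-07-22T00:00:00Z', '2024-07-22T01:00:00Z'],
    'pm2_5ConcMass24HourRollingMean': [10.0, 11.5],
})

# Parse ISO-8601 strings into a timezone-aware datetime index
df_wide['time'] = pd.to_datetime(df_wide['time'], utc=True)
df_wide = df_wide.set_index('time').sort_index()
```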

List data sources

datasources_response = api_connection.get_datasources()
datasources = pd.json_normalize(datasources_response['datasources'])

Get details for a specific data source

Use a datasource ID obtained from the previous step.

source_details_response = api_connection.get_datasource_details('A_DATA_SOURCE_ID')
source_details = pd.json_normalize(source_details_response['datasource'])

