
Piscada Cloud

Library for the Piscada Cloud including authentication and data access.

Features

  • Login to the Piscada Cloud and retrieve credentials
  • Persist credentials locally
  • Read historic values for multiple tags as a Pandas DataFrame
  • Optionally apply time-based linear interpolation to measurements
  • Utilities to add fractional representations of periods: day, week, year

Install

Install from PyPI:

pip install piscada-cloud

or

poetry add piscada-cloud

Install from local source:

pip install --editable path/to/piscada_cloud

or

poetry add path/to/piscada_cloud

Usage

Authentication

To log in interactively and persist the retrieved credentials on disk (under $HOME/.piscada_credentials), simply run:

python -m piscada_cloud.auth

or

poetry run python -m piscada_cloud.auth

Any future invocation, e.g. credentials = piscada_cloud.auth.login_persisted(), will return the credentials from disk without user interaction.

credentials = piscada_cloud.auth.login(username, password, host) can be used to retrieve the credentials programmatically.
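The persistence behaviour can be illustrated with a small sketch. The function name, file layout, and JSON format below are assumptions for illustration only; the real library may store credentials differently:

```python
import json
from pathlib import Path


def login_persisted_sketch(path: Path, fetch_credentials) -> dict:
    """Hypothetical sketch: reuse cached credentials if present,
    otherwise obtain new ones and persist them for next time."""
    if path.exists():
        return json.loads(path.read_text())  # reuse cached credentials
    credentials = fetch_credentials()        # e.g. an interactive login
    path.write_text(json.dumps(credentials))
    return credentials
```

On the second call the fetcher is never invoked, which is why the interactive login only happens once.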

Getting Data

The credentials retrieved through the login can be used to obtain the host and access token for the historical data API:

from piscada_cloud import auth

credentials = auth.login_persisted()
host, token = auth.get_historian_credentials(credentials)

The host and token can be used to retrieve historic data as a Pandas DataFrame. The get_historic_values method takes the following parameters:

  • start: Datetime object
  • end: Datetime object
  • tags: List of Tag objects
  • host (optional): Endpoint to which the historian queries are sent, e.g. historian.piscada.online.
  • token (optional): Access token, associated with the endpoint, used for authentication.

If the host or token arguments are not provided, the environment variables HISTORIAN_HOST and HISTORIAN_TOKEN are used instead, respectively.

from datetime import datetime, timedelta, timezone

from piscada_cloud.data import get_historic_values
from piscada_cloud.mappings import Tag


tags = [
    Tag(controller_id="fe7bd2c3-6c20-44d4-aecc-df5822457400", name="ServerCpuUsage"),
    Tag(controller_id="fe7bd2c3-6c20-44d4-aecc-df5822457400", name="ServerMemoryUsage"),
]

df = get_historic_values(
    start=datetime.now(timezone.utc) - timedelta(days=30),
    end=datetime.now(timezone.utc),
    tags=tags
)
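The environment-variable fallback described above follows a common pattern; a minimal sketch, assuming explicit arguments take precedence (the function name is hypothetical, not part of the library):

```python
import os


def resolve_historian_credentials(host=None, token=None):
    """Hypothetical sketch of the fallback: explicit arguments win,
    otherwise the corresponding environment variables are consulted."""
    if host is None:
        host = os.environ["HISTORIAN_HOST"]
    if token is None:
        token = os.environ["HISTORIAN_TOKEN"]
    return host, token
```

A missing value with no environment variable set would raise a KeyError, making the misconfiguration visible immediately.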

Write Data

In the following example, the column oCU135001RT90_MV is selected from the DataFrame above and its average value is calculated using the method .mean().

To write the result back to the Piscada Cloud, the data module offers the write_value function. It takes these arguments:

  • tag: A Tag object
  • value: The float, string, or dict value to write to the tag. Float and string will be sent as is, dict will be serialised as JSON string.
  • timestamp (optional): The timestamp in milliseconds since epoch at which to write the value, by default int(time.time() * 1000).
  • host (optional): Endpoint to which the POST request is sent. Overrides the default, which is os.environ['WRITEAPI_HOST'].
  • token (optional): Access token associated with the host. Overrides the default, which is os.environ['WRITEAPI_TOKEN'].

The Tag.name must use the prefix py_ as this is the only namespace allowed for writing data via the API.
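The value handling and defaults described above can be sketched as follows. This is an illustration of the documented behaviour (py_ prefix check, dict-to-JSON serialisation, millisecond timestamp default), not the library's actual implementation:

```python
import json
import time


def prepare_payload(name: str, value, timestamp=None) -> dict:
    """Hypothetical sketch of the payload preparation write_value performs."""
    if not name.startswith("py_"):
        raise ValueError("only the py_ namespace is writable via the API")
    if isinstance(value, dict):
        value = json.dumps(value)            # dicts are sent as JSON strings
    if timestamp is None:
        timestamp = int(time.time() * 1000)  # milliseconds since epoch
    return {"name": name, "value": value, "timestamp": timestamp}
```

Floats and strings pass through unchanged; only dicts are serialised before sending.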

from piscada_cloud.data import write_value
from piscada_cloud.mappings import Tag


mean = df["oCU135001RT90_MV"].mean()
response = write_value(Tag(controller_id="0798ac4a-4d4f-4648-95f0-12676b3411d5", name="py_oCU135001RT90_MV_1h_mean"), value=mean)
if response.ok:
    print("OK")
else:
    print(response.text)

The response returned by the write_value method allows you to check whether writing the data was successful: response.ok == True.

Manipulations

In order to support analysis of periodic patterns, the manipulations module lets you add fractional representations of day, week, and year as additional columns in the DataFrame:

  • Day: 00:00:00 -> 0.0 --- 23:59:59 -> 1.0
  • Week: Monday 00:00:00 -> 0.0 --- Sunday 23:59:59 -> 1.0
  • Year: 1st Jan. 00:00:00 -> 0.0 --- 31st Dec. 23:59:59 -> 1.0

from piscada_cloud import manipulations

manipulations.add_weekdays(data)
manipulations.add_day_fraction(data)
manipulations.add_week_fraction(data)
manipulations.add_year_fraction(data)
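The fractions these helpers add can be computed from a timestamp alone; a minimal sketch using plain datetime (the function names are illustrative, not the library's API):

```python
from datetime import datetime


def day_fraction(ts: datetime) -> float:
    """00:00:00 -> 0.0, approaching 1.0 at 23:59:59."""
    seconds = ts.hour * 3600 + ts.minute * 60 + ts.second
    return seconds / 86400


def week_fraction(ts: datetime) -> float:
    """Monday 00:00:00 -> 0.0; weekday() counts Monday as 0."""
    return (ts.weekday() + day_fraction(ts)) / 7


def year_fraction(ts: datetime) -> float:
    """1st Jan. 00:00:00 -> 0.0, approaching 1.0 at year's end."""
    start = datetime(ts.year, 1, 1)
    end = datetime(ts.year + 1, 1, 1)
    return (ts - start).total_seconds() / (end - start).total_seconds()
```

Applied column-wise to a DataFrame's timestamps, these values make periodic behaviour directly comparable across days, weeks, and years.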

Documentation

Run MkDocs documentation:

poetry run mkdocs serve

Development

Enable the provided git pre-commit hook: ln -s ./qa.sh .git/hooks/pre-commit

Requirements

The package supports the two latest versions of Python.

Authors

License

© Piscada AS 2019
