APD Sensor aggregator

A programme that queries apd.sensor endpoints and aggregates their results.

Generic single-database configuration.

Database setup

To generate the required database tables you must create an alembic.ini file, as follows:

[alembic]
script_location = apd.aggregation:alembic
sqlalchemy.url = postgresql+psycopg2://apd@localhost/apd

and run alembic upgrade head. This should also be done after every upgrade of the software.

Defining endpoints

Endpoints to collect from are managed with the sensor_deployments CLI tool. After installation there will be no deployments defined; one can be added as follows:

sensor_deployments add --db postgresql+psycopg2://apd@localhost/apd \
                       --api-key 97f6b3e5ceb64a6ba88968d7c3786b38 \
                       --colour xkcd:red \
                       http://rpi4:8081 \
                       Loft

The optional colour argument is the colour to use when plotting charts with the built-in charting tools. This uses matplotlib's colour specification system, documented at https://matplotlib.org/tutorials/colors/colors.html

The sensors can then be listed with sensor_deployments list:

Loft
ID 53998a5160de48aeb71a5c37cd1455f2
URI http://rpi4:8081
API key 97f6b3e5ceb64a6ba88968d7c3786b38
Colour xkcd:red

The ID is the deployment ID, as set by the endpoint. It is only possible to add endpoints if they can be connected to at the time.

Collating data

Data can be collated from all defined endpoints with the collect_sensor_data command line tool. Although you can specify URLs and an API key to explicitly load data from a one-off endpoint, running without specifying these will use the configured endpoints from the database.

collect_sensor_data --db postgresql+psycopg2://apd@localhost/apd

Viewing data

You can write scripts to visualise the data from the database. I recommend using Jupyter for this, as it has good support for drawing charts and interactivity.

All configured charts can be displayed with:

from apd.aggregation.analysis import plot_multiple_charts
display(await plot_multiple_charts())

More complex charting can be achieved by passing configs= to this function, a sequence of configuration objects as defined in apd.aggregation.analysis. Interactivity can be achieved using the interactable_plot_multiple_charts function with Jupyter/IPyWidgets' existing interactivity support.

More control can be achieved using other functions from this module, such as getting all data points from a given sensor with:

from apd.aggregation.query import with_database, get_data

with with_database("postgresql+psycopg2://apd@localhost/apd") as session:
    points = [
        (dp.collected_at, dp.data)
        async for dp in get_data()
        if dp.sensor_name == "RelativeHumidity"
    ]

These functions can be called from any Python code, not just Jupyter notebooks.
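The async-comprehension pattern above can be sketched with a stand-in async generator (the DataPoint class and in-memory data here are hypothetical; the real get_data streams rows from the database):

```python
import asyncio
from dataclasses import dataclass
from datetime import datetime


@dataclass
class DataPoint:
    # Stand-in for a stored data point; the real objects expose the
    # same attributes used in the example above.
    sensor_name: str
    collected_at: datetime
    data: float


async def get_data():
    # Hypothetical stand-in for apd.aggregation.query.get_data(),
    # which yields data points as an async iterator.
    for point in [
        DataPoint("Temperature", datetime(2020, 4, 1, 12, 0), 21.0),
        DataPoint("RelativeHumidity", datetime(2020, 4, 1, 12, 0), 48.0),
        DataPoint("RelativeHumidity", datetime(2020, 4, 1, 13, 0), 51.0),
    ]:
        yield point


async def main():
    # The same filtering pattern as the documented example: an async
    # comprehension that keeps only one sensor's readings.
    return [
        (dp.collected_at, dp.data)
        async for dp in get_data()
        if dp.sensor_name == "RelativeHumidity"
    ]


points = asyncio.run(main())
```

In a Jupyter notebook the await happens at the top level; in a script it must be driven by an event loop, as with asyncio.run above.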

Analysis and triggers

The aggregator provides a long-running process that processes records as they are inserted into the database and applies rules to them.

This is configured with a Python-based configuration file, such as the following to log any time the temperature fluctuates above or below 18°C:

import operator

from apd.aggregation.actions.action import OnlyOnChangeActionWrapper, LoggingAction
from apd.aggregation.actions.runner import DataProcessor
from apd.aggregation.actions.trigger import ValueThresholdTrigger


handlers = [
    DataProcessor(
        name="TemperatureBelow18",
        action=OnlyOnChangeActionWrapper(LoggingAction()),
        trigger=ValueThresholdTrigger(
            name="TemperatureBelow18",
            threshold=18,
            comparator=operator.lt,
            sensor_name="Temperature",
        ),
    )
]
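The trigger's behaviour can be sketched as follows. This is a minimal stand-in, not the library's implementation: a trigger holds a threshold and a comparator, ignores data points from other sensors, and otherwise reports whether the comparison fires.

```python
import operator
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class DataPoint:
    # Hypothetical stand-in for a stored data point.
    sensor_name: str
    data: float


@dataclass
class ValueThresholdTrigger:
    # Mirrors the constructor arguments shown in the config above.
    name: str
    threshold: float
    comparator: Callable[[float, float], bool]
    sensor_name: str

    def matches(self, point: DataPoint) -> Optional[bool]:
        # Points from other sensors are excluded entirely (None);
        # otherwise the comparator decides whether the trigger fires.
        if point.sensor_name != self.sensor_name:
            return None
        return self.comparator(point.data, self.threshold)


trigger = ValueThresholdTrigger(
    name="TemperatureBelow18",
    threshold=18,
    comparator=operator.lt,
    sensor_name="Temperature",
)
```

With comparator=operator.lt, a Temperature reading of 16.5 fires the trigger and a reading of 21.0 does not, while readings from other sensors are skipped before the comparison is made.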

This is run with:

run_apd_actions --db postgresql+psycopg2://apd@localhost/apd sample_actions.py

The optional --historical option causes the actions to be triggered for all events already in the database. If it is omitted, the default behaviour applies: only data added to the database after the actions process has started is analysed.

The possible actions are:

  • apd.aggregation.actions.action.LoggingAction() - Log data points
  • apd.aggregation.actions.action.SaveToDatabaseAction() - Save data points to the db

These can be wrapped with OnlyOnChangeActionWrapper(subaction) to only trigger an action when the underlying value changes and/or with OnlyAfterDateActionWrapper(subaction, min_date) to only trigger if the date on the discovered objects is strictly after min_date.
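The on-change wrapper can be sketched as a stateful filter. This is illustrative only, not the library's implementation: it remembers the last value it forwarded and only invokes the wrapped action when the value differs.

```python
class OnlyOnChangeActionWrapper:
    # Illustrative sketch of the wrapper's behaviour.
    def __init__(self, subaction):
        self.subaction = subaction
        self.last_value = object()  # sentinel: nothing seen yet

    def handle(self, value):
        # Forward to the wrapped action only when the value changes.
        if value == self.last_value:
            return False
        self.last_value = value
        self.subaction(value)
        return True


seen = []
wrapper = OnlyOnChangeActionWrapper(seen.append)
for v in [18.0, 18.0, 17.5, 17.5, 18.0]:
    wrapper.handle(v)
# seen is now [18.0, 17.5, 18.0]: consecutive duplicates were suppressed
```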

The possible triggers are:

  • apd.aggregation.actions.trigger.ValueThresholdTrigger(...) - This compares the value of a sensor with threshold, using the specified comparator. Any records that don't match the sensor_name and deployment_id parameters are excluded.

Tips

The --db argument to all command-line tools can be omitted if the APD_DB_URI environment variable is set instead.
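A sketch of how this kind of fallback typically works (illustrative; resolve_db_uri is a hypothetical helper, not the tools' actual code): an explicit --db value wins, otherwise the environment variable is consulted.

```python
import os


def resolve_db_uri(cli_value=None, env=None):
    # Prefer an explicit --db value; otherwise fall back to APD_DB_URI.
    if env is None:
        env = os.environ
    uri = cli_value or env.get("APD_DB_URI")
    if uri is None:
        raise ValueError("No database URI given: pass --db or set APD_DB_URI")
    return uri
```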

Changes

1.0.1 (2020-05-21)

  • Re-release of 1.0.0 with renamed database tables. These were changed for clarity during the book's review process, and therefore were not appropriate for a migration. The only user of this code is believed to be the author. Please delete your database if you were using 1.0.0, or rename the tables and indexes manually.

1.0.0 (2020-01-27)

  • Added management of known sensor endpoints
  • Added CLI script to collate data
  • Added analysis tools for Jupyter
  • Added long-running data synthesis and actions system

Copyright (c) 2019, Matthew Wilkes

All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  • Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

  • Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

  • Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
