Gateway facilitating asynchronous communication between sensory data-emitting devices, InfluxDB and the user.

async-httpd-data-collector

Interface handling the communication between sensory data-emitting devices, InfluxDB and the user.

The most important object a user interacts with is DatabaseInterface in the ahttpdc.reads.interface module. This class facilitates the communication between the fetcher and the querying APIs of InfluxDB.

Fetching is controlled by two methods:

  • interface.daemon.enable();
  • interface.daemon.disable().

These methods control the thread in which the fetching process runs.
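
Below is a minimal sketch of toggling the fetching daemon. It assumes an interface has already been constructed as in section 1.1 below, and that enable() and disable() are ordinary (non-async) calls, as the list above suggests; the sleep interval is arbitrary.

import time

# `interface` is a DatabaseInterface built as in section 1.1
interface.daemon.enable()   # start the background fetching thread

time.sleep(60)              # let the daemon collect readings for a minute

interface.daemon.disable()  # stop the background fetching thread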

You can query data from the database using the methods prefixed with query_. For now there are three:

  • interface.query_latest(), which queries the latest measurement;
  • interface.query_historical(), which queries data from a given time range or a relative time (e.g. -3h);
  • interface.query(), which takes a user-supplied query as an argument.

Some examples are shown below:

1.1 Connecting to the database

import json
from ahttpdc.reads.interface import DatabaseInterface

# load the secrets
with open('../../../secrets/secrets.json', 'r') as f:
    secrets = json.load(f)

# define sensors
sensors = {
    'bmp180': ['altitude', 'pressure', 'seaLevelPressure'],
    'mq135': ['aceton', 'alcohol', 'co', 'co2', 'nh4', 'toulen'],
    'ds18b20': ['temperature'],
    'dht22': ['humidity'],
}

# define the interface to the database
interface = DatabaseInterface(
    secrets['host'],
    secrets['port'],
    secrets['token'],
    secrets['organization'],
    secrets['bucket'],
    sensors,
    secrets['dev_ip'],
    80,
    secrets['handle'],
)
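
For reference, the secrets.json file loaded above only needs the keys read in this snippet. The values below are placeholders (not real credentials), and the value types, e.g. whether port is a number or a string, are assumptions:

{
    "host": "<influxdb-host>",
    "port": 8086,
    "token": "<influxdb-api-token>",
    "organization": "<influxdb-organization>",
    "bucket": "<influxdb-bucket>",
    "dev_ip": "<device-ip>",
    "handle": "<device-handle>"
}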

1.2 Extracting the dataframe from the database

import pandas as pd
import asyncio
from datetime import datetime, timedelta
from pathlib import Path

# if a readings.csv file already exists, load it;
# if not, query the database and create it
readings_path = Path('../data/readings.csv')
if readings_path.is_file():
    sensor = pd.read_csv(readings_path)
else:
    sensor = await interface.query_historical('-30d')  # top-level await: run in an async context (e.g. a notebook cell)
    sensor.to_csv(readings_path)
sensor
time aceton alcohol altitude co co2 humidity nh4 pressure seaLevelPressure temperature toulen
0 2024-05-16 17:43:59.196399+00:00 0.41 1.17 149.92 3.38 402.54 37.4 3.93 999.35 1017.31 24.40 0.48
1 2024-05-16 17:44:01.768738+00:00 0.47 1.32 149.76 3.94 402.84 30.5 4.33 997.61 1015.56 24.03 0.55
2 2024-05-16 17:44:03.255309+00:00 0.96 2.62 149.54 9.16 405.25 49.1 7.35 999.14 1017.08 23.16 1.15
3 2024-05-16 17:44:04.618203+00:00 0.30 0.86 149.38 2.32 401.94 32.9 3.10 999.09 1017.02 23.05 0.35
4 2024-05-16 17:44:05.954714+00:00 1.31 3.50 149.37 13.13 406.82 48.8 9.21 998.04 1015.93 23.92 1.57
... ... ... ... ... ... ... ... ... ... ... ... ...
284122 2024-05-21 14:42:57.894312+00:00 1.35 3.62 150.08 13.68 407.03 47.6 9.46 998.85 1016.81 24.35 1.63
284123 2024-05-21 14:42:59.277937+00:00 1.08 2.92 149.87 10.48 405.79 49.3 8.00 998.58 1016.53 23.41 1.29
284124 2024-05-21 14:43:00.594968+00:00 0.38 1.09 149.97 3.09 402.38 33.8 3.71 999.59 1017.54 24.88 0.44
284125 2024-05-21 14:43:01.918239+00:00 1.41 3.77 150.13 14.38 407.29 44.4 9.76 998.51 1016.48 23.54 1.70
284126 2024-05-21 14:43:03.248095+00:00 1.24 3.32 150.50 12.29 406.50 48.8 8.84 998.85 1016.85 22.44 1.49

284127 rows × 12 columns
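
1.3 Querying the latest measurement and custom queries

The two remaining query methods can be used in the same way. The sketch below assumes that, like query_historical(), both are coroutines returning a pandas DataFrame, and that interface.query() accepts a Flux query string; the bucket name and time range are illustrative placeholders, not values from this project.

# query the most recent measurement from every sensor
latest = await interface.query_latest()
latest

# run a user-supplied query; adapt the bucket name and range to your setup
custom = await interface.query('from(bucket: "home") |> range(start: -1h)')
custom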
