async-httpd-data-collector
Gateway facilitating asynchronous communication between sensor-data-emitting devices, InfluxDB, and the user.
Project description
Note: This is an older university project and is not actively maintained. It works as-is, but is not extensively tested.
A Python library that acts as an asynchronous gateway between IoT sensor devices and InfluxDB. It fetches JSON readings from a device (like a NodeMCU/ESP8266) over HTTP, parses them, and stores them as time-series data. You can then query the data back as pandas DataFrames.
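To illustrate the fetch-and-parse step, here is a minimal sketch of flattening one device reading into a single key-value map. The JSON shape and field names are assumptions for illustration; the real schema is defined by the device firmware, and the library's own parser may differ.

```python
import json

# Hypothetical payload from the device's HTTP endpoint; the field
# names are illustrative, the real schema is set by the firmware.
payload = json.loads(
    '{"bmp180": {"pressure": 1013.4, "temperature": 22.1},'
    ' "dht22": {"humidity": 41.5, "temperature": 21.8}}'
)

def flatten(readings):
    """Flatten {sensor: {param: value}} into {'sensor_param': value}."""
    return {
        f'{sensor}_{param}': value
        for sensor, params in readings.items()
        for param, value in params.items()
    }

flat = flatten(payload)
print(flat['bmp180_pressure'])  # 1013.4
```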
Started as a university project to get hands-on with async Python (asyncio, aiohttp) and time-series databases. The hardware side runs arduino-air-state-server - a NodeMCU board with air quality sensors (MQ135, BMP180, DHT22, DS18B20) that exposes readings at an HTTP endpoint.
Installation
pip install async-httpd-data-collector
Quick Start
from ahttpdc.read.database_interface import DatabaseInterface

# define which sensors and parameters to track
sensors = {
    'bmp180': ['altitude', 'pressure', 'seaLevelPressure', 'temperature'],
    'mq135': ['aceton', 'alcohol', 'co', 'co2', 'nh4', 'toulen'],
    'ds18b20': ['temperature'],
    'dht22': ['humidity', 'temperature'],
}

# connect to InfluxDB and the device
interface = DatabaseInterface(
    sensors,
    db_host='localhost',
    db_port=8086,
    db_token='your-influxdb-token',
    db_org='your-org',
    db_bucket='your-bucket',
    srv_ip='192.168.1.100',  # device IP
    srv_port=80,
    handle='circumstances',  # HTTP endpoint on the device
)

# start the background daemon - fetches and stores data continuously
interface.daemon.enable()

# query the last 30 days of data as a DataFrame
df = interface.query_historical('-30d')
print(df.head())

# stop the daemon when done
interface.daemon.disable()
How It Works
NodeMCU device            async-httpd-data-collector            InfluxDB
  (sensors)                                                     (storage)
      |                                                             |
      |--- HTTP GET /circumstances --->  AsyncFetcher               |
      |                                       |                     |
      |                                JSONInfluxParser             |
      |                                       |                     |
      |                                 AsyncCollector -----------> |
      |                                                             |
      |                                  AsyncQuery <-------------- |
      |                                       |                     |
      |                                  DataParser                 |
      |                                       |                     |
      |                               pandas DataFrame              |
The DatabaseInterface is the main entry point. It manages two things:
- DataDaemon - a background process (via multiprocessing) that periodically fetches sensor data from the device and stores it in InfluxDB.
- AsyncQuery - queries InfluxDB and returns results as pandas DataFrames with local timezone-adjusted timestamps.
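As a sketch of what the storage half of the daemon involves, here is a hypothetical conversion of flat sensor readings into InfluxDB line protocol. The measurement name and field naming are assumptions for illustration, not the library's actual output format.

```python
def to_line_protocol(measurement, readings, ts_ns):
    """Serialize flat readings into an InfluxDB line protocol record:
    <measurement> <field1>=<v1>,<field2>=<v2> <timestamp_ns>.
    Names here are illustrative, not the library's real schema."""
    fields = ','.join(f'{k}={v}' for k, v in sorted(readings.items()))
    return f'{measurement} {fields} {ts_ns}'

line = to_line_protocol(
    'circumstances',
    {'dht22_humidity': 41.5, 'dht22_temperature': 21.8},
    1716000000000000000,
)
print(line)
# circumstances dht22_humidity=41.5,dht22_temperature=21.8 1716000000000000000
```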
Querying
# latest reading
df = interface.query_latest()
# last 3 hours
df = interface.query_historical('-3h')
# specific time range
df = interface.query_historical('2024-05-16T00:00:00Z', '2024-05-21T00:00:00Z')
# custom Flux query
df = interface.query_custom_sync('from(bucket:"my-bucket") |> range(start: -1d) |> last()')
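The local timezone adjustment mentioned above can be sketched with pandas: InfluxDB returns timestamps in UTC, and `tz_convert` shifts a tz-aware index into a local zone. The zone name below is only an example, not something the library hardcodes.

```python
import pandas as pd

# InfluxDB timestamps come back in UTC; shift them to a local zone.
# 'Europe/Warsaw' is an example zone chosen for illustration.
df = pd.DataFrame(
    {'temperature': [21.8, 22.0]},
    index=pd.to_datetime(['2024-05-16T12:00:00Z', '2024-05-16T12:05:00Z']),
)
df.index = df.index.tz_convert('Europe/Warsaw')
print(df.index[0])  # 2024-05-16 14:00:00+02:00 (CEST, UTC+2 in May)
```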
Related Projects
- arduino-air-state-server - the NodeMCU firmware that collects sensor readings and serves them over HTTP
- air-quality-data-analysis - Jupyter notebooks with data analysis (heatmaps, correlations, anomaly detection) and SARIMAX time-series forecasting on the collected data
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file async_httpd_data_collector-2.0.3.tar.gz.
File metadata
- Download URL: async_httpd_data_collector-2.0.3.tar.gz
- Upload date:
- Size: 49.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | f6a627231d9ab0b06836664c00ecee37485ca4d04f9a7c3d6b18cfba448c6b46 |
| MD5 | 0bf804d43b8cc9fdf08fbb94957ba835 |
| BLAKE2b-256 | 42a62f1f3bbebc6c26519b3962ab23914f6142ae7104e350231372d84103eca7 |
File details
Details for the file async_httpd_data_collector-2.0.3-py3-none-any.whl.
File metadata
- Download URL: async_httpd_data_collector-2.0.3-py3-none-any.whl
- Upload date:
- Size: 37.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e9c5e5e6ed60ca2f1d9ae08153c48eb9056ab9a5df148dc0d2ffc925377cefca |
| MD5 | f5d3dad1d50e3acac9b05f270cdcdae3 |
| BLAKE2b-256 | b98f73a2af5a124c022e5b9afba0e9bf990fefd82c45fd7ca3e2e3afeb239193 |