airQ - Air Quality monitoring data collection system ( for India ), written in Python3.


airQ v0.3.3

A near real-time Air Quality Indication Data Collection Service ( for India ), made with :heart:

Consider putting :star: to show love & support

Companion repo located at airQ-insight, which powers the visualization

what does it do?

  • Collects air quality data from 180+ ground monitoring stations ( spread across India )
  • An unreliable JSON dataset is fetched from here, which gives the current hour's pollutant statistics from all monitoring stations across India; records are then objectified, cleaned, processed & restructured into a proper format and pushed into a *.json sink file
  • Air quality is reported as the minimum, maximum & average presence of pollutants such as PM2.5, PM10, CO, NH3, SO2, OZONE & NO2, along with a timeStamp, grouped under the stations from which they were collected ( see the sketch just after this list )
  • Data collection is automated using systemd ( hourly )
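
A minimal sketch of the assumed sink file layout ( station name, timestamp & numbers below are purely illustrative, not actual output ):

{
  "Some Station, Some City": [
    {
      "timeStamp": "2020-09-06 10:00:00",
      "PM2.5": { "min": 21.0, "max": 89.0, "avg": 48.0 },
      "NO2": { "min": 4.0, "max": 37.0, "avg": 16.0 }
    }
  ]
}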

installation

airQ can easily be installed from PyPI using pip.

$ pip install airQ --user # or maybe use pip3
$ python3 -m pip install airQ --user # if previous one doesn't work

usage

After installing airQ, run it using the following command

$ cd # currently at $HOME
$ airQ # improper invocation
airQ - Air Quality Data Collector

	$ airQ `sink-file-path_( *.json )_`

 For making modifications on airQ-collected data
 ( collected prior to this run ),
 pass that JSON path, while invoking airQ ;)

Bad Input
$ airQ ./data/data.json # proper invocation
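
For example, a typical flow might look like this ( ./data/ is just an illustrative sink location ):

$ mkdir -p ./data # make sure sink directory exists
$ airQ ./data/data.json # first run, collects & writes dataset
$ airQ ./data/data.json # later runs update previously collected data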

automation

  • Well, my plan was to automate this data collection service, so that it keeps running in an hourly fashion & keeps refreshing the dataset
  • For that I've used systemd, which uses a systemd.timer to trigger execution of airQ every hour, i.e. after a delay of 1h counted from the last execution of airQ
  • This requires adding two files, a *.service & a *.timer ( placed in ./systemd/ )

airQ.service

Well, our service isn't supposed to run all the time; it runs only when the timer trigger asks it to. So in the [Unit] section, it declares that it Wants airQ.timer

[Unit]
Description=Air Quality Data collection service
Wants=airQ.timer

You need to set the absolute path of the current working directory in the WorkingDirectory field of the [Service] unit declaration

ExecStart is the command to be executed when this service unit is invoked by airQ.timer, so the absolute installation path of airQ and the absolute sink path ( *.json ) are required

Make sure you update the User field to reflect your system properly.

If you add a Restart field under the [Service] unit & give it the value always, the script can be kept running all the time, which is helpful for running servers. But here we'll trigger execution using a systemd.timer, pretty much like cron, which is well supported in almost all Linux based distros ( a reference snippet of that always-on variant follows )
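
For reference, that always-running variant would just mean adding one line ( not what we're doing in this setup ):

[Service]
Restart=always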

[Service]
User=anjan
WorkingDirectory=/absolute-path-to-current-working-directory/
ExecStart=/absolute-path-to-airQ /home/user/data/data.json

This declaration makes this service a dependency of multi-user.target

[Install]
WantedBy=multi-user.target
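
Putting the pieces together, the complete ./systemd/airQ.service ( user & paths are placeholders, to be adapted to your system ):

[Unit]
Description=Air Quality Data collection service
Wants=airQ.timer

[Service]
User=anjan
WorkingDirectory=/absolute-path-to-current-working-directory/
ExecStart=/absolute-path-to-airQ /home/user/data/data.json

[Install]
WantedBy=multi-user.target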

airQ.timer

Pretty much the same as airQ.service, except it Requires airQ.service as a strong dependency, because that's the service to be run when this timer expires

[Unit]
Description=Air Quality Data collection service
Requires=airQ.service

The Unit field specifies which service unit to activate when the timer expires. You can simply skip this field if the ./systemd/*.service file has the same name as the ./systemd/*.timer file

As we're interested in running this service every 1h ( relative to the last activation of airQ.service ), we've set the OnUnitActiveSec field to 1h

[Timer]
Unit=airQ.service
OnUnitActiveSec=1h

This makes it a dependency of timers.target, so that the timer can be enabled

[Install]
WantedBy=timers.target
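
And the complete ./systemd/airQ.timer, assembled from the pieces above:

[Unit]
Description=Air Quality Data collection service
Requires=airQ.service

[Timer]
Unit=airQ.service
OnUnitActiveSec=1h

[Install]
WantedBy=timers.target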

automation in ACTION

Place the files present in ./systemd/ into /etc/systemd/system/, so that systemd can find these service & timer units easily.

$ sudo cp ./systemd/* /etc/systemd/system/

We need to reload the systemd daemon, to let it explore the newly added service & timer units.

$ sudo systemctl daemon-reload

Let's enable our timer, which ensures it gets started again even after a system reboot

$ sudo systemctl enable airQ.timer

Time to start this timer

$ sudo systemctl start airQ.timer

Once the timer fires, our script gets executed; after each completion, it'll be executed again 1h later, so that we get a refreshed dataset.

Check status of this timer

$ sudo systemctl status airQ.timer

Check status of this service

$ sudo systemctl status airQ.service
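
To see when the timer last fired & when the next trigger is due, you can also list it:

$ systemctl list-timers airQ.timer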

Consider running your instance of airQ in the cloud; mine is running on AWS LightSail

visualization

This service is only supposed to collect data & structure it properly; the visualization part is done at airQ-insight

Hoping it helps :wink:

