
airQ - Air Quality monitoring data ( for India ) collection system, written in Python3.


airQ v0.3.3

A near real time Air Quality Indication Data Collection Service ( for India ), made with :heart:

Consider putting :star: to show love & support

Companion repo located at : airQ-insight, to power visualization

what does it do ?

  • Air quality data collector, gathering data from 180+ ground monitoring stations ( spread across India )
  • An unreliable JSON dataset is fetched from here, giving the current hour's pollutant statistics from all monitoring stations across India; these are objectified, cleaned, processed & restructured into a proper format and pushed into a *.json file
  • Air quality data is given as the minimum, maximum & average presence of pollutants such as PM2.5, PM10, CO, NH3, SO2, OZONE & NO2, along with a timeStamp, grouped under the stations ( from where these were collected )
  • Automated data collection is done using systemd ( hourly )
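The exact schema of the generated *.json isn't documented here, so the record below is only a hypothetical illustration of the min/max/avg-per-pollutant grouping described above; the station name and field names like `pollutants` are assumptions, not the real output format:

```python
import json

# Hypothetical example of one airQ-collected station record; the real
# field names in the generated *.json may differ.
sample = json.loads("""
{
  "Anand Vihar, Delhi": [
    {
      "timeStamp": "2020-01-01 10:00:00",
      "pollutants": {
        "PM2.5": {"min": 80, "max": 120, "avg": 98},
        "PM10":  {"min": 150, "max": 210, "avg": 176}
      }
    }
  ]
}
""")

def worst_avg(station_records, pollutant):
    """Highest hourly average seen for a pollutant at one station."""
    return max(
        rec["pollutants"][pollutant]["avg"]
        for rec in station_records
        if pollutant in rec["pollutants"]
    )

print(worst_avg(sample["Anand Vihar, Delhi"], "PM2.5"))  # → 98
```

Once the dataset grows beyond one hour, the same per-station lists simply accumulate more timestamped records.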


airQ can easily be installed from PyPI using pip.

$ pip install airQ --user # or maybe use pip3
$ python3 -m pip install airQ --user # if the previous one doesn't work


After installing airQ, run it using the following command

$ cd # currently at $HOME
$ airQ # improper invocation
airQ - Air Quality Data Collector

	$ airQ `sink-file-path_( *.json )_`

 For making modifications on airQ-collected data
 ( collected prior to this run ),
 pass that JSON path, while invoking airQ ;)

Bad Input
$ airQ ./data/data.json # proper invocation


  • Well, my plan was to automate this data collection service, so that it keeps running hourly and keeps the dataset refreshed
  • For that I've used systemd, which uses a systemd.timer to trigger execution of airQ every hour, i.e. after a delay of 1h counted from the last execution of airQ, periodically
  • This requires adding two files, a *.service & a *.timer ( placed in ./systemd/ )


Well, our service isn't supposed to run always; it runs only when the timer trigger asks it to. So in the [Unit] section, it's declared that it Wants airQ.timer

Description=Air Quality Data collection service

You need to set the absolute path of the current working directory in the WorkingDirectory field of the [Service] section

ExecStart is the command to be executed when this service unit is invoked by airQ.timer, so the absolute installation path of airQ and the absolute sink path ( *.json ) are required

Make sure you update the User field, so the changes apply properly on your system.

If you just add a Restart field under the [Service] section & give it the value always, the script can be kept running at all times, which is helpful for running servers; but here we'll trigger execution using a systemd.timer, pretty much like cron, though better supported across almost all Linux-based distros

ExecStart=/absolute-path-to-airQ /home/user/data/data.json

This declaration makes this service a required dependency of airQ.timer
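Putting the fields above together, a minimal ./systemd/airQ.service might look like the following sketch; the paths, the user name & Type=oneshot are assumptions you should adapt to your own system:

```ini
[Unit]
Description=Air Quality Data collection service
Wants=airQ.timer

[Service]
; assumed: a collect-and-exit run fits Type=oneshot
Type=oneshot
; absolute path of your working directory
WorkingDirectory=/home/user
; absolute installation path of airQ + absolute sink path ( *.json )
ExecStart=/home/user/.local/bin/airQ /home/user/data/data.json
; update to match your system
User=user

[Install]
RequiredBy=airQ.timer
```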



Pretty much the same as airQ.service, except that it Requires airQ.service as one strong dependency, because that's the service to be run when this timer expires

Description=Air Quality Data collection service

The Unit field specifies which service file to execute when the timer expires. You can simply skip this field if you've created a ./systemd/*.service file with the same name as the ./systemd/*.timer

As we're interested in running this service every 1h ( relative to the last execution of airQ.service ), we've set the OnUnitActiveSec field to 1h


The [Install] section makes this timer a dependency of timers.target, so that the timer can be installed
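Correspondingly, ./systemd/airQ.timer could be sketched as below; the OnBootSec line is an assumption ( a timer carrying only OnUnitActiveSec has no base point until its service has run at least once ):

```ini
[Unit]
Description=Air Quality Data collection service
Requires=airQ.service

[Timer]
; which service to run when this timer expires
Unit=airQ.service
; assumed: run once shortly after boot, giving OnUnitActiveSec a base point
OnBootSec=1min
; re-run 1h after the last activation of airQ.service
OnUnitActiveSec=1h

[Install]
WantedBy=timers.target
```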


automation in ACTION

We need to place the files present in ./systemd/ into /etc/systemd/system/, so that systemd can find these service & timer units easily.

$ sudo cp ./systemd/* /etc/systemd/system/

We need to reload the systemd daemon, to let it discover the newly added service & timer unit(s).

$ sudo systemctl daemon-reload

Let's enable our timer, which ensures the timer keeps running even after a system reboot

$ sudo systemctl enable airQ.timer

Time to start this timer

$ sudo systemctl start airQ.timer

This causes an immediate execution of our script; after it completes, it'll be executed again 1h later, so that we get a refreshed dataset.

Check status of this timer

$ sudo systemctl status airQ.timer

Check status of this service

$ sudo systemctl status airQ.service

Consider running your instance of airQ in the cloud; mine is running on AWS LightSail


This service is only supposed to collect data & properly structure it; the visualization part is done at airQ-insight

Hoping it helps :wink:


Files for airQ, version 0.3.3

  • airQ-0.3.3-py3-none-any.whl ( 12.7 kB, Wheel, py3 )
  • airQ-0.3.3.tar.gz ( 9.9 kB, Source )
