
Python wrapper for DHIS2

Project description


A Python library for DHIS2 that wraps requests for general-purpose API interaction with DHIS2. It aims to be useful for data/metadata import and export tasks and includes utilities such as file loading, UID generation and logging. The focus is on JSON.

Supported and tested on Linux/macOS, Windows and DHIS2 versions >= 2.25. Python 3.6+ is required.

1 Installation

Python 3.6+ is required.

pip install dhis2.py

For instructions on installing Python / pip for your operating system see realpython.com/installing-python.

2 Quickstart

Create an Api object:

from dhis2 import Api

api = Api('play.dhis2.org/demo', 'admin', 'district')

Then run requests on it:

r = api.get('organisationUnits/Rp268JB6Ne4', params={'fields': 'id,name'})

print(r.json())
# { "name": "Adonkia CHP", "id": "Rp268JB6Ne4" }

r = api.post('metadata', json={'dataElements': [ ... ] })
print(r.status_code) # 200
The available request methods are:

  • api.get()

  • api.post()

  • api.put()

  • api.patch()

  • api.delete()

See below for more methods.

They all return a requests Response object, except where noted otherwise. This means its methods and attributes are available as usual (e.g. Response.url, Response.text, Response.status_code, etc.).
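As a quick sketch (the system/info endpoint and the output shown in the comments are only illustrative), the usual requests attributes can be inspected on any returned response:

r = api.get('system/info')

print(r.status_code)   # e.g. 200
print(r.url)           # e.g. 'https://play.dhis2.org/demo/api/system/info'
print(r.text[:50])     # raw response body as text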

3 Usage

3.1 Api instance creation

3.1.1 Authentication in code

Create an Api object:

from dhis2 import Api

api = Api('play.dhis2.org/demo', 'admin', 'district')

optional arguments:

  • api_version: DHIS2 API version

  • user_agent: submit your own User-Agent header. This is useful if you later need to identify your requests, e.g. when parsing Nginx logs (see the example below).
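For example, both optional arguments can be passed when creating the Api object (a sketch, assuming Api() accepts the same optional keyword arguments shown for Api.from_auth_file() below; the values are placeholders):

from dhis2 import Api

api = Api('play.dhis2.org/demo', 'admin', 'district',
          api_version=33, user_agent='myApp/1.0')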

3.1.2 Authentication from file

Load from an auth JSON file in order to avoid storing credentials in scripts. The file must have the following structure:

{
  "dhis": {
    "baseurl": "http://localhost:8080",
    "username": "admin",
    "password": "district"
  }
}
from dhis2 import Api

api = Api.from_auth_file('path/to/auth.json', api_version=29, user_agent='myApp/1.0')

If no file path is specified, it tries to find a file called dish.json in:

  1. the DHIS_HOME environment variable

  2. your Home folder
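If the file sits in one of those default locations, the path can be omitted entirely (a minimal sketch, assuming a dish.json exists in DHIS_HOME or your home folder):

from dhis2 import Api

api = Api.from_auth_file()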

3.1.3 Get info about the DHIS2 instance

API version as a string:

print(api.version)
# '2.30'

API version as an integer:

print(api.version_int)
# 30

API revision / build:

print(api.revision)
# '17f7f0b'

API URL:

print(api.api_url)
# 'https://play.dhis2.org/demo/api/30'

Base URL:

print(api.base_url)
# 'https://play.dhis2.org/demo'

System info (this is persisted across the session):

print(api.info)
# {
#   "lastAnalyticsTableRuntime": "11 m, 51 s",
#   "systemId": "eed3d451-4ff5-4193-b951-ffcc68954299",
#   "contextPath": "https://play.dhis2.org/2.30",
#   ...

3.2 Getting things

Normal method: api.get(), e.g.

r = api.get('organisationUnits/Rp268JB6Ne4', params={'fields': 'id,name'})
data = r.json()

Parameters:

  • timeout: override the timeout value (default: 5 seconds) to prevent the client from waiting indefinitely for a server response (see the example below).
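For example, to give the server more time than the default (a sketch; the endpoint and value are only illustrative):

# wait up to 30 seconds for the server instead of the 5-second default
r = api.get('dataElements', params={'fields': 'id,name'}, timeout=30)
data = r.json()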

3.2.1 Paging

Paging for larger GET requests via api.get_paged()

Two possible ways:

  1. Process every page as it comes in:

for page in api.get_paged('organisationUnits', page_size=100):
    print(page)
    # { "organisationUnits": [ {...}, {...} ] } (100 organisationUnits)
  2. Load all pages before proceeding (this may take a long time). To do this, do not use a for loop and add merge=True:

all_pages = api.get_paged('organisationUnits', page_size=100, merge=True)
print(all_pages)
# { "organisationUnits": [ {...}, {...} ] } (all organisationUnits)

Note: unlike normal GETs, this returns a JSON object directly, not a requests.Response object.

3.2.2 SQL Views

Get SQL View data as if you’d open a CSV file, optimized for larger payloads, via api.get_sqlview()

# poll a sqlView of type VIEW or MATERIALIZED_VIEW:
for row in api.get_sqlview('YOaOY605rzh', execute=True, criteria={'name': '0-11m'}):
    print(row)
    # {'code': 'COC_358963', 'name': '0-11m'}

# similarly, poll a sqlView of type QUERY:
for row in api.get_sqlview('qMYMT0iUGkG', var={'valueType': 'INTEGER'}):
    print(row)

# if you want a list directly, cast it to a ``list`` or add ``merge=True``:
data = list(api.get_sqlview('qMYMT0iUGkG', var={'valueType': 'INTEGER'}))
# OR
# data = api.get_sqlview('qMYMT0iUGkG', var={'valueType': 'INTEGER'}, merge=True)

Note: unlike normal GETs, this returns a JSON object directly, not a requests.Response object.

Beginning with 2.26 you can also use normal filtering on sqlViews. In that case, it’s recommended to use the stream=True parameter of the api.get() method (see the sketch below).
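A hedged sketch of such a request (the sqlView UID and the filter value are placeholders, and file_type='csv' plus the forwarding of stream=True to requests are assumptions here):

# filter rows of a sqlView directly via the Web API (DHIS2 >= 2.26);
# stream=True avoids loading the whole response into memory at once
r = api.get('sqlViews/YOaOY605rzh/data',
            params={'filter': 'name:like:0-11m'},
            file_type='csv',
            stream=True)

for line in r.iter_lines(decode_unicode=True):
    print(line)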

3.2.3 GET other content types

Usually defaults to JSON but you can get other file types:

r = api.get('organisationUnits/Rp268JB6Ne4', file_type='xml')
print(r.text)
# <?xml version='1.0' encoding='UTF-8'?><organisationUnit ...

r = api.get('organisationUnits/Rp268JB6Ne4', file_type='pdf')
with open('/path/to/file.pdf', 'wb') as f:
    f.write(r.content)

3.3 Updating / deleting things

Normal methods:

  • api.post()

  • api.put()

  • api.patch()

  • api.delete()
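For example, a single object can be updated and removed like this (a sketch, assuming put() and delete() accept the same arguments as post(); the UID is a placeholder from the demo database):

# a PUT replaces the whole object, so fetch it first, change it, and send it back
payload = api.get('dataElements/fbfJHSPpUQD').json()
payload['shortName'] = 'ANC 1st'

r = api.put('dataElements/fbfJHSPpUQD', json=payload)
print(r.status_code)  # 200

# delete the object again
r = api.delete('dataElements/fbfJHSPpUQD')
print(r.status_code)  # 200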

3.3.1 Post partitioned payloads

If you have such a large payload (e.g. metadata imports) that you frequently get an HTTP Error: 413 Request Entity Too Large response (e.g. from Nginx), you might benefit from the following method, which splits your payload into partitions / chunks and posts them one by one. You define the number of elements in each POST by specifying a number in thresh (default: 1000).

Note that it is only possible to submit one key per payload (e.g. dataElements only, not additionally organisationUnits in the same payload).

api.post_partitioned()

import json

data = {
    "organisationUnits": [
        {...},
        {...} # very large number of org units
    ]
}
for response in api.post_partitioned('metadata', json=data, thresh=5000):
    text = json.loads(response.text)
    print('[{}] - {}'.format(text['status'], json.dumps(text['stats'])))

3.4 Multiple params with same key

If you need to pass multiple parameters with the same key to your request, you may submit them as a list of tuples instead, e.g.:

r = api.get('dataValueSets', params=[
        ('dataSet', 'pBOMPrpg1QX'), ('dataSet', 'BfMAe6Itzgt'),
        ('orgUnit', 'YuQRtpLP10I'), ('orgUnit', 'vWbkYPRmKyS'),
        ('startDate', '2013-01-01'), ('endDate', '2013-01-31')
    ]
)

alternatively:

r = api.get('dataValueSets', params={
    'dataSet': ['pBOMPrpg1QX', 'BfMAe6Itzgt'],
    'orgUnit': ['YuQRtpLP10I', 'vWbkYPRmKyS'],
    'startDate': '2013-01-01',
    'endDate': '2013-01-31'
})

3.5 Utilities

3.5.1 Load JSON file

from dhis2 import load_json

json_data = load_json('/path/to/file.json')
print(json_data)
# { "id": ... }

3.5.2 Load CSV file

Via a Python generator:

from dhis2 import load_csv

for row in load_csv('/path/to/file.csv'):
    print(row)
    # { "id": ... }

Via a normal list, loaded fully into memory:

data = list(load_csv('/path/to/file.csv'))

3.5.3 Generate UID

Create a DHIS2 UID:

from dhis2 import generate_uid

uid = generate_uid()
print(uid)
# 'Rp268JB6Ne4'

To create a list of 1000 UIDs:

uids = [generate_uid() for _ in range(1000)]

3.5.4 Validate UID

Check if something is a valid DHIS2 UID:

from dhis2 import is_valid_uid

uid = 'MmwcGkxy876'
print(is_valid_uid(uid))
# True

uid = 25329
print(is_valid_uid(uid))
# False

uid = 'MmwcGkxy876 '
print(is_valid_uid(uid))
# False

3.5.5 Clean an object

Useful for deep-removing certain keys in an object, e.g. remove all sharing by recursively removing all user and userGroupAccesses fields.

from dhis2 import clean_obj, pretty_json

metadata = {
    "dataElements": [
        {
            "name": "ANC 1st visit",
            "id": "fbfJHSPpUQD",
            "publicAccess": "rw------",
            "userGroupAccesses": [
                {
                    "access": "r-r-----",
                    "userGroupUid": "Rg8wusV7QYi",
                    "displayName": "HIV Program Coordinators",
                    "id": "Rg8wusV7QYi"
                },
                {
                    "access": "rwr-----",
                    "userGroupUid": "qMjBflJMOfB",
                    "displayName": "Family Planning Program",
                    "id": "qMjBflJMOfB"
                }
            ]
        }
    ],
    "dataSets": [
        {
            "name": "ART monthly summary",
            "id": "lyLU2wR22tC",
            "publicAccess": "rwr-----",
            "userGroupAccesses": [
                {
                    "access": "r-rw----",
                    "userGroupUid": "GogLpGmkL0g",
                    "displayName": "_DATASET_Child Health Program Manager",
                    "id": "GogLpGmkL0g"
                }
            ]
        }
    ]
}


cleaned = clean_obj(metadata, ['userGroupAccesses', 'publicAccess'])
pretty_json(cleaned)

This recursively removes all keys matching userGroupAccesses or publicAccess:

{
  "dataElements": [
    {
      "name": "ANC 1st visit",
      "id": "fbfJHSPpUQD"
    }
  ],
  "dataSets": [
    {
      "name": "ART monthly summary",
      "id": "lyLU2wR22tC"
    }
  ]
}

3.5.6 Check import response

Check the importSummary response from e.g. a /api/dataValues or /api/metadata import. Returns True if the import went well, False if there are ignored values or if the status is not OK or SUCCESS. This can be useful when the response returns a 200 OK but the summary reports ignored data.

from dhis2 import import_response_ok

# response as e.g. from response = api.post('metadata', data=payload).json()
response = {
    "description": "The import process failed: java.lang.String cannot be cast to java.lang.Boolean",
    "importCount": {
        "deleted": 0,
        "ignored": 1,
        "imported": 0,
        "updated": 0
    },
    "responseType": "ImportSummary",
    "status": "WARNING"
}

import_successful = import_response_ok(response)
print(import_successful)
# False

3.6 Logging

Logging utilizes logzero.

  • Color output depending on log level

  • DHIS2 log format including the line of the caller

  • optional logfile= specifies a rotating log file path (20 x 10MB files)

from dhis2 import setup_logger, logger

setup_logger(logfile='/var/log/app.log')

logger.info('my log message')
logger.warning('missing something')
logger.error('something went wrong')
logger.exception('with stacktrace')
* INFO  2018-06-01 18:19:40,001  my log message [script:86]
* ERROR  2018-06-01 18:19:40,007  something went wrong [script:87]

Use setup_logger(include_caller=False) if you want to remove [script:86] from logs.

3.7 Exceptions

There are two exceptions:

  • RequestException: DHIS2 didn’t like what you requested. See the exception’s code, url and description.

  • ClientException: Something didn’t work with the client not involving DHIS2.

They both inherit from Dhis2PyException.
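A minimal sketch of handling both, assuming they can be imported from the dhis2 package and expose the attributes listed above:

from dhis2 import Api, RequestException, ClientException

api = Api('play.dhis2.org/demo', 'admin', 'district')

try:
    r = api.get('organisationUnits/doesNotExist')
except RequestException as e:
    # DHIS2 rejected the request (e.g. a 404 or a validation error)
    print(e.code, e.url, e.description)
except ClientException as e:
    # a client-side problem, e.g. bad arguments or a missing auth file
    print(e)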

4 Examples

  • Real-world script examples can be found in the examples folder.

  • dhis2.py is used in dhis2-pk (dhis2-pocket-knife)

5 Changelog

Versions changelog

6 Contribute

Feedback welcome!

  • Add issue

  • Install the dev environment (see below)

  • Fork the repository, add changes to the master branch, ensure tests pass with full coverage, and open a Pull Request

pip install pipenv
git clone https://github.com/davidhuser/dhis2.py
cd dhis2.py
pipenv install --dev
pipenv run tests

# install pre-commit hooks
pipenv run pre-commit install

# run type annotation check
pipenv run mypy dhis2

# run flake8 style guide enforcement
pipenv run flake8

7 License

dhis2.py’s source is provided under the MIT license. See LICENCE for details.

  • Copyright (c), 2020, David Huser

