
MetaWards disease metapopulation modelling


For the most accurate and up to date information please visit the project website.

This is a Python port of the MetaWards package originally written by Leon Danon. The port is kept in sync with the original C code, with checks in place to ensure that the two codes give identical results. This improves the robustness of both codes, as it minimises the scope for bugs to evade detection in both the C and Python implementations.

The aim of this port is to make it easier for others to contribute to the program, to improve robustness by adding unit and integration tests, and to open up scope for further optimisation and parallelisation.

The package makes heavy use of cython, which is used with OpenMP to compile the bottleneck parts of the code to parallelised C. This enables the Python port to run at approximately the same speed as the original C program on one core, and several times faster across multiple cores.

The program compiles on any system that has a working C compiler with OpenMP support and a working Python >= 3.7. This includes X86-64 and ARM64 servers.

The software supports running over a cluster using MPI (via mpi4py) or via simple networking (via scoop).

Full instructions on how to use the program, plus example job submission scripts can be found on the project website.

Data

The data and input parameters needed to use this package are stored in the MetaWardsData repository. Please make sure that you clone this repository to your computer and supply the full path to that repository to the program when it runs. There are three ways to do this:

  1. Set the METAWARDSDATA environment variable to point to this directory, e.g. export METAWARDSDATA=$HOME/GitHub/MetaWardsData

  2. Pass the repository variable to the input data classes Disease, InputFiles and Parameters.

  3. Or simply clone into the directory $HOME/GitHub/MetaWardsData, as this is the default path.
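For illustration, the lookup order described above can be sketched in a few lines of Python. This is an illustrative helper, not part of the MetaWards API, and the package's own path resolution may differ:

```python
import os
from pathlib import Path

def find_metawardsdata():
    """Locate the MetaWardsData repository: honour the METAWARDSDATA
    environment variable if set, otherwise fall back to the default
    clone location, $HOME/GitHub/MetaWardsData."""
    env_path = os.environ.get("METAWARDSDATA")
    if env_path:
        return Path(env_path)
    return Path.home() / "GitHub" / "MetaWardsData"
```

The resulting path is what you would supply as the repository location when constructing the input data classes.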

References

These are the references behind the original C code:

  • "Individual identity and movement networks for disease metapopulations" Matt J. Keeling, Leon Danon, Matthew C. Vernon, Thomas A. House Proceedings of the National Academy of Sciences May 2010, 107 (19) 8866-8870; DOI: 10.1073/pnas.1000416107

  • "A spatial model of CoVID-19 transmission in England and Wales: early spread and peak timing" Leon Danon, Ellen Brooks-Pollock, Mick Bailey, Matt J Keeling medRxiv 2020.02.12.20022566; doi: 10.1101/2020.02.12.20022566

Dependencies

The code requires Python 3.7 or above. For development you will need cython and a working C compiler to build the code, plus pytest for running the tests.

Installation

Full installation instructions are here.

As you are here, I guess you want to install the latest code from GitHub ;-)

To do that, type:

git clone https://github.com/metawards/MetaWards
cd MetaWards
pip install -r requirements-dev.txt
CYTHONIZE=1 python setup.py build
CYTHONIZE=1 python setup.py install

Alternatively, you can also use the makefile, e.g.

make
make install

(assuming that python is version 3.7 or above)

You can run tests using pytest, e.g.

METAWARDSDATA="/path/to/MetaWardsData" pytest tests

You can generate the docs using

make docs

Running

Full usage instructions are here.

You can either load and use the Python classes directly, or you can run the metawards front-end command line program that is automatically installed.

metawards --help

will print out all of the help for the program. For example:

metawards --input tests/data/ncovparams.csv --seed 15324 --nsteps 30 --nthreads 1

This duplicates a run of the MetaWards C program bundled in this repository, which was run using:

./original/metawards 15324 tests/data/ncovparams.csv 0 1.0

The original C code, command line and expected output are in the original directory that is bundled in this repo.

Running an ensemble

This program supports parallel running of an ensemble of jobs using multiprocessing for single-node jobs, and mpi4py or scoop for multi-node cluster jobs.

Full instructions for running on a cluster are here.
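For the single-node case, the pattern is a standard multiprocessing pool: one worker process per ensemble member, each running the model with a different seed. The sketch below uses a placeholder run_model function to stand in for a real MetaWards run; neither run_model nor run_ensemble is part of the package API:

```python
from multiprocessing import Pool

def run_model(seed):
    """Placeholder for one ensemble member: a real run would invoke
    MetaWards with this seed and collect its outputs."""
    return {"seed": seed, "result": seed * 2}

def run_ensemble(seeds, nprocs=4):
    """Run one model per seed across a pool of worker processes."""
    with Pool(processes=nprocs) as pool:
        return pool.map(run_model, seeds)

if __name__ == "__main__":
    results = run_ensemble([15324, 15325, 15326])
```

On a multi-node cluster, the pool would be replaced by the equivalent mpi4py or scoop mechanism.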

