Helper to access open-source gait datasets used by the MaD-Lab
Project description
mad-datasets
A helper to access open-source gait datasets of the MaD-Lab (and maybe external ones in the future).
The aim of this package is to ensure that all datasets can be loaded in a similar fashion and that all data (and annotations) are provided in the same format (i.e. the same sensor orientations, units, etc.). This should make it easy to run the same algorithm across multiple datasets.
:warning: While this makes it easier to work with the datasets, the coordinate systems and other conventions documented with the original datasets might not match the format you get when using this library!
All dataset APIs are built using the `tpcp.Dataset` interface.
For available datasets see the table below.
Usage
Install the package from PyPI using pip:

```
pip install mad-datasets
```
Then download/obtain the dataset you are planning to use (see below). The best way to get started is then to check the example for the respective dataset on the documentation page.
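To illustrate what the shared `tpcp.Dataset` interface looks like in practice, here is a minimal, hedged sketch. The import path, class name, and the `data_folder`/`.data` names are assumptions for illustration; the documentation example of each dataset shows the actual API.

```python
# Minimal usage sketch. The import path, class name, and the data_folder/.data
# names below are assumptions for illustration; check the documentation example
# of the dataset you want to use for the actual API.
from mad_datasets.egait_segmentation_validation_2014 import EgaitSegmentationValidation2014

# Point the dataset object to the folder containing the downloaded raw data.
dataset = EgaitSegmentationValidation2014(data_folder="/path/to/downloaded/dataset")

# Every dataset is a tpcp.Dataset: the index lists all available recordings ...
print(dataset.index)

# ... and iterating yields one recording at a time in the harmonized format.
for datapoint in dataset:
    imu_data = datapoint.data  # assumed attribute holding the loaded sensor data
    # run your algorithm on imu_data here
```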
Datasets
Dataset | Info Link | Download
---|---|---
EgaitSegmentationValidation2014 | https://www.mad.tf.fau.de/research/activitynet/digital-biobank/ | Email to data owner (see info link)
EgaitParameterValidation2013 | https://www.mad.tf.fau.de/research/activitynet/digital-biobank/ | Email to data owner (see info link)
StairAmbulationHealthy2021 | https://osf.io/sgbw7/ | https://osf.io/download/5ueq6/
SensorPositionDataset2019 | https://zenodo.org/record/5747173 | https://zenodo.org/record/5747173
Testing
The `/tests` directory contains a set of tests to check the functionality of the library.
However, most tests rely on the respective datasets being present in specific folders outside the library.
Therefore, the tests can only be run locally and not on the CI server.
To run them locally, make sure the datasets are downloaded into the correct folders and then run `poe test`.
Documentation (build instructions)
Like the tests, the documentation requires the datasets to be downloaded into the correct folders to execute the examples.
Therefore, we cannot build the docs automatically on RTD.
Instead, we host the docs via GitHub Pages.
The HTML source can be found in the `gh-pages` branch of this repo.
To make the deployment as easy as possible, we "mounted" the `gh-pages` branch as a submodule in the `docs/_build/html` folder.
Hence, before you attempt to build the docs, you need to initialize the submodule:

```
git submodule update --init --recursive
```

After that you can run `poe docs` to build the docs and then `poe upload_docs` to push the changes to the `gh-pages` branch.
We will always just update a single commit on the `gh-pages` branch to keep the effective repository size small.
**WARNING:** Don't delete the `docs/_build` folder manually or by running the sphinx make file!
This would delete the submodule and might cause issues.
The `poe` task is configured to clean all relevant files in the `docs/_build` folder before each run.
After an update of the documentation, you will see that you also need to make a commit in the main repo, as the commit hash of the docs submodule has changed.
To make sure you don't forget to update the docs, the `poe prepare_release` task will also build and upload the docs automatically.
File details
Details for the file `mad_datasets-0.4.0.tar.gz`.
File metadata
- Download URL: mad_datasets-0.4.0.tar.gz
- Upload date:
- Size: 30.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.1 CPython/3.11.0 Linux/5.15.0-1024-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5336931254ffef461168068bfd2c350e9208010e5862c9c8c1d95fa4f4def28a
MD5 | 47f2115e20dacad567864ffd8f062d56
BLAKE2b-256 | ca53faf1137bfe667e418c83b52422d7cdef43ea82e366ff09b55c689b51528e
File details
Details for the file `mad_datasets-0.4.0-py3-none-any.whl`.
File metadata
- Download URL: mad_datasets-0.4.0-py3-none-any.whl
- Upload date:
- Size: 38.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.1 CPython/3.11.0 Linux/5.15.0-1024-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8de51c059f0e2b9634b5001cbfb7d9574d9ee85605af25941600e5bf56376fa0
MD5 | cee3f33c9a2eac871b3e900d365fc48e
BLAKE2b-256 | c71a584125e430c7311aae6d5d1c1bf55a4aabaa27472c724904cf113c225b43
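If you want to verify a downloaded file against the hashes listed above, the following is a small sketch using Python's standard `hashlib` module; the file path is a placeholder for wherever you saved the archive.

```python
import hashlib

# Placeholder path to the downloaded archive; adjust to where you saved it.
path = "mad_datasets-0.4.0.tar.gz"

# Expected SHA256 digest, copied from the table above.
expected_sha256 = "5336931254ffef461168068bfd2c350e9208010e5862c9c8c1d95fa4f4def28a"

# Hash the file in chunks to avoid loading it fully into memory.
sha256 = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(8192), b""):
        sha256.update(chunk)

print("match" if sha256.hexdigest() == expected_sha256 else "MISMATCH")
```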