
Biobanking data processing, annotation, and association workflows

Project description

Biobanking

Systematic collection, processing, storage, and analysis of biological samples and associated health records for medical research.

Supported pipelines

Preprocess

Contains biobank-specific modules for EHR data collection, cleaning, and processing.

QC (Under construction)

Will contain biobank-specific modules for variant quality control and filtering.

Annotation (Under construction)

Will contain biobank-specific modules for variant annotation.

Association

Contains biobank-specific modules for genotype-phenotype association tests.

Supported biobanks

All of Us

The All of Us biobank pairs whole genome sequencing data with electronic health records for more than 400,000 participants, and continues to expand.

UK Biobank (Under construction)

The UK Biobank pairs whole genome sequencing data with electronic health records for roughly 500,000 participants.

AoU REGENIE workflow

The All of Us association utilities currently support a packaged regenie workflow with three Step 2 modes:

  • Burden association testing
  • Mask-only runs for writing burden-mask PLINK datasets
  • Interaction testing using the same burden inputs and optional interaction flags

The workflow implementation lives in src/biobanking/workflows/regenie.wdl, and the Python utilities live in src/biobanking/association/aou.py.

The tracking model is phenotype-centered:

  • Step 1 is tracked once per phenotype prefix
  • Step 2 runs are tracked separately by mode
  • Workflow metadata is written locally and synced to the workspace bucket

This keeps LOCO and prediction reuse aligned with the phenotype definition rather than with any specific burden or interaction run.
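
The phenotype-centered bookkeeping above can be modeled with a small sketch (the `PhenotypeTracker` class, its field names, and the run IDs are illustrative, not the package's actual metadata schema):

```python
class PhenotypeTracker:
    """Toy model of the phenotype-centered tracking described above."""

    def __init__(self):
        self.step1 = {}  # phenotype prefix -> Step 1 metadata (e.g. LOCO prediction list)
        self.step2 = {}  # (phenotype prefix, mode) -> list of Step 2 run records

    def record_step1(self, prefix, pred_list):
        # Step 1 is tracked once per phenotype prefix, so a rerun replaces it in place
        self.step1[prefix] = {"pred_list": pred_list}

    def record_step2(self, prefix, mode, run_id):
        # Step 2 runs are tracked separately by mode, but always hang off a
        # Step 1 entry for the same phenotype prefix
        if prefix not in self.step1:
            raise RuntimeError(f"no Step 1 recorded for phenotype prefix {prefix!r}")
        self.step2.setdefault((prefix, mode), []).append(run_id)

tracker = PhenotypeTracker()
tracker.record_step1("bmi", "bmi_pred.list")
tracker.record_step2("bmi", "burden", "run-001")
tracker.record_step2("bmi", "interaction", "run-002")
```

Because LOCO predictions live on the Step 1 entry, any number of burden, mask, or interaction runs for the same phenotype reuse them without duplication.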

Recommended usage pattern

  • Run or reuse Step 1 once per phenotype prefix.
  • Use burden runs for standard gene-based tests.
  • Use mask runs to materialize chromosome-wide or gene-specific burden-mask PLINK files.
  • Use interaction runs only after Step 1 exists for the phenotype prefix you are testing.
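
A minimal sketch of the reuse check behind the first bullet, assuming Step 1 output follows regenie's `<prefix>_pred.list` naming convention (the `needs_step1` helper is hypothetical, not part of this package):

```python
import os
import tempfile

def needs_step1(prefix: str, out_dir: str) -> bool:
    """Run Step 1 only if its prediction list is not already on disk
    for this phenotype prefix (regenie Step 1 writes <prefix>_pred.list)."""
    return not os.path.exists(os.path.join(out_dir, f"{prefix}_pred.list"))

with tempfile.TemporaryDirectory() as d:
    assert needs_step1("bmi", d)                        # nothing on disk -> run Step 1
    open(os.path.join(d, "bmi_pred.list"), "w").close() # simulate a finished Step 1
    assert not needs_step1("bmi", d)                    # pred list present -> reuse it
```

The same check gates interaction runs (last bullet): if it reports that Step 1 is missing for the phenotype prefix, run Step 1 before any Step 2 mode.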

More detailed usage examples are in docs/workflows.md.

Internal use

python -m pip install -U pip build twine
# linux / macOS
rm -rf dist build *.egg-info src/*.egg-info
# windows (PowerShell)
Remove-Item -Recurse -Force dist, build, *.egg-info, src\*.egg-info
python -m build
pip install dist/biobanking-0.0.11-py3-none-any.whl
python -c "from biobanking.association.aou import REGENIE; from biobanking.preprocess.aou.measurements import save_measurements_in_wide_format; REGENIE(); print('import ok')"
twine upload dist/*
