Biobanking data processing, annotation, and association workflows

Biobanking

Systematic collection, processing, storage, and analysis of biological samples and associated health records for medical research.

Supported pipelines

Preprocess

Contains biobank-specific modules for EHR data collection, cleaning, and processing.

QC (Under construction)

Will contain biobank-specific modules for variant quality control and filtering.

Annotation (Under construction)

Will contain biobank-specific modules for variant annotation.

Association

Contains biobank-specific modules for genotype-phenotype association tests.

Supported biobanks

All of Us

The All of Us biobank couples whole-genome sequencing data with electronic health records for more than 400k individuals, and it continues to expand.

UK Biobank (Under construction)

The UK Biobank couples whole-genome sequencing data with electronic health records for ~500k participants.

AoU REGENIE workflow

The All of Us association utilities currently support a packaged regenie workflow with three Step 2 modes:

  • Burden association testing
  • Mask-only runs for writing burden-mask PLINK datasets
  • Interaction testing using the same burden inputs and optional interaction flags
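To make the three modes concrete, here is a minimal sketch of how they might map onto regenie Step 2 command-line flags. The helper name, its signature, and the placeholder file names are hypothetical; only the regenie flags themselves (`--anno-file`, `--set-list`, `--mask-def`, `--write-mask`, `--interaction`) come from the regenie documentation, not from this package's code.

```python
# Hypothetical helper: assemble extra regenie Step 2 flags per run mode.
# File names below are placeholders, not the package's actual inputs.
from typing import Optional

def step2_flags(mode: str, interaction_var: Optional[str] = None) -> list[str]:
    """Return extra regenie Step 2 flags for a given run mode."""
    # All three modes share the same burden-mask inputs.
    flags = ["--anno-file", "annotations.tsv",
             "--set-list", "sets.tsv",
             "--mask-def", "masks.tsv"]
    if mode == "burden":
        pass  # standard gene-based burden test
    elif mode == "mask":
        # Write the burden-mask genotypes as a PLINK dataset
        # instead of running the association test.
        flags += ["--write-mask"]
    elif mode == "interaction":
        if interaction_var is None:
            raise ValueError("interaction mode requires an interaction variable")
        flags += ["--interaction", interaction_var]
    else:
        raise ValueError(f"unknown mode: {mode}")
    return flags
```

All three modes share the burden inputs, so switching modes only changes the trailing flags, which is why they can reuse the same Step 1 predictions.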

The workflow implementation lives in src/biobanking/workflows/regenie.wdl, and the Python utilities live in src/biobanking/association/aou.py.

The tracking model is phenotype-centered:

  • Step 1 is tracked once per phenotype prefix
  • Step 2 runs are tracked separately by mode
  • Workflow metadata is written locally and synced to the workspace bucket

This keeps LOCO and prediction reuse aligned with the phenotype definition rather than with any specific burden or interaction run.

Recommended usage pattern

  • Run or reuse Step 1 once per phenotype prefix.
  • Use burden runs for standard gene-based tests.
  • Use mask runs to materialize chromosome-wide or gene-specific burden-mask PLINK files.
  • Use interaction runs only after Step 1 exists for the phenotype prefix you are testing.
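The ordering above can be sketched as a small planner for one phenotype prefix. `plan_runs` and its run labels are hypothetical stand-ins for the package's actual entry points; the sketch only encodes the rules listed above (Step 1 once per prefix, interaction runs only once Step 1 exists).

```python
# Hypothetical planner: order the runs for one phenotype prefix
# following the recommended usage pattern.
def plan_runs(prefix: str, step1_done: bool, want_interaction: bool) -> list[str]:
    """Return the ordered list of runs to launch for a phenotype prefix."""
    runs = []
    if not step1_done:
        runs.append(f"step1:{prefix}")       # run (or reuse) Step 1 once per prefix
    runs.append(f"step2-burden:{prefix}")    # standard gene-based tests
    runs.append(f"step2-mask:{prefix}")      # materialize burden-mask PLINK files
    if want_interaction:
        # interaction runs reuse the Step 1 predictions for this prefix
        runs.append(f"step2-interaction:{prefix}")
    return runs
```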

More detailed usage examples are in docs/workflows.md.

Internal use

python -m pip install -U pip build twine
# linux
rm -rf dist build *.egg-info src/*.egg-info
# windows
Remove-Item -Recurse -Force dist, build, *.egg-info, src\*.egg-info
python -m build
pip install dist/biobanking-0.0.10-py3-none-any.whl
# smoke-test imports from the freshly built wheel before publishing
python -c "from biobanking.association.aou import REGENIE; regenie = REGENIE(); from biobanking.preprocess.aou.measurements import save_measurements_in_wide_format; print('import ok')"
twine upload dist/*

