VRS AnVIL Toolkit

Project Overview

This Python package is designed to process Variant Call Format (VCF) files and perform lookup operations on GA4GH Variation Representation Specification (VRS) identifiers. GA4GH VRS identifiers provide a standardized way to represent genomic variation, making it easier to exchange and share genomic information.

In addition, this project facilitates the retrieval of evidence associated with genomic alleles by leveraging the GA4GH Meta-Knowledgebase (MetaKB) service. GA4GH MetaKB provides a comprehensive knowledge base that links genomic variants to relevant clinical variant interpretations.
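To make the VRS identifier concept above concrete, below is a rough, illustrative sketch of a VRS 1.x-style Allele object and the general shape of its computed identifier; the coordinates and digest strings are placeholders, not real values.

# Illustrative only: a VRS 1.x-style Allele and the shape of its computed
# identifier. The digest strings below are placeholders, not real values.
allele = {
    "type": "Allele",
    "location": {
        "type": "SequenceLocation",
        "sequence_id": "ga4gh:SQ.<sequence-digest>",  # e.g. a GRCh38 chromosome
        "interval": {
            "type": "SequenceInterval",
            "start": {"type": "Number", "value": 55181319},
            "end": {"type": "Number", "value": 55181320},
        },
    },
    "state": {"type": "LiteralSequenceExpression", "sequence": "T"},
}

# Computed identifiers are derived from a digest of the normalized object and
# look like "ga4gh:VA." followed by that digest.
vrs_id = "ga4gh:VA.<allele-digest>"
print(vrs_id)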

Features

  1. VCF File Processing:

    • Streamlines reading and parsing of VCF files to extract relevant genomic information.
  2. GA4GH VRS Identifier Lookup:

    • Utilizes the GA4GH VRS API to perform lookups for each genomic variation mentioned in the VCF file.
    • Retrieves standardized identifiers for the alleles, enhancing interoperability with GA4GH-compliant systems.
    • GA4GH MetaKB Service Integration: Utilizes the GA4GH MetaKB service to retrieve evidence associated with specified genomic alleles.
  3. Output Generation:

    • Generates summary metrics about throughput, errors, evidence, and hits.
    • Presents the retrieved evidence in a structured format, providing access to information about studies, publications, and other relevant details.
    • Creates cohort allele frequency objects
  4. Additional Features

    • Provides configurable options like threading and caching for processing VCFs.
    • Implements robust error handling to address issues like invalid input files, invalid variants, and more.

Getting Started

Prerequisites

  • Python 3.10 or later
  • Internet connectivity for data dependency setup (seqrepo)

Installation

  1. Get the repository either by...

    1. Source code
    git clone https://github.com/ohsu-comp-bio/vrs_anvil_toolkit
    cd vrs_anvil_toolkit
    
    2. PyPI
    pip install vrs_anvil_toolkit
    
  2. Install dependencies either...

    1. for local use
    # install postgresql@14 (required for vrs-python)
    brew install postgresql@14
    bash scripts/setup.sh
    
    2. for use on Terra
    bash terra/setup.sh
    

Usage

General

All usage has the following general steps...

  1. Create a manifest to configure your VCF processing run
  2. Use the vrs_bulk CLI to create a metrics file of related evidence
  3. Use the metrics files for downstream analysis

The following steps are explained in detail below, with some additional info on using vrs-python to directly annotate VCFs with VRS IDs.
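For the last of these steps (downstream analysis), the exact layout of the metrics file comes from the tool's output; purely as an illustration, loading such a file might look like the sketch below, where the file name and field names are hypothetical.

# Hypothetical sketch of downstream analysis on a metrics file. The file name
# and field names are illustrative; use whatever the vrs_bulk run produced.
import yaml

with open("metrics.yaml") as f:   # path produced by the vrs_bulk run
    metrics = yaml.safe_load(f)

# e.g. summarize throughput and error counts across processed VCFs
for vcf_name, stats in metrics.items():
    print(vcf_name, stats.get("total", "?"), "rows,", stats.get("errors", "?"), "errors")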

Manifest

The configuration of each VCF processing run is controlled by a manifest.yaml file. Most importantly, this file specifies the...

  • input VCF file(s) to process
  • working directories
  • performance and strictness configurations

Use this commented sample manifest as a starting point for the specific variables you can set per run.
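Purely for orientation, a sketch of composing such a manifest follows; the key names are hypothetical stand-ins for the categories described above, so defer to the linked sample manifest for the real schema.

# Hypothetical sketch: write a minimal manifest.yaml covering the categories
# described above. Key names are illustrative; see the commented sample
# manifest for the authoritative set.
import yaml

manifest = {
    "vcf_files": ["data/cohort.chr1.vcf.gz"],  # input VCF file(s) to process
    "work_directory": "work/",                 # working directories
    "cache_directory": "cache/",
    "num_threads": 4,                          # performance configuration
    "strict": False,                           # strictness configuration
}

with open("manifest.yaml", "w") as f:
    yaml.safe_dump(manifest, f, sort_keys=False)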

CLI

Below is a list of command-line utilities that may be useful:

# activate the environment
source venv/bin/activate

# run the vrs_bulk command in the foreground
vrs_bulk annotate

# run the vrs_bulk command in parallel, one process per VCF file
vrs_bulk annotate --scatter

# run the vrs_bulk command in parallel in the background
nohup vrs_bulk annotate --scatter & # press enter to continue

# get the status of the processes for the most recent scatter run
vrs_bulk ps

The command line utility supports Google Cloud URIs and running commands in the background to interop with Terra out-of-the-box. This is described in the CLI usage above. For an example notebook, see vrs-anvil-demo.ipynb on the vrs-anvil workspace.

Cohort Allele Frequency Generation

Description

Create a cohort allele frequency object for a given variant, subsettable by participant list, GREGOR-formatted phenotypes, etc.

General Prerequisites

  • Variant ID of interest
  • Path to the VCF file
  • Access to phenotypes table either through Terra (default) or as a local file (structured according to the GREGOR data model)

Use Cases

  1. Given a variant ID and VCF path, get the allele frequency for the entire cohort

    • Get the VCF row corresponding to the variant ID using a variant -> VCF row index
    • Get phenotypes corresponding to each participant using the phenotypes-by-patient table
    • Aggregate counts for participants using their genotypes
    • Create a CAF object using the counts (see the sketch after this list)
  2. Given a variant ID, VCF path, and participant list, get the allele frequency for a subset of participants (subcohort)

    • Same as 1, just subsetted on a participant list
  3. Given a variant ID, VCF path, and phenotype, get the allele frequency for the cohort conditional on the phenotype

    • Same as 1, but only increase the counts for the variant of interest if a given patient has the specified phenotype
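The flow above might look roughly like the sketch below. This is not the toolkit's implementation: pysam is assumed for VCF access, and lookup_vcf_coordinates / load_phenotypes_by_patient are hypothetical helpers standing in for the variant -> VCF row index and the phenotypes-by-patient table.

# Rough sketch of the use cases above, not the toolkit's implementation.
# pysam is assumed for VCF access; lookup_vcf_coordinates and
# load_phenotypes_by_patient are hypothetical helpers for the variant -> VCF
# row index and the phenotypes-by-patient table.
import pysam

def cohort_allele_frequency(variant_id, vcf_path, phenotype_table=None,
                            participant_list=None, phenotype=None):
    chrom, pos, alt_index = lookup_vcf_coordinates(variant_id)           # hypothetical index lookup
    phenotypes_by_patient = load_phenotypes_by_patient(phenotype_table)  # hypothetical table load

    vcf = pysam.VariantFile(vcf_path)
    record = next(vcf.fetch(chrom, pos - 1, pos))  # VCF row for the variant of interest

    focus_count, locus_count = 0, 0  # AC, AN
    for sample, call in record.samples.items():
        if participant_list is not None and sample not in participant_list:
            continue  # use case 2: subset to the given participants
        if phenotype is not None and phenotype not in phenotypes_by_patient.get(sample, ()):
            continue  # use case 3: condition on the phenotype
        alleles = [a for a in (call["GT"] or ()) if a is not None]
        locus_count += len(alleles)
        focus_count += sum(1 for a in alleles if a == alt_index)

    # Field names below are illustrative of a cohort allele frequency object
    return {
        "focusAlleleCount": focus_count,
        "locusAlleleCount": locus_count,
        "alleleFrequency": focus_count / locus_count if locus_count else 0.0,
    }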

Arguments

  • variant_id (String): variant ID of interest (VRS ID)
  • vcf_path (String): path to VCF file
  • phenotype_table (String, optional): where to pull phenotype information from. Defaults to None.
  • participant_list (List of Strings, optional): Subset of participants to use. Defaults to None.
  • phenotype (String, optional): Specific phenotype to subset on. Defaults to None.

Caveats

  • For multiple alleles, the cohort allele frequency returned is based only on the position and not on the state. In other words, all alleles at a given position are handled together.
  • For chromosomes with a ploidy of 1 (mitochondrial calls or sex chromosomes), focus allele counts (AC) and locus allele counts (AN) can have a maximum value of 1. The focus allele count is 1 when the genotype has at least one matching allele (0/1, 1/1, or 1); otherwise it is 0.

Processing VCF Files (vrs-python)

vrs-python is a GA4GH GKS package centered around creating Variation Representation Specification (VRS) IDs: consistent, globally unique identifiers for variation. Some of its functionality includes variant ID translation and VCF annotation. Used as a dependency in vrs_bulk, it can also be used as a standalone package.
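For variant ID translation specifically, a rough sketch is below; module and class names differ between vrs-python releases (this follows an 0.8.x-era API), so treat it as orientation rather than a drop-in snippet.

# Rough sketch of translating an HGVS expression into a VRS Allele and
# computing its VRS ID with vrs-python. Names follow an 0.8.x-era API and may
# differ in other releases; the seqrepo path matches the CLI example below.
import os

from biocommons.seqrepo import SeqRepo
from ga4gh.core import ga4gh_identify
from ga4gh.vrs.dataproxy import SeqRepoDataProxy
from ga4gh.vrs.extras.translator import Translator

data_proxy = SeqRepoDataProxy(SeqRepo(os.path.expanduser("~/seqrepo/latest")))
translator = Translator(data_proxy=data_proxy)

allele = translator.translate_from("NC_000001.11:g.943043C>T", "hgvs")
print(ga4gh_identify(allele))  # e.g. ga4gh:VA.<digest>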

For Python usage, see vrs_vcf_annotator.py for an example.

For CLI usage:

python3 -m ga4gh.vrs.extras.vcf_annotation --vcf_in tests/fixtures/1kGP.chr1.1000.vcf --vcf_out annotated_output.vcf.gz --vrs_pickle_out allele_dicts.pkl --seqrepo_root_dir ~/seqrepo/latest

The above command uses an example VCF. Replace --vcf_out and --vrs_pickle_out with your desired output file paths; the output can be written as a compressed VCF (vcf.gz) or an uncompressed VCF (vcf).

Also, see the VRS Annotator workflow on Dockstore for a way to do this on Terra.

Contributing

This project is open to contributions from the research community. If you are interested in contributing to the project, please contact the project team. See the contributing guide for more information on how to contribute to the project.

License

This project is licensed under the MIT License - see the LICENSE file for details.
