VRS AnVIL Toolkit

Project Overview

This Python package is designed to process Variant Call Format (VCF) files and perform lookup operations on GA4GH Variation Representation Specification (VRS) identifiers. VRS identifiers provide a standardized way to represent genomic variation, making it easier to exchange and share genomic information.
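
For orientation, a VRS identifier is derived from a normalized variation object using GA4GH's truncated SHA-512 digest (sha512t24u). The sketch below shows only the digest step over an arbitrary serialized blob; it is illustrative and not the toolkit's own implementation, which relies on vrs-python's normalization and serialization.

import base64
import hashlib

def sha512t24u(blob: bytes) -> str:
    """GA4GH truncated digest: base64url-encoded first 24 bytes of SHA-512."""
    digest = hashlib.sha512(blob).digest()[:24]
    return base64.urlsafe_b64encode(digest).decode("ascii")

# Illustrative only: real VRS IDs are computed over the canonical
# (normalized, serialized) form of the variation object, not raw text.
example = sha512t24u(b"example-serialized-allele")
print(f"ga4gh:VA.{example}")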

In addition, this project facilitates the retrieval of evidence associated with genomic alleles by leveraging the GA4GH Meta-Knowledgebase (MetaKB) service. MetaKB provides a comprehensive knowledge base that links genomic variants to relevant clinical variant interpretations.
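
As a rough illustration of what an evidence lookup involves, the snippet below queries a MetaKB-style search service by variation identifier. The endpoint URL, parameter name, and response shape are assumptions made for illustration only; the toolkit performs these lookups internally, and the authoritative API is documented by the MetaKB project.

import requests

# Hypothetical endpoint and parameter names -- consult the MetaKB
# documentation for the actual search API; the toolkit handles this for you.
METAKB_SEARCH_URL = "https://example-metakb-host/api/search/studies"

def fetch_evidence(vrs_id: str) -> dict:
    """Query a MetaKB-style service for studies associated with a VRS ID."""
    response = requests.get(METAKB_SEARCH_URL, params={"variation": vrs_id}, timeout=30)
    response.raise_for_status()
    return response.json()

# Example VRS allele identifier (illustrative value only)
evidence = fetch_evidence("ga4gh:VA.exampleDigestValue")
print(list(evidence))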

Features

  1. VCF File Processing:

    • Streamlines reading and parsing of VCF files to extract relevant genomic information.
  2. GA4GH VRS Identifier Lookup:

    • Utilizes the GA4GH VRS API to perform lookups for each genomic variation mentioned in the VCF file.
    • Retrieves standardized identifiers for the alleles, enhancing interoperability with GA4GH-compliant systems.
    • GA4GH MetaKB Service Integration: Utilizes the GA4GH MetaKB to retrieve evidence associated with specified genomic alleles.
  3. Output Generation:

    • Generates summary metrics about throughput, errors, evidence, and hits.
    • Presents the retrieved evidence in a structured format, providing access to information about studies, publications, and other relevant details.
  4. Additional Features:

    • Provides configurable options such as threading and caching for processing VCFs (see the conceptual sketch after this list).
    • Implements robust error handling to address issues such as invalid input files and invalid variants.
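
The threading and caching behavior is configured through the run manifest (see the Manifest section below). The sketch here is only a conceptual illustration of per-VCF parallelism using the standard library, not the toolkit's internal implementation.

from concurrent.futures import ThreadPoolExecutor

# Conceptual sketch only: the toolkit manages its own worker configuration.
# process_vcf is a placeholder for the per-file work (parsing, VRS lookup, etc.).
def process_vcf(path: str) -> dict:
    return {"path": path, "status": "processed"}

vcf_paths = ["chr1.vcf.gz", "chr2.vcf.gz", "chr3.vcf.gz"]

with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(process_vcf, vcf_paths))

for result in results:
    print(result)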

Getting Started

Prerequisites

  • Python 3.10 or later
  • Internet connectivity for data dependency setup (seqrepo)

Installation

  1. Get the package either by...

    1. source code
    git clone https://github.com/ohsu-comp-bio/vrs_anvil_toolkit
    cd vrs_anvil_toolkit

    2. PyPI
    pip install vrs_anvil_toolkit

  2. Install dependencies either...

    1. for local use
    # install postgresql@14 (required for vrs-python)
    brew install postgresql@14
    bash scripts/setup.sh

    2. for use on Terra
    bash terra/setup.sh

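After installation, a quick way to confirm the package is visible in your environment is to check its distribution metadata; the version printed will depend on the release you installed.

from importlib.metadata import version

# Prints the installed distribution version of the toolkit.
print(version("vrs_anvil_toolkit"))
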
Usage

General

All usage follows these general steps...

  1. Create a manifest to configure your VCF processing run
  2. Use the vrs_bulk CLI to create a metrics file of related evidence
  3. Use the metrics files for downstream analysis

These steps are explained in detail below, with some additional information on using vrs-python to annotate VCFs directly with VRS IDs.

Manifest

The configuration of each VCF processing run is controlled by a manifest.yaml file. Most importantly, this file specifies the...

  • input VCF file(s) to process
  • working directories
  • performance and strictness configurations

Use the commented sample manifest as a starting point for the specific variables you can set per run.
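
If you want to inspect or tweak a manifest programmatically, it is an ordinary YAML file. The key names below (input files, working directory, worker count) are hypothetical placeholders; refer to the commented sample manifest for the actual field names.

import yaml  # pip install pyyaml

# Load and inspect a manifest; the key names here are illustrative
# placeholders, not the toolkit's actual schema.
with open("manifest.yaml") as handle:
    manifest = yaml.safe_load(handle)

print(manifest.get("vcf_files"))    # hypothetical: list of input VCF paths
print(manifest.get("work_dir"))     # hypothetical: working directory
print(manifest.get("num_threads"))  # hypothetical: performance setting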

CLI

Below is a list of command line utilities that may be useful:

# activate the environment
source venv/bin/activate

# run the vrs_bulk command in the foreground
vrs_bulk annotate

# run the vrs_bulk command in parallel, one process per VCF file
vrs_bulk annotate --scatter

# run the vrs_bulk command in parallel in the background
nohup vrs_bulk annotate --scatter & # press enter to continue

# get the status of the processes for the most recent scatter run
vrs_bulk ps

The command line utility supports Google Cloud URIs and running commands in the background, enabling out-of-the-box interoperability with Terra, as described in the CLI usage above. For an example notebook, see vrs-anvil-demo.ipynb in the vrs-anvil workspace.

Processing VCF Files (vrs-python)

vrs-python is a GA4GH GKS package centered around creating Variation Representation Specification (VRS) IDs: consistent, globally unique identifiers for variation. Its functionality includes variant ID translation and VCF annotation. Used as a dependency in vrs_bulk, it can also be used as a standalone package.

For Python usage, see vrs_vcf_annotator.py for an example.

For CLI usage:

python3 -m ga4gh.vrs.extras.vcf_annotation --vcf_in tests/fixtures/1kGP.chr1.1000.vcf --vcf_out annotated_output.vcf.gz --vrs_pickle_out allele_dicts.pkl --seqrepo_root_dir ~/seqrepo/latest

The above command uses a sample VCF. Replace the --vcf_out and --vrs_pickle_out arguments with your desired output file paths; the output VCF can be written compressed (vcf.gz) or uncompressed (vcf).
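
The --vrs_pickle_out file is a standard Python pickle, so it can be opened for downstream analysis. The exact structure of its contents is defined by vrs-python, so treat the loop below as a sketch and adjust it to what you find.

import pickle

# Load the pickle written by --vrs_pickle_out and peek at a few entries.
# Assumes a mapping of variant keys to VRS allele data; adjust as needed.
with open("allele_dicts.pkl", "rb") as handle:
    allele_data = pickle.load(handle)

for key, value in list(allele_data.items())[:5]:
    print(key, value)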

Also, see the VRS Annotator workflow on Dockstore for a way to do this on Terra.

Contributing

This project is open to contributions from the research community. If you are interested in contributing, please contact the project team and see the contributing guide for more information.

License

This project is licensed under the MIT License - see the LICENSE file for details.
