
A structural variant caller for long reads.

Project description


SVIM (pronounced SWIM) is a structural variant caller for long reads. It is able to detect, classify and genotype five different classes of structural variants. Unlike existing methods, SVIM integrates information from across the genome to precisely distinguish similar events, such as tandem and interspersed duplications and novel element insertions. In our experiments on simulated data and real datasets from PacBio and Nanopore sequencing machines, SVIM consistently reached better results than competing methods. Furthermore, it is unique in its ability to extract both the genomic origin and destination of duplications.

Background on Structural Variants and Long Reads

https://raw.githubusercontent.com/eldariont/svim/master/docs/SVclasses.png

Structural variants (SVs) are typically defined as genomic variants larger than 50 bp (e.g. deletions, duplications, inversions). Studies have shown that they affect more bases in any given genome than SNPs and small indels combined. Consequently, they have a large impact on genes and regulatory regions. This is reflected in the large number of genetic diseases that are caused by SVs.

Common sequencing technologies by providers such as Illumina generate short reads with high accuracy. However, they exhibit weaknesses in repeat and low-complexity regions. This negatively affects SV detection because SVs are associated with such regions. Single-molecule long-read sequencing technologies from Pacific Biosciences and Oxford Nanopore produce reads with error rates of up to 15% but with lengths of several kilobases. The long read lengths enable them to span entire repeats and SVs, which facilitates SV detection.

Installation

#Install via conda: easiest option, installs all dependencies including read alignment dependencies
conda install --channel bioconda svim

#Install via pip (requires Python 3.6.*): installs all dependencies except those necessary for read alignment (ngmlr, minimap2, samtools)
pip3 install svim

#Install from github (requires Python 3.6.*): installs all dependencies except those necessary for read alignment (ngmlr, minimap2, samtools)
git clone https://github.com/eldariont/svim.git
cd svim
pip3 install .
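The pip and GitHub routes leave out the read-alignment dependencies mentioned above. Assuming a conda setup with the bioconda channel available, they can be added separately:

```shell
# Install the read aligners and samtools that pip does not pull in
# (assumes conda with the bioconda channel configured)
conda install --channel bioconda ngmlr minimap2 samtools
```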

Changelog

  • v1.2.0: add 3 more VCF output options: output sequence instead of symbolic alleles in VCF, output names of supporting reads, output insertion sequences of supporting reads

  • v1.1.0: outputs BNDs in VCF, detects large tandem duplications, allows skipping genotyping, makes VCF output more flexible, adds genotype scatter plot

  • v1.0.0: adds genotyping of deletions, inversions, insertions and interspersed duplications, produces plots of SV length distribution, improves help descriptions

  • v0.5.0: replaces graph-based clustering with hierarchical clustering, modifies scoring function, improves partitioning prior to clustering, improves calling from coordinate-sorted SAM/BAM files, improves VCF output

  • v0.4.4: includes exception message into log files, bug fixes, adds tests and sets up Travis

  • v0.4.3: adds support for coordinate-sorted SAM/BAM files, improves VCF output and increases compatibility with IGV and truvari, bug fixes

Input

SVIM analyzes long reads given as a FASTA/FASTQ file (uncompressed or gzipped) or a file list. Alternatively, it can analyze an alignment file in BAM format. SVIM was tested on both PacBio and Nanopore data. It works best with alignment files produced by NGMLR but also supports the faster read mapper minimap2.
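As a sketch of a typical preprocessing workflow when starting from raw reads, the commands below align Nanopore reads with minimap2 and produce a coordinate-sorted, indexed BAM (all file names are placeholders; PacBio data would use the `map-pb` preset instead of `map-ont`):

```shell
# Align long reads with minimap2 (map-ont preset for Nanopore reads),
# then coordinate-sort and index the alignments with samtools.
# genome.fa and reads.fastq are placeholder file names.
minimap2 -ax map-ont genome.fa reads.fastq > aln.sam
samtools sort -o aln.sorted.bam aln.sam
samtools index aln.sorted.bam
```

The resulting `aln.sorted.bam` can then be passed to SVIM; see `svim --help` for the exact subcommand and arguments of the installed version.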

Output

SVIM distinguishes five different SV classes (see the schema above): deletions, inversions, tandem and interspersed duplications, and novel insertions. Additionally, for each detected interspersed duplication SVIM indicates whether the genomic origin location appears to be deleted in at least one haplotype (indicating a cut-and-paste insertion) or not (indicating a canonical interspersed duplication). For each of these SV classes, SVIM produces a BED file with the SV coordinates. Additionally, a VCF file is produced containing all found SVs.
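The VCF output can be post-processed with standard tooling. As a minimal, self-contained sketch, the snippet below groups calls by their SVTYPE INFO tag and applies a score cutoff; the embedded records are made up for illustration in SVIM's general style, and the exact INFO fields and score placement may differ between versions:

```python
# Minimal sketch: group records of a SVIM-style VCF by SV class.
# The records in vcf_text are invented for illustration; real SVIM
# output may differ in its INFO fields and score placement.
from collections import defaultdict

vcf_text = """\
##fileformat=VCFv4.2
#CHROM\tPOS\tID\tREF\tALT\tQUAL\tFILTER\tINFO
chr1\t1000\tsvim.DEL.1\tN\t<DEL>\t42\tPASS\tSVTYPE=DEL;END=1500
chr1\t2000\tsvim.INS.1\tN\t<INS>\t17\tPASS\tSVTYPE=INS;SVLEN=300
chr2\t5000\tsvim.DUP.1\tN\t<DUP:TANDEM>\t55\tPASS\tSVTYPE=DUP:TANDEM;END=7000
"""

def group_by_svtype(text, min_qual=20):
    """Group VCF records by their SVTYPE INFO tag, keeping only
    calls whose QUAL score is at least min_qual."""
    groups = defaultdict(list)
    for line in text.splitlines():
        if line.startswith("#"):
            continue  # skip meta lines and the header
        fields = line.split("\t")
        qual = float(fields[5])
        info = dict(kv.split("=", 1) for kv in fields[7].split(";") if "=" in kv)
        if qual >= min_qual:
            groups[info["SVTYPE"]].append(fields)
    return groups

groups = group_by_svtype(vcf_text)
print(sorted(groups))  # -> ['DEL', 'DUP:TANDEM']
```

With the default cutoff of 20, the low-scoring insertion is filtered out; lowering `min_qual` keeps it.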

Usage

Please see our wiki.

Contact

If you experience problems or have suggestions, please create an issue or a pull request, or contact heller_d@molgen.mpg.de.

Citation

Feel free to read and cite our paper in Bioinformatics: https://doi.org/10.1093/bioinformatics/btz041

License

The project is licensed under the GNU General Public License.
