Viral metagenomics framework for short and long reads
Project description
A hecatomb is a great sacrifice or an extensive loss. Hecatomb, the software, empowers an analyst to make data-driven decisions to 'sacrifice' false-positive viral reads from metagenomes, enriching for true-positive viral reads. This process frequently results in a great loss of suspected viral sequences and contigs.
Documentation
Complete documentation is hosted at Read the Docs.
Citation
Hecatomb is currently on bioRxiv!
Quick start guide
Install Hecatomb
Option 1: pip
# Optional: create a virtual environment with conda or venv
conda create -n hecatomb python=3.10
# activate
conda activate hecatomb
# Install
pip install hecatomb
Option 2: Conda
# Create the conda env and install hecatomb in one step
conda create -n hecatomb -c conda-forge -c bioconda hecatomb
# activate
conda activate hecatomb
Check installation
hecatomb --help
Install databases and envs
Download the databases
# 8 threads = 8 downloads at a time
hecatomb install --threads 8
Optional: prebuild envs
These are automatically built when running hecatomb, but manually pre-building is useful if your cluster nodes are isolated from the internet.
hecatomb test build_envs
Run test dataset
# locally: using 32 threads and 64 GB RAM by default
hecatomb test --threads 32
# HPC: using a profile named 'slurm'
hecatomb test --profile slurm
Snakemake profiles (for running on HPCs)
Hecatomb is powered by Snakemake and greatly benefits from the use of Snakemake profiles for HPC clusters. More information and examples for setting up Snakemake profiles for Hecatomb are available in the documentation; a minimal sketch is also shown below.
NOTE: Hecatomb currently uses Snakemake version 7. Snakemake version 8 introduced breaking changes, including changes to the command-line interface for cluster execution, so new Snakemake v8 profiles might not work with Hecatomb. Please open an issue if you need help setting up a profile.
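For reference, a minimal Snakemake v7 profile for SLURM might look like the sketch below. This is an illustrative example only: the file location and the sbatch flags are assumptions you should adapt to your own cluster.
# ~/.config/snakemake/slurm/config.yaml (hypothetical location)
# Map Snakemake threads/resources onto sbatch flags
cluster: "sbatch --parsable --cpus-per-task={threads} --mem={resources.mem_mb}"
jobs: 100                         # cap on concurrently queued jobs
use-conda: True                   # let Snakemake manage the conda envs
default-resources: [mem_mb=2000]  # fallback memory request per job
A profile saved like this is invoked exactly as in the HPC example above: hecatomb test --profile slurm.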
Inputs
Parsing samples with --reads
You can pass either a directory of reads or a TSV file to --reads.
Note that Hecatomb expects paired read file names to include common R1/R2 tags.
- Directory: Hecatomb will infer sample names and various R1/2 tag combinations from the filenames.
- TSV file: Hecatomb expects two or three columns, with column 1 being the sample name and columns 2 and 3 being the read file paths (see the example below).
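For illustration, a paired-end TSV (say, samples.tsv; the sample names and paths here are hypothetical) might look like this, with tab-separated columns for sample name, R1 file, and R2 file:
sample1	/data/reads/sample1_R1.fastq.gz	/data/reads/sample1_R2.fastq.gz
sample2	/data/reads/sample2_R1.fastq.gz	/data/reads/sample2_R2.fastq.gz
It would then be passed to Hecatomb with, e.g., hecatomb run --reads samples.tsv.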
More information and examples are available in the documentation.
Library preprocessing with --trim
Hecatomb uses Trimnami for read trimming, which supports many different trimming methods. Current options are fastp (default), prinseq, roundAB, filtlong (long reads), cutadapt (FASTA input), and notrim (skip trimming). See Trimnami's documentation for more information.
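As an illustrative example, switching from the default fastp to filtlong for a directory of long reads might look like the following; the input directory name is a placeholder.
# Hypothetical input directory; --trim selects the Trimnami method
hecatomb run --reads nanopore_fastq/ --trim filtlong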
Configuration
You can configure advanced parameters for Hecatomb.
Copy the default config: hecatomb config.
Edit the config file in your favourite text editor: nano hecatomb.out/hecatomb.config.yaml.
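Put together, a minimal configuration session might look like this (nano is only an example editor):
# Copy the default config into the output directory
hecatomb config
# Adjust advanced parameters before the next run
nano hecatomb.out/hecatomb.config.yaml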
Dependencies
The only dependency you need to get up and running with Hecatomb is conda or the Python package manager pip. Hecatomb relies on conda to ensure portability and easy installation of its dependencies. All of Hecatomb's dependencies are installed at install time or at runtime, so you don't have to worry about a thing!
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: hecatomb-1.3.2.tar.gz
Built Distribution: hecatomb-1.3.2-py3-none-any.whl
File details
Details for the file hecatomb-1.3.2.tar.gz.
File metadata
- Download URL: hecatomb-1.3.2.tar.gz
- Upload date:
- Size: 98.5 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | 1b1f0904ab8467d5ea6be9adfecdfd81a30b7bbd1466bad7c001b22392374fe8
MD5 | f6ac7ff904dcdeefb272a71f6f8d1600
BLAKE2b-256 | d2055afc0f3a18a2ccd8b34f6c087a276f941d0e285833c9b2a7b6d8a62d4c5a
File details
Details for the file hecatomb-1.3.2-py3-none-any.whl.
File metadata
- Download URL: hecatomb-1.3.2-py3-none-any.whl
- Upload date:
- Size: 98.6 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.18
File hashes
Algorithm | Hash digest
---|---
SHA256 | 434055db62d6ec73f3a466fa7a7a3100d0556c57e7f9272730d1423bd31b92c9
MD5 | 24c0d3b86fd977e69ab4a72bb05e8c2e
BLAKE2b-256 | 35f5da7aae250963fd15d6dc9680a056045db375096be2d680f91e42352530ef