
A Snakemake-based pipeline for amplicon processing


eDentity-metabarcoding-pipeline

Overview

eDentity is a Snakemake-based metabarcoding workflow designed for Illumina/AVITI paired-end data. It automates VSEARCH commands to denoise paired-end FASTQ reads and generate Exact Sequence Variants (ESVs). The pipeline is inspired by APSCALE; please cite APSCALE if you use this pipeline.

Installation

Copy the dependencies below into a file (e.g., edentity-env.yaml), then create and activate the environment with:

# for strict channel priority, run: conda config --set channel_priority strict
name: edentity-env
channels:
    - conda-forge
    - bioconda
    - nodefaults
dependencies:
    - snakemake
    - pip
    - cutadapt=4.9
    - biopython=1.84
    - fastp=0.24.0
    - multiqc=1.27.1
    - vsearch=2.28.1
    - pip:
        - edentity

Install:

conda env create -f edentity-env.yaml --name edentity-env  && conda activate edentity-env
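After activating the environment, you can sanity-check that the required executables are on your PATH. The helper below is a hypothetical convenience (not part of eDentity) and uses only the Python standard library:

```python
import shutil

def missing_tools(names):
    """Return the subset of executables that cannot be found on PATH."""
    return [name for name in names if shutil.which(name) is None]

# Command-line tools installed by the environment file above
required = ["cutadapt", "fastp", "vsearch", "multiqc", "edentity"]
print(missing_tools(required))  # an empty list means the environment is ready
```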

Usage

After installation, the pipeline can be run from the command line. Parameters can be provided either directly via command line arguments or through a configuration file.

Using Command Line Arguments

Replace the example parameters with those specific to your project:

edentity --raw_data_dir /path/to/your/raw_fastq_files/ \
    --work_dir /path/to/your/work_directory \
    --forward_primer <FORWARD_PRIMER_SEQUENCE> \
    --reverse_primer <REVERSE_PRIMER_SEQUENCE> \
    --min_length 200 \
    --max_length 600

Using a Configuration File

Create a params_config.yaml file and copy the YAML template below into it. Adjust the parameters to your project specifications:

# project specific
raw_data_dir: "/path/to/your/raw_fastq_files/"
work_dir: "/path/to/your/work_directory"
make_json_reports: False
dataType: "Illumina" # [Illumina, AVITI], one of the two
cpu_cores: 20 

# general quality control (Fastp)
average_qual: 25
length_required: 100
n_base_limit: 0

# PE_merging (these are set to vsearch default values)
maxdiffpct: 100
maxdiffs: 10
minovlen: 10

# primer_trimming (cutadapt)
forward_primer:    # your forward primer sequence
reverse_primer:    # your reverse primer sequence
anchoring: False
discard_untrimmed: True

# quality_filtering (vsearch)
min_length: 100
max_length: 600
maxEE: 1

# dereplication (vsearch)
fasta_width: 0

# denoising (vsearch)
alpha: 2
minsize: 4

Then run the pipeline with:

edentity --config_file params_config.yaml

Parameters:

  • --forward_primer: Forward primer sequence.
  • --reverse_primer: Reverse primer sequence.
  • --raw_data_dir: Directory containing your raw sequencing data.
  • --work_dir: Directory for pipeline outputs and intermediate files.
  • --make_json_reports: Set to true to create extended JSON reports.
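Before launching a run, it can help to confirm that the project-specific keys are present in params_config.yaml. The sketch below scans top-level keys line by line to avoid a third-party YAML parser; check_config is a hypothetical helper (not part of eDentity), and the required-key list is an assumption based on the template above:

```python
def top_level_keys(yaml_text):
    """Collect top-level 'key:' names, ignoring comments and indented lines."""
    keys = set()
    for line in yaml_text.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#") or line[0].isspace():
            continue
        if ":" in stripped:
            keys.add(stripped.split(":", 1)[0].strip())
    return keys

def check_config(yaml_text, required=("raw_data_dir", "work_dir",
                                      "forward_primer", "reverse_primer")):
    """Return required keys that are missing from the config text."""
    return sorted(set(required) - top_level_keys(yaml_text))

example = 'raw_data_dir: "/data/run1/"\nwork_dir: "/data/run1_out"\n'
print(check_config(example))  # ['forward_primer', 'reverse_primer']
```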

Configuring Snakemake Parameters via Profile

You can control Snakemake-specific parameters (such as the executor, resource limits, and rerun behavior) using a profile YAML configuration. This is useful for running the pipeline on HPC clusters or customizing workflow execution.

Create a snakemake-profile.yaml file with content like:

executor: local  # or a cluster executor, e.g. slurm, lsf, aws-batch; see the Snakemake documentation
jobs: 30
max-jobs-per-second: 10
max-status-checks-per-second: 10
local-cores: 44
latency-wait: 30
printshellcmds: true
rerun-incomplete: false
keep-incomplete: true
conda-cleanup-envs: false
dryrun: false  # set to true to preview jobs without executing them
resources:
    mem_mb: 16000
    threads: 8

  • executor: Cluster scheduler (e.g., SLURM).
  • jobs: Maximum number of parallel jobs.
  • resources: Default resource limits for jobs.
  • dryrun: Set to true to perform a dry-run (no jobs will be executed).

For more details on these and other Snakemake parameters, see the Snakemake documentation.

To use this profile, run:

edentity --profile snakemake-profile.yaml --config_file params_config.yaml

You can combine this with your pipeline configuration file for full control over both workflow and execution parameters.

For a full list of options, run:

edentity --help

Pipeline Output Directory Structure

After successful execution, the pipeline writes a structured set of output directories and files to your specified work_dir. All file names are prefixed with the work_dir name. The main components are:

work_dir/
├── Results/
│   ├── ESVs_fasta/                   # Directory containing FASTA file of ESVs
│   └── reports/                      # Reports generated by the pipeline
│       ├── ESV_table.tsv             # Table of Exact Sequence Variants (ESVs)
│       ├── summary_report.tsv        # Summary statistics for the run
│       ├── metabarcoding_run.json    # JSON report with run metadata and parameters
│       └── multiqc_report/           # Directory containing MultiQC output
│           └── multiqc.html          # Interactive MultiQC report
├── logs/                             # Log files for each step of the pipeline
└── edentity_pipeline_settings/       # Configuration files used for the pipeline run
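The ESV table can be inspected with any TSV-aware tool; as a quick illustration, the stdlib csv module is enough. This sketch assumes ESV_table.tsv is tab-separated with a header row (the exact columns depend on your run), and the path shown is only an example:

```python
import csv

def preview_tsv(path, n=5):
    """Return the header row and the first n data rows of a TSV file."""
    with open(path, newline="") as handle:
        reader = csv.reader(handle, delimiter="\t")
        header = next(reader)
        rows = [row for _, row in zip(range(n), reader)]
    return header, rows

# Example path; remember that file names are prefixed with your work_dir name
# header, rows = preview_tsv("work_dir/Results/reports/ESV_table.tsv")
```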
