TAXPASTA

TAXonomic Profile Aggregation and STAndardisation

About

The main purpose of taxpasta is to standardise taxonomic profiles created by a range of bioinformatics tools, which we call taxonomic profilers. Each profiler comes with its own particular tabular output format: across profilers, relative abundances can be reported as read counts, fractions, or percentages, and profiles may carry any number of additional columns with extra information. We therefore decided to take the lessons learnt to heart and provide our own solution to deal with this pasticcio. With taxpasta you can ingest all of those formats and, at a minimum, output taxonomy identifiers and their integer counts. Taxpasta can not only standardise profiles but also merge profiles from the same profiler across samples into a single table.

Supported Taxonomic Profilers

Taxpasta currently supports standardisation and generation of comparable taxonomic tables for a range of taxonomic profilers; see supported profilers for the full list.

Install

It's as simple as:

pip install taxpasta

Taxpasta is also available from the Bioconda channel

conda install -c bioconda taxpasta

and thus automatically generated Docker and Singularity images also exist as BioContainers.
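
If you prefer containers, the corresponding image can be pulled from the BioContainers registry. The path below follows the usual BioContainers naming convention and <tag> is a placeholder, so treat this as a sketch and look up the exact version tag on the registry first.

# quay.io/biocontainers/<package> is the standard BioContainers location;
# replace <tag> with a concrete release tag from the registry
docker pull quay.io/biocontainers/taxpasta:<tag>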

Optional Dependencies

Taxpasta supports a number of extras that you can install for additional features, primarily support for additional output file formats. You can install them by specifying a comma-separated list within square brackets, for example (a further example follows the list below),

pip install 'taxpasta[rich,biom]'
  • rich provides rich-formatted command line output and logging.
  • arrow supports writing output tables in Apache Arrow format.
  • parquet supports writing output tables in Apache Parquet format.
  • biom supports writing output tables in BIOM format.
  • ods supports writing output tables in ODS format.
  • xlsx supports writing output tables in Microsoft Excel format.
  • all includes all of the above.
  • dev provides all tools needed for contributing to taxpasta.
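
For example, to pull in every optional output format in one go, you could install the all extra listed above:

pip install 'taxpasta[all]'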

Usage

The main entry point for taxpasta is its command-line interface (CLI). You can interactively explore the offered commands through the help system.

taxpasta -h

Taxpasta currently offers two commands corresponding to the main use cases. You can find out more in the commands' documentation.
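
Each of the two commands also has its own help page. Assuming the subcommands accept the same help flag as the top-level CLI, you can view them with:

taxpasta standardise --help
taxpasta merge --help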

Standardise

Since the supported profilers all produce their own flavour of tabular output, a quick way to normalise such files is to standardise them with taxpasta. You need to let taxpasta know which tool the file was created by. As an example, let's standardise a MetaPhlAn profile. (You can find an example file in our test data.)

curl -O https://raw.githubusercontent.com/taxprofiler/taxpasta/main/tests/data/metaphlan/MOCK_002_Illumina_Hiseq_3000_se_metaphlan3-db.metaphlan3_profile.txt
taxpasta standardise -p metaphlan -o standardised.tsv MOCK_002_Illumina_Hiseq_3000_se_metaphlan3-db.metaphlan3_profile.txt

With these minimal arguments, taxpasta produces a two-column output consisting of

taxonomy_id count

You can count on the second column being integers :wink:. Having such a simple and tidy table should make starting your downstream analysis much smoother. Please have a look at the full getting started tutorial for a more thorough introduction.
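
As a quick sanity check, you can peek at the standardised table with standard shell tools; this is just a sketch that assumes the standardised.tsv file written by the command above.

# show the first few rows of the two-column table, aligned for readability
head -n 5 standardised.tsv | column -t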

Merge

Converting single tables is nice, but hopefully you have many shiny samples to analyse. The taxpasta merge command works similarly to standardise, except that you provide multiple profiles as input. You can grab a few more 'MOCK' examples from our test data and try it out.

LOCATION=https://raw.githubusercontent.com/taxprofiler/taxpasta/main/tests/data/metaphlan
curl -O "${LOCATION}/MOCK_001_Illumina_Hiseq_3000_se_metaphlan3-db.metaphlan3_profile.txt"
curl -O "${LOCATION}/MOCK_002_Illumina_Hiseq_3000_se_metaphlan3-db.metaphlan3_profile.txt"
curl -O "${LOCATION}/MOCK_003_Illumina_Hiseq_3000_se_metaphlan3-db.metaphlan3_profile.txt"

taxpasta merge -p metaphlan -o merged.tsv MOCK_*.metaphlan3_profile.txt

The output of the merge command has one column for the taxonomic identifier and one more column for each input profile. Again, have a look at the full getting started tutorial for a more thorough introduction.
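
To confirm that the merged table really contains one count column per input profile next to the taxonomic identifier, here is a quick sketch using the merged.tsv written above:

# print only the header line to list the taxonomy identifier column and one column per sample
head -n 1 merged.tsv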

Copyright

  • Copyright © 2022, 2023, Moritz E. Beber, Maxime Borry, James A. Fellows Yates, and Sofia Stamouli.
  • Free software distributed under the Apache Software License 2.0.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

taxpasta-0.4.0.tar.gz (39.8 kB)


Built Distribution

taxpasta-0.4.0-py3-none-any.whl (127.8 kB)


File details

Details for the file taxpasta-0.4.0.tar.gz.

File metadata

  • Download URL: taxpasta-0.4.0.tar.gz
  • Upload date:
  • Size: 39.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.24.1

File hashes

Hashes for taxpasta-0.4.0.tar.gz
Algorithm Hash digest
SHA256 8a0c95478845c2a2c041083620c17b99dd3b085b8677f7dae3341b5bd6d8f2bf
MD5 ea3b38f37729971ec5374419cf65a7f4
BLAKE2b-256 c9390c6c644a82a71add5677978bce52c5ff7020e0c043f4aed385842436ae9c


File details

Details for the file taxpasta-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: taxpasta-0.4.0-py3-none-any.whl
  • Upload date:
  • Size: 127.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.24.1

File hashes

Hashes for taxpasta-0.4.0-py3-none-any.whl
Algorithm Hash digest
SHA256 edadc82038e2619c17dbd3668e75947b9c280351ca432ba4c5130e4a9119d93c
MD5 78217ea3325e04ceb212c236e5cdc40f
BLAKE2b-256 8c21cc11829275a26da45d4c96202024354d5e0a7eb4772fe390e6725169872d

