
Lossless evolutionary-aware multiple sequence alignment compressor



Docs · Report Bug · Request Feature



Evolution-informed lossless compression of multiple-sequence alignments (MSAs).


Installation

From PyPI (recommended for users):

# create and activate a virtual environment
python -m venv venv
source venv/bin/activate

# install ecomp
pip install ecomp

CLI Quickstart

All commands are exposed through the ecomp entry point.

# Compress an alignment (produces example.ecomp, optional JSON sidecar)
ecomp zip example.fasta --metadata example.json

# Decompress (writes FASTA by default)
ecomp unzip example.ecomp --alignment-output restored.fasta

# Inspect metadata (summary or JSON)
ecomp inspect example.ecomp --summary

# Diagnostics (PhyKIT-style aliases in parentheses)
ecomp consensus_sequence example.ecomp             # con_seq
ecomp column_base_counts example.ecomp             # col_counts
ecomp gap_fraction example.ecomp                   # gap_frac
ecomp shannon_entropy example.ecomp                # entropy
ecomp parsimony_informative_sites example.ecomp    # parsimony
ecomp constant_columns example.ecomp               # const_cols
ecomp pairwise_identity example.ecomp              # pid
ecomp alignment_length_excluding_gaps example.ecomp    # len_no_gaps
ecomp alignment_length example.ecomp                   # len_total
ecomp variable_sites example.ecomp                     # var_sites
ecomp percentage_identity example.ecomp                # pct_id
ecomp relative_composition_variability example.ecomp   # rcv
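For intuition about what two of these diagnostics report, here is a minimal pure-Python sketch of per-column gap fraction and Shannon entropy. It assumes log base 2 and `-` as the gap character; check the ecomp docs for the exact conventions it uses.

```python
import math
from collections import Counter

def column_stats(sequences, gap_char="-"):
    """Per-column (gap_fraction, shannon_entropy) for an alignment of equal-length rows."""
    n_rows = len(sequences)
    stats = []
    for column in zip(*sequences):  # iterate over alignment columns
        counts = Counter(c for c in column if c != gap_char)
        total = sum(counts.values())  # non-gap residues in this column
        gap_fraction = (n_rows - total) / n_rows
        if total:
            entropy = -sum((k / total) * math.log2(k / total) for k in counts.values())
        else:
            entropy = 0.0  # all-gap column
        stats.append((gap_fraction, entropy))
    return stats

stats = column_stats(["ACG-", "ACGT", "AGGT"])
# column 1 is constant (gap fraction 0, entropy 0); column 4 has one gap
```

Constant columns score zero entropy, and fully gapped columns are reported with a gap fraction of 1.0.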

Benchmarks mirror standard codec comparisons:

/usr/bin/time -p ecomp zip data/fixtures/small_phylo.fasta --output out.ecomp
/usr/bin/time -p gzip  -k data/fixtures/small_phylo.fasta
/usr/bin/time -p bzip2 -k data/fixtures/small_phylo.fasta

Python API

Everything the CLI does is re-exported from the top-level ecomp package.

from ecomp import ezip, eunzip, read_alignment, percentage_identity, column_base_counts

# File-based workflow
archive_path, metadata_path = ezip(
    "data/example.fasta",
    metadata_path="data/example.json",  # optional JSON copy
)
restored_path = eunzip(archive_path, output_path="data/restored.fasta")

# Diagnostics on an AlignmentFrame
frame = read_alignment("data/example.fasta")
pct_identity = percentage_identity(frame)
base_counts = column_base_counts(frame)

print(f"Mean pairwise identity: {pct_identity:.2f}%")
print("Column 1 counts:", base_counts[0])

In-memory usage (no intermediate files):

from ecomp import AlignmentFrame, compress_alignment, decompress_alignment

frame = AlignmentFrame(
    ids=["s1", "s2"],
    sequences=["ACGT", "ACGA"],
    alphabet=["A", "C", "G", "T"],
)
compressed = compress_alignment(frame)
restored = decompress_alignment(compressed.payload, compressed.metadata)
assert restored.sequences == frame.sequences

Available functions

Compression & I/O: ezip, eunzip, compress_file, decompress_file, compress_alignment, decompress_alignment, read_alignment, write_alignment, alignment_from_sequences, alignment_checksum

Diagnostics & metrics: column_base_counts, column_gap_fraction, column_shannon_entropy, parsimony_informative_columns, parsimony_informative_site_count, constant_columns, majority_rule_consensus, alignment_length, alignment_length_excluding_gaps, variable_site_count, percentage_identity, relative_composition_variability, pairwise_identity_matrix

Supporting types: AlignmentFrame, CompressedAlignment, PairwiseIdentityResult, __version__


Development

make test.fast        # unit + non-slow integration tests
make test             # full test matrix
make lint             # lint checks (ruff, black, isort)
make format           # auto-formatting
mypy ecomp            # optional type checking

Build docs locally:

make docs
open docs/_build/html/index.html

Build and publish distributions:

pip install build twine
python -m build
python -m twine check dist/*
python -m twine upload dist/*

Benchmarking eComp vs. PhyKIT

python scripts/benchmark_metrics.py data/example.ecomp \
    --operations consensus shannon_entropy variable_sites \
    --repeat 5 --warmup 1 --json results.json --csv results.csv

The script runs each metric via the ecomp CLI (on the compressed archive) and the corresponding phykit command on a decompressed alignment, then reports average and best runtimes. Add --json/--csv to emit machine-readable output.
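The warmup/repeat averaging the script performs can be sketched with time.perf_counter. The names below are illustrative, not the script's actual internals:

```python
import time

def benchmark(fn, repeat=5, warmup=1):
    """Run fn `warmup` times unrecorded, then `repeat` timed runs; return (avg, best) seconds."""
    for _ in range(warmup):
        fn()  # warm caches, imports, etc. without recording
    times = []
    for _ in range(repeat):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    return sum(times) / len(times), min(times)

avg, best = benchmark(lambda: sum(range(10_000)))
```

Reporting both the average and the best run is a common benchmarking convention: the best run approximates the noise-free cost, while the average reflects typical variance.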


License

eComp is released under the MIT License. See LICENSE.


