
Phynteny: Synteny-based prediction of bacteriophage genes



Phynteny-Transformer


Phynteny is an annotation tool for bacteriophage genomes that integrates protein language models and gene synteny. phynteny-transformer leverages a transformer architecture with attention mechanisms and long short-term memory (LSTM) layers to capture the positional information of genes.
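The core idea of an attention mechanism can be sketched in a few lines. The code below is not Phynteny's actual model, just a minimal, dependency-free illustration of scaled dot-product attention, the operation through which a transformer lets each gene's representation draw on every other gene in the genome:

```python
# Minimal sketch of scaled dot-product attention over a sequence of gene
# embeddings. NOT Phynteny's real model; it only illustrates how attention
# mixes positional context across a genome.
import math

def attention(queries, keys, values):
    """queries/keys/values: lists of equal-length float vectors."""
    d = len(keys[0])
    out = []
    for q in queries:
        # similarity of this gene's query to every gene's key
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        # softmax turns scores into attention weights that sum to 1
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # each output is a weighted mix of all genes' value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out
```

A query that closely matches one key receives almost all of the attention weight, so the output for that gene is dominated by the matching gene's value vector.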

phynteny-transformer takes a GenBank file with PHROG annotations as input. If you haven't already annotated your phage(s) with Pharokka and Phold, go do that and then come right back here!

Dependencies

To run phynteny-transformer, you need the following dependencies:

  • Python 3.9+
  • torch
  • numpy
  • pandas
  • click
  • loguru
  • BioPython
  • transformers
  • importlib_resources
  • scikit-learn
  • tqdm

These dependencies are installed automatically when you install phynteny-transformer with conda or pip (see below).

Installation

Option 1: Installing Phynteny using conda

You can install Phynteny from Bioconda at https://anaconda.org/bioconda/phynteny. Make sure you have conda installed.

# create a conda environment and install phynteny-transformer into it
conda create -n phynteny_transformer -c bioconda phynteny_transformer

# activate the environment
conda activate phynteny_transformer

# alternatively, install into an already-activated environment
conda install -c bioconda phynteny_transformer

Now you can go to Install Models to install pre-trained phynteny-transformer models.

Option 2: Installing Phynteny using pip

You can install Phynteny from PyPI at https://pypi.org/project/phynteny/.

pip install phynteny_transformer

Now you can go to Install Models to install pre-trained phynteny models.

Option 3: Installing Phynteny from source

You can install Phynteny Transformer from source.

git clone https://github.com/susiegriggo/Phynteny_transformer 
cd Phynteny_transformer 
pip install . 

NOTE: Source installation is recommended if you would like to train your own phynteny-transformer models.

Install Models

Before you can run phynteny-transformer, you'll need to install the pre-trained models:

install_models

If you would like to install them to a specific location:

install_models -o <path/to/database_dir>

If this doesn't work, you can download the models directly from Zenodo, untar them yourself, and point Phynteny to them with the -m flag.

Quick Start

phynteny_transformer test_data/test_phage.gbk -o test_output

Output

  • phynteny_transformer.gbk is a GenBank-format file updated to include the annotations generated by Phynteny, along with their Phynteny score and confidence.
  • phynteny_per_cds_functions.tsv provides a table of the annotations generated (similar to pharokka_cds_functions.tsv from Pharokka).
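The output GenBank file can be post-processed with a short script. Below is a minimal, dependency-free sketch that assumes the new annotations appear as /function and /phynteny_score qualifiers on each CDS; check your own phynteny_transformer.gbk for the exact qualifier keys, and prefer Biopython's SeqIO for anything beyond a quick look:

```python
# Sketch: pull annotations out of a Phynteny output GenBank file.
# The qualifier names ("function", "phynteny_score") are assumptions;
# inspect phynteny_transformer.gbk for the keys Phynteny actually writes.
import re

def phynteny_annotations(genbank_text):
    """Return {locus_tag: (function, score)} for each CDS feature."""
    results = {}
    # GenBank feature keys start at column 5; qualifiers are indented further,
    # so splitting on "\n" + exactly five spaces isolates each feature block.
    for block in re.split(r"\n     (?=\S)", genbank_text):
        if not block.lstrip().startswith("CDS"):
            continue
        quals = dict(re.findall(r'/(\w+)="([^"]*)"', block))
        if "locus_tag" in quals:
            results[quals["locus_tag"]] = (
                quals.get("function", "unknown"),
                quals.get("phynteny_score", ""),
            )
    return results
```

This is handy for pulling the table of new annotations into a spreadsheet or downstream pipeline, though the bundled phynteny_per_cds_functions.tsv already covers the common case.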

Brief Overview


Advanced Usage

Phynteny Transformer provides an advanced mode for specifying the parameters of a model that you trained yourself. To see all advanced options:

phynteny_transformer --help --advanced

Training Custom Models

Phynteny Transformer allows you to train your own custom models. To train a model, you need to provide a dataset in the required format and specify the training parameters. For more details, refer to the documentation in the train_transformer directory.

Bugs and Suggestions

If you break Phynteny or would like to make any suggestions please open an issue or email me at susie.grigson@gmail.com and I'll try to get back to you.

Acknowledgements

Thank you to Laura Inglis for designing the Phynteny logo!

Phynteny was trained using resources provided by the Pawsey Supercomputing Research Centre (Perth, Australia), which is funded by the Australian Government. Analysis was performed using the Flinders University DeepThought High Performance Cluster (https://doi.org/10.25957/FLINDERS.HPC.DEEPTHOUGHT).

Wow! How can I cite this?

A preprint for Phynteny is available on bioRxiv. You can cite Phynteny as:
Grigson, S.R., Bouras, G., Papudeshi, B., Mallawaarachchi, V., Roach, M.J., Decewicz, P., & Edwards, R.A. (2025). Synteny-aware functional annotation of bacteriophage genomes with Phynteny. bioRxiv, 2025.07.28.667340. https://doi.org/10.1101/2025.07.28.667340.
