Comprehensive Python Module for Protein Data Management: Designed for streamlined integration and processing of protein information from both UniProt and PDB. Equipped with features for concurrent data fetching, robust error handling, and database synchronization.

Project description

Exploration of “Metamorphism” and “Multi-functionality” in Proteins

💡 This study explores metamorphism and multifunctionality in proteins, two phenomena that are fundamental to understanding protein evolution and function across biological contexts. We begin with a large-scale search for protein sequences that share high percentages of identity, indicative of functional conservation across species. We then identify structures that, in addition to meeting this high-identity criterion, exhibit significant differences in their spatial configuration, suggesting possible structural metamorphism.
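As a minimal illustration of the identity criterion described above, the following sketch computes percent identity over the aligned, gap-free columns of two sequences. The function and sequences are toy examples, not the project's actual large-scale search implementation:

```python
# Sketch: percent identity between two pre-aligned protein sequences.
# Illustrative only; the project's real search operates at database scale.

def percent_identity(aligned_a: str, aligned_b: str) -> float:
    """Fraction of identical residues over aligned, non-gap columns."""
    if len(aligned_a) != len(aligned_b):
        raise ValueError("sequences must be aligned to equal length")
    columns = [(a, b) for a, b in zip(aligned_a, aligned_b)
               if a != "-" and b != "-"]
    if not columns:
        return 0.0
    matches = sum(1 for a, b in columns if a == b)
    return 100.0 * matches / len(columns)

print(percent_identity("MKV-LST", "MKVALST"))  # gap column skipped -> 100.0
print(percent_identity("MKVALST", "MKVGLST"))  # 6 of 7 columns match
```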

The main objective is to develop a comprehensive dataset of varied structural conformations, providing a solid basis for comparative and evolutionary structural analysis. We implement an initial clustering step using the CD-HIT algorithm, which groups sequences by similarity. The resulting groups are then re-clustered using a structural embedding generation model based on the ID3 alphabet. This second clustering focuses on finding, within each homologous set of proteins, the members with the greatest structural differences, enabling a detailed analysis of their three-dimensional divergence.
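The second stage above can be sketched as follows: within a sequence cluster, rank member pairs by the distance between their structural embeddings and surface the most divergent pair. The embeddings here are toy vectors; in the project they come from the structural-alphabet model:

```python
# Sketch: find the most structurally divergent pair inside one
# sequence cluster, using distances between (toy) structural embeddings.
import math

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def most_divergent_pair(embeddings: dict):
    """Return the pair of cluster members with the largest embedding distance."""
    ids = sorted(embeddings)
    pairs = ((a, b) for i, a in enumerate(ids) for b in ids[i + 1:])
    return max(pairs, key=lambda p: euclidean(embeddings[p[0]], embeddings[p[1]]))

# Hypothetical cluster of three homologous proteins:
cluster = {"P1": [0.0, 0.1], "P2": [0.0, 0.2], "P3": [1.0, 0.9]}
print(most_divergent_pair(cluster))  # -> ('P1', 'P3')
```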

We have built an environment for measuring inter-subcluster structural distances using leading structural alignment and distance algorithms such as FATCAT, US-align, and CE-Align. Large distances in these measures are indicative of metamorphism. Concurrently, we employ similarity analysis over Gene Ontology (GO) annotations to discover proteins that, in addition to their structural conservation, exhibit multifunctionality. This includes a per-protein analysis of semantic distances to identify disparate terms indicating multifunctionality.
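The thresholding step described above can be sketched as follows. The scores stand in for alignment-derived similarities (e.g. TM-score-like values in [0, 1]); the cutoff and protein identifiers are illustrative, not the project's actual parameters:

```python
# Sketch: flag candidate metamorphic pairs from pairwise structural
# similarity scores. High sequence identity is assumed to have been
# enforced upstream; the 0.5 cutoff is purely illustrative.

def flag_metamorphic(pair_scores: dict, max_similarity: float = 0.5):
    """Pairs whose structural similarity falls below the cutoff are
    candidates for metamorphism."""
    return sorted(pair for pair, score in pair_scores.items()
                  if score < max_similarity)

scores = {("P1", "P2"): 0.92, ("P1", "P3"): 0.31, ("P2", "P3"): 0.44}
print(flag_metamorphic(scores))  # -> [('P1', 'P3'), ('P2', 'P3')]
```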

📈 Current State of the Project

FANTASIA Redesign

🔄 FANTASIA has been completely redesigned and is now available at:
FANTASIA Repository
This new version is a pipeline for annotating GO (Gene Ontology) terms in protein sequence files (FASTAs). The redesign focuses on long-term support, updated dependencies, and improved integration with High-Performance Computing (HPC) environments.

Stable Version of the Information System

🛠️ A stable version of the information system for working with UniProt and annotation transfer is available at:
Zenodo Stable Release
This version serves as a reference implementation and provides a consistent environment for annotation transfer tasks.

Prerequisites

  • Python 3.11.6
  • RabbitMQ
  • PostgreSQL with the pgvector extension

Setup Instructions

1. Install Docker

Ensure Docker is installed on your system. If it is not, you can download it from the official Docker website.

2. Set Up PostgreSQL with pgvector

Run the following command to start a PostgreSQL container with the pgvector extension:

docker run -d --name pgvectorsql \
    -e POSTGRES_USER=usuario \
    -e POSTGRES_PASSWORD=clave \
    -e POSTGRES_DB=BioData \
    -p 5432:5432 \
    pgvector/pgvector:pg16

Once the container is running, connect to the database and enable the vector extension:

docker exec -it pgvectorsql psql -U usuario -d BioData -c "CREATE EXTENSION IF NOT EXISTS vector;"
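Once the extension is enabled, embeddings can be stored in a vector column and queried by distance. The sketch below shows what a pgvector nearest-neighbour query looks like, with the cosine distance (pgvector's `<=>` operator) reproduced in pure Python for illustration; the table and column names are hypothetical:

```python
# Sketch: pgvector-style similarity search. The SQL targets a
# hypothetical "embeddings" table; cosine_distance mirrors what
# pgvector's `<=>` operator computes server-side.
import math

def cosine_distance(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

# Hypothetical 10-nearest-neighbour query, to be executed with a
# PostgreSQL client such as psycopg, passing the query vector as a parameter:
query = """
SELECT id, embedding <=> %s AS distance
FROM embeddings
ORDER BY distance
LIMIT 10;
"""

print(cosine_distance([1.0, 0.0], [0.0, 1.0]))  # orthogonal -> 1.0
print(cosine_distance([1.0, 0.0], [2.0, 0.0]))  # parallel   -> 0.0
```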

3. (Optional) Connect to the Database

You can use pgAdmin 4, a graphical interface for managing and interacting with PostgreSQL databases, or any other SQL client.

4. Set Up RabbitMQ

Start a RabbitMQ container using the command below:

docker run -d --name rabbitmq \
    -p 15672:15672 \
    -p 5672:5672 \
    rabbitmq:management

5. (Optional) Manage RabbitMQ

Once RabbitMQ is running, you can access its management interface at http://localhost:15672 (default credentials: guest / guest).
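With the broker running, workers exchange JSON task messages over queues. The sketch below shows the publishing pattern; the channel is a stub so the example runs without a broker, but a real `pika.BlockingConnection` channel exposes the same `basic_publish` call. The queue name and payload shape are illustrative, not the project's actual message schema:

```python
# Sketch: publishing a JSON task message to a queue. StubChannel stands
# in for a pika channel so the example is runnable without RabbitMQ.
import json

def publish_task(channel, queue: str, payload: dict) -> None:
    """Serialize a task as JSON and publish it to the given queue
    via the default exchange."""
    channel.basic_publish(exchange="",
                          routing_key=queue,
                          body=json.dumps(payload).encode())

class StubChannel:
    """Records published messages instead of talking to RabbitMQ."""
    def __init__(self):
        self.published = []
    def basic_publish(self, exchange, routing_key, body):
        self.published.append((exchange, routing_key, body))

channel = StubChannel()
publish_task(channel, "uniprot_fetch", {"accession": "P69905"})
print(channel.published[0][1])  # -> uniprot_fetch
```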


Getting Started

To execute the full process chain, simply run:

python main.py

This command will trigger the complete workflow, starting from the initial data preprocessing stages and continuing through to the final analysis and output generation.

Customizing the Workflow

You can customize the sequence of tasks executed by modifying main.py or adjusting the relevant parameters in the config.yaml file. This allows you to tailor the process flow to meet specific research needs or to experiment with different methodologies.
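As a purely hypothetical illustration of the kind of parameters such a file might hold (these keys are invented for the example and are not taken from the project's actual config.yaml):

```yaml
# Hypothetical configuration sketch -- key names are illustrative only.
database:
  host: localhost
  port: 5432
  name: BioData
rabbitmq:
  host: localhost
  port: 5672
tasks:
  - fetch_uniprot
  - cluster_cdhit
  - structural_alignment
```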

Download files

Download the file for your platform.

Source Distribution

protein_metamorphisms_is-2.2.0.tar.gz (970.7 kB)

Uploaded Source

Built Distribution

protein_metamorphisms_is-2.2.0-py3-none-any.whl (1.0 MB)

Uploaded Python 3

File details

Details for the file protein_metamorphisms_is-2.2.0.tar.gz.

File metadata

  • Download URL: protein_metamorphisms_is-2.2.0.tar.gz
  • Size: 970.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.0 CPython/3.10.16 Linux/6.5.0-1025-azure

File hashes

Hashes for protein_metamorphisms_is-2.2.0.tar.gz

  • SHA256: 199d0a82e953f7a7b2cda7ece19de103c6ec99f45997acbe6b968c5d7d4ecf0d
  • MD5: 122db6c64eb13272b73d632fb780423c
  • BLAKE2b-256: 449c631de66b760bffb6226818a128c59bede8bba4a67a169b90141140590eaa

File details

Details for the file protein_metamorphisms_is-2.2.0-py3-none-any.whl.

File hashes

Hashes for protein_metamorphisms_is-2.2.0-py3-none-any.whl

  • SHA256: 33048ba3f39beec0f4179e66689760250b339651feba757c4153540edad0cb27
  • MD5: 46c95e5fcd8f9a65b1e745fbf2490925
  • BLAKE2b-256: 7717a50e4fa9f005c5a39cf5b754eb417d930392740c99e3187b6aefe5fdbce6
