ProDock — automation utilities for molecular docking workflows.

Project description

ProDock

Automatic pipeline for molecular modeling


Overview

ProDock is a toolkit for building automated molecular docking workflows. It is designed for campaigns involving multiple receptors, multiple ligands, and multiple docking engines, with support for downstream pose extraction, interaction profiling, visualization, and SQLite-backed result management. See the project documentation for more details.

The project aims to provide one consistent workflow for:

  • receptor and ligand preparation
  • batch docking across one or more engines
  • pose extraction into standardized tables
  • interaction analysis from docked complexes
  • database storage for poses, scores, and interactions
  • reproducible downstream analysis

This makes ProDock useful both for small docking experiments and for larger benchmark-style or screening-style studies.

Main capabilities

Docking workflow

ProDock automates docking workflows across multiple receptor-ligand combinations; campaigns can be organized as single-target, multi-ligand, or multi-receptor studies.

Typical use cases include:

  • one receptor with many ligands
  • many receptors with one ligand set
  • many receptors with many ligands
  • comparison of multiple docking engines on the same campaign
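
The campaign shapes above all reduce to enumerating receptor x ligand x engine combinations. A minimal sketch of that enumeration, using made-up receptor, ligand, and engine names (this is illustrative, not ProDock's actual API):

```python
from itertools import product

# Hypothetical campaign inputs; the names are placeholders.
receptors = ["rec_A", "rec_B"]
ligands = ["lig_001", "lig_002", "lig_003"]
engines = ["vina", "smina"]

# Each docking job is one (receptor, ligand, engine) combination.
jobs = [
    {"receptor_id": r, "ligand_id": l, "engine": e}
    for r, l, e in product(receptors, ligands, engines)
]

print(len(jobs))  # 2 receptors x 3 ligands x 2 engines = 12 jobs
```

A single-target, multi-ligand campaign is simply the special case where `receptors` has one entry; engine comparison keeps the same job list and groups results by the `engine` field afterward.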

Post-processing

After docking, ProDock can parse docking outputs and convert them into structured pose tables with canonical columns such as:

  • receptor_id
  • ligand_id
  • engine
  • pose_rank
  • affinity
  • mol
  • pose_id

These standardized tables make it easier to compare poses across engines and campaigns.
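
As a sketch of how records with these canonical columns enable cross-engine comparison (the values and the `pose_id` format are made up, not ProDock's actual parser output):

```python
# Illustrative pose records using the canonical columns listed above.
poses = [
    {"receptor_id": "rec_A", "ligand_id": "lig_001", "engine": "vina",
     "pose_rank": 1, "affinity": -9.2, "mol": "<rdkit Mol>",
     "pose_id": "rec_A|lig_001|vina|1"},
    {"receptor_id": "rec_A", "ligand_id": "lig_001", "engine": "smina",
     "pose_rank": 1, "affinity": -8.7, "mol": "<rdkit Mol>",
     "pose_id": "rec_A|lig_001|smina|1"},
]

# Compare engines on the same receptor-ligand pair by best (lowest) affinity.
best = min(poses, key=lambda p: p["affinity"])
print(best["engine"], best["affinity"])  # vina -9.2
```

Because every engine's output is normalized to the same columns, the same comparison code works regardless of which docking engine produced the pose.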

Interaction analysis

ProDock supports protein-ligand interaction extraction and summarization, enabling residue-level interaction profiles for each pose. This is intended for downstream comparison, ranking, and interpretation of docking results.
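
A residue-level profile can be as simple as counting interaction records per residue for a pose. The record shape below is illustrative, not ProDock's actual schema:

```python
from collections import Counter

# Hypothetical residue-level interaction records for one pose.
interactions = [
    {"pose_id": "p1", "residue": "ASP86", "type": "hbond"},
    {"pose_id": "p1", "residue": "TYR115", "type": "pi_stacking"},
    {"pose_id": "p1", "residue": "ASP86", "type": "salt_bridge"},
]

# Residue-level profile: how many contacts each residue makes in this pose.
profile = Counter(rec["residue"] for rec in interactions)
print(profile["ASP86"])  # ASP86 appears in two interaction records
```

Profiles like this can then be compared across poses or engines to rank poses by how well they reproduce expected binding-site contacts.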

Database-backed storage

ProDock includes SQLite-based storage to keep docking poses and interaction records in a structured and queryable form. This is especially useful when handling many receptors, ligands, docking engines, and poses.

Database architecture

ProDock stores docking outputs and associated interaction information in a relational SQLite database. This supports scalable querying, reproducible analysis, and easy export into pandas dataframes.

[Figure: Database architecture]

This architecture is intended to support:

  • pose-centric storage
  • stable pose identifiers
  • interaction lookup by pose, receptor, ligand, or engine
  • multi-receptor and multi-engine benchmarking workflows
  • clean integration with pandas- and RDKit-based analysis
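
A minimal sketch of a pose-centric schema along these lines, assuming two tables joined on a stable `pose_id` (ProDock's real schema may differ):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE poses (
    pose_id     TEXT PRIMARY KEY,   -- stable pose identifier
    receptor_id TEXT NOT NULL,
    ligand_id   TEXT NOT NULL,
    engine      TEXT NOT NULL,
    pose_rank   INTEGER,
    affinity    REAL
);
CREATE TABLE interactions (
    pose_id TEXT REFERENCES poses(pose_id),
    residue TEXT,
    itype   TEXT
);
""")
con.execute("INSERT INTO poses VALUES ('p1', 'rec_A', 'lig_001', 'vina', 1, -9.2)")
con.execute("INSERT INTO interactions VALUES ('p1', 'ASP86', 'hbond')")

# Interaction lookup by receptor and engine via a join on the stable pose_id.
rows = con.execute("""
    SELECT i.residue, i.itype
    FROM interactions i JOIN poses p ON p.pose_id = i.pose_id
    WHERE p.receptor_id = 'rec_A' AND p.engine = 'vina'
""").fetchall()
print(rows)  # [('ASP86', 'hbond')]
```

The same query result can be loaded into a pandas dataframe (e.g. via `pandas.read_sql_query`) for downstream analysis, which is what the pandas integration bullet above refers to.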

Step-by-Step Installation Guide

  1. Python Installation: Ensure that Python 3.11 or later is installed on your system. You can download it from python.org.

  2. Creating a Virtual Environment (Optional but Recommended): It's recommended to use a virtual environment to avoid conflicts with other projects or system-wide packages. Use the following commands to create and activate a virtual environment:

python -m venv prodock-env
source prodock-env/bin/activate  # on Windows: prodock-env\Scripts\activate

Or Conda

conda create --name prodock-env python=3.11
conda activate prodock-env

  3. Cloning and Installing ProDock: Clone the ProDock repository from GitHub and install it:

git clone https://github.com/Medicine-Artificial-Intelligence/ProDock.git
cd ProDock
pip install -r requirements.txt
pip install black flake8 pytest  # black for formatting, flake8 for format checking, pytest for testing

Setting Up Your Development Environment

Before you start, ensure your local development environment is set up correctly. Pull the latest version of the main branch to start with the most recent stable code.

git checkout main
git pull

Working on New Features

  1. Create a New Branch:
    For every new feature or bug fix, create a new branch from the main branch. Name your branch meaningfully, related to the feature or fix you are working on.

    git checkout -b feature/your-feature-name
    
  2. Develop and Commit Changes:
    Make your changes locally and commit them to your branch. Keep your commits small and focused; each should represent a logical unit of work.

    git commit -m "Describe the change"
    
  3. Run Quality Checks:
    Before finalizing your feature, run the following commands to ensure your code meets our formatting standards and passes all tests:

    ./lint.sh # Check code format
    pytest Test # Run tests
    

    Fix any issues or errors highlighted by these checks.

Integrating Changes

  1. Rebase onto Staging:
    Once your feature is complete and tests pass, rebase your changes onto the staging branch to prepare for integration.

    git fetch origin
    git rebase origin/staging
    

    Carefully resolve any conflicts that arise during the rebase.

  2. Push to Your Feature Branch: After successfully rebasing, push your branch to the remote repository.

    git push origin feature/your-feature-name
    
  3. Create a Pull Request: Open a pull request from your feature branch to the staging branch. Ensure the pull request description clearly describes the changes and any additional context necessary for review.

Important Notes

  • Direct Commits Prohibited: Do not push changes directly to the main or staging branches. All changes must come through pull requests reviewed by at least one other team member.
  • Merge Restrictions: The main branch can only be updated from the staging branch, not directly from feature branches.

Publication

ProDock

License

This project is licensed under the MIT License; see the LICENSE file for details.

Acknowledgments

This work has received support from the Korea International Cooperation Agency (KOICA) under the project entitled “Education and Research Capacity Building Project at University of Medicine and Pharmacy at Ho Chi Minh City”, conducted from 2024 to 2025 (Project No. 2021-00020-3).

Download files

Download the file for your platform.

Source Distribution

prodock-0.1.5.tar.gz (8.0 MB)

Uploaded Source

Built Distribution


prodock-0.1.5-py3-none-any.whl (8.1 MB)

Uploaded Python 3

File details

Details for the file prodock-0.1.5.tar.gz.

File metadata

  • Download URL: prodock-0.1.5.tar.gz
  • Upload date:
  • Size: 8.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for prodock-0.1.5.tar.gz
  • SHA256: fff4cfaad73ebacf5ca34bede772cef41e16dc8b67e96320495d038f547c7a05
  • MD5: f5c979e5a4b3db8496124442ecd2addd
  • BLAKE2b-256: db9384e0f6a4b6a25e52cdc1972611cb2e13286e9404718986c258278950c618


Provenance

The following attestation bundles were made for prodock-0.1.5.tar.gz:

Publisher: publish-package.yml on Medicine-Artificial-Intelligence/ProDock

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file prodock-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: prodock-0.1.5-py3-none-any.whl
  • Upload date:
  • Size: 8.1 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for prodock-0.1.5-py3-none-any.whl
  • SHA256: ed3f256913a50f106e0f9055977fd8c9e1df81727001f3e4eb5df36f4cd1bb10
  • MD5: 374c0c1e046bdaf132b77b33a04fb0ec
  • BLAKE2b-256: 2663b0201b313e78d7bc0b2ceaa5c9db4a72c5f339330c50acdcaacd36c0e687


Provenance

The following attestation bundles were made for prodock-0.1.5-py3-none-any.whl:

Publisher: publish-package.yml on Medicine-Artificial-Intelligence/ProDock

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
