
Python SDK for interacting with the QDX Tengu API and modules


tengu-py

Below we’ll walk through the process of building and running a drug discovery workflow, where we prepare a protein and ligand for molecular dynamics simulation, run the molecular dynamics, and perform a quantum lattice interaction energy calculation.

First, install the following packages via pip - we require Python > 3.10

pip install tengu-py pdb-tools

0) Setup

This is where we prepare the tengu client, directories, and input data we’ll be working with

0.0) Imports

import os
import tarfile
from datetime import datetime
from pathlib import Path

from pdbtools import pdb_fetch, pdb_delhetatm, pdb_selchain, pdb_rplresname, pdb_keepcoord, pdb_selresname

import tengu

0.1) Credentials

# Set our token - ensure you have exported TENGU_TOKEN in your shell; or just replace the os.getenv with your token
TOKEN = os.getenv("TENGU_TOKEN")
# You might have a custom deployment url; by default it will use https://tengu.qdx.ai
URL = os.getenv("TENGU_URL") or "https://tengu.qdx.ai"
# These env variables will be read by default, so you can skip this step in future
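If you haven’t exported these variables yet, you can do so in your shell before launching Python (the token value below is a placeholder; the URL line is only needed for custom deployments):

export TENGU_TOKEN="your-token-here"
export TENGU_URL="https://tengu.qdx.ai"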

0.2) Configuration

Let's set some global variables that define our project

# Define our project information
DESCRIPTION = "tengu-py demo notebook"
TAGS = ["qdx", "tengu-py-v2", "demo", "cdk2", "atp"]
WORK_DIR = Path.home() / "qdx" / "tengu-py-demo"
# Set our inputs
SYSTEM_PDB_PATH = WORK_DIR / "test.pdb"
PROTEIN_PDB_PATH = WORK_DIR / "test_P.pdb"
LIGAND_SMILES_STR = (
    "c1nc(c2c(n1)n(cn2)[C@H]3[C@@H]([C@@H]([C@H](O3)CO[P@@](=O)(O)O[P@](=O)(O)OP(=O)(O)O)O)O)N"
)
LIGAND_PDB_PATH = WORK_DIR / "test_L.pdb"

0.3) Build your client

# Get our client, for calling modules and using the tengu API
# Note: access_token and url are optional if you have set the env variables TENGU_TOKEN and TENGU_URL
# Workspace sets the location where we will store our session history file and module lock file
# By using the `build_provider_with_functions` method, we will also build helper functions calling each module
client = await tengu.build_provider_with_functions(
    access_token=TOKEN, url=URL, workspace=WORK_DIR, batch_tags=TAGS
)

0.4) Input selection

# fetch datafiles
complex = list(pdb_fetch.fetch_structure("1B39"))
protein = pdb_delhetatm.remove_hetatm(pdb_selchain.select_chain(complex, "A"))
# select the ATP residue
ligand = pdb_selresname.filter_residue_by_name(complex, "ATP")
# we require ligands to be labelled as UNL
ligand = pdb_rplresname.rename_residues(ligand, "ATP", "UNL")
# we don't want to repeat all of the remark / metadata that is already in the protein
ligand = pdb_keepcoord.keep_coordinates(ligand)
# write our files to the locations defined in the config block
with open(SYSTEM_PDB_PATH, "w") as f:
    for l in complex:
        f.write(str(l))
with open(PROTEIN_PDB_PATH, "w") as f:
    for l in protein:
        f.write(str(l))
with open(LIGAND_PDB_PATH, "w") as f:
    for l in ligand:
        f.write(str(l))

0.5) View tengu modules

Tengu modules are “functions” that perform various computational chemistry tasks and can be run on HPC infrastructure. We maintain multiple versions of these functions so that your scripts stay stable across upgrades.

# Get our latest modules as a dict[module_name, module_path]
# If a lock file exists, load it so that the run is reproducible
# This will be done automatically if you use the `build_provider_with_functions` method
modules = await client.get_latest_module_paths()
module_name = "hermes_energy"
module_path = modules[module_name]
print(module_path)
github:talo/tengu-prelude/9b0c89cbd3b7541b6d700ccf66bbdf9e4c82b630#hermes_energy
  • module_name is a descriptive string and indicates the “function” the module is calling;
  • module_path is a versioned tengu “endpoint” for a module accessible via the client.

Using the same module_path string across multiple runs provides reproducibility.
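For example, we can list each fetched module together with its pinned path (the exact set of modules and versions will depend on your deployment):

# list the modules available to this client, with their pinned versions
for name, path in modules.items():
    print(f"{name}: {path}")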

0.6) Build module functions

Next, we’ll build helper functions for the modules that we’ve fetched

These helpers are created via get_module_functions (this is done automatically by build_provider_with_functions). We can use the Python help() function to describe their usage.

The QDX Type Description is a standard type definition used across multiple programming languages to assist in interoperability. @ indicates that the type is stored in a file, which will be synced to cloud storage.

help(client.convert)
Help on function convert in module tengu.provider:

async convert(*args: [list[typing.Union[str, ~T]], <class 'pathlib.Path'>], target: tengu.graphql_client.enums.ModuleInstanceTarget | None = <ModuleInstanceTarget.NIX: 'NIX'>, resources: tengu.graphql_client.input_types.ModuleInstanceResourcesInput | None = None, tags: list[str] | None = None, restore: bool | None = None) -> [<class 'pathlib.Path'>]
    Convert biomolecular and chemical file formats to the QDX file format. Supports PDB and SDF
    
    Module version: github:talo/tengu-prelude/9b0c89cbd3b7541b6d700ccf66bbdf9e4c82b630#convert
    
    QDX Type Description:

        format: PDB|SDF;
        input: @bytes

    ->

        output: @[Conformer]
    
    
    
    :param format: the format of the input file
    :param input: the input file
    :return output: the output conformers

1) Running Tengu Modules

Below we’ll call modules using the functions created on the client.

The parameters to a tengu module function are the following:

  • *args: The values or ids passed to the module:
    1. For @Objects - A pathlib.Path or a file-like object like BufferedReader, FileIO, StringIO etc.: Loads the data in the file as an argument. NOTE: The uploaded value isn’t just the string of the file, so don’t pass the string directly; pass the path or wrap in StringIO.
    2. A tengu Provider.Argument or ArgId returned by a previous call to a tengu module via client.[some_module_name](): The ArgId type wraps data for use within tengu. It may refer to an object already uploaded to tengu storage, such as outputs of other run calls. See below for more details; it’s easier to understand when you see an example.
    3. A parameter, i.e. a value of any other type, including None: Ensure the values match what is outlined in the *args list
  • **kwargs
    • target: The machine we want to run on (e.g. NIX_SSH for a cluster, GADI for a supercomputer).
    • resources: The resources to use on the target. The most commonly provided are {"gpus": n, "storage": i}.
    • tags: Tags to associate with our run, so we can easily look up our runs. By default they will be populated with the batch_tags passed to the client on construction.
    • restore: If this is set to True - the function will check if a single module_instance exists for the same version of the function with the same tags, and return that instead of re-running.

The return value is a list of Provider.Arguments. You can wait for them to resolve by calling await your_argument.get(), or pass the arguments directly to subsequent functions, which will cause Tengu to do the waiting for you.

You can see the status of all the jobs submitted for your workspace or session by calling client.status().
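As a minimal, hypothetical sketch (some_module and prior_result are stand-ins for a real module function and an Arg returned by an earlier call; the real modules are demonstrated in the sections below), a call looks like this:

# hypothetical module call, illustrating the three kinds of positional arguments
# and the common keyword arguments described above
(result,) = await client.some_module(
    WORK_DIR / "input.pdb",  # 1. an @Object: pass a Path or file-like object, not the file's contents as a str
    prior_result,            # 2. an Arg returned by a previous module call; tengu resolves it for you
    {"some_option": 1},      # 3. a plain parameter value
    target="NIX_SSH",                                             # machine to run on
    resources={"gpus": 1, "storage": 1, "storage_units": "GB"},   # resources to request on the target
    tags=["my-run"],                                              # tags for looking the run up later
    restore=True,            # reuse a matching completed run instead of re-running
)
value = await result.get()   # wait for the run to finish and fetch the output value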

We will now demonstrate how this works in action

1.1) Input Preparation

1.1.1) Prep the protein

First we will run the protein preparation routine (using pdbfixer internally) to prepare the protein for molecular dynamics

# we can check the arguments and outputs for prepare_protein with help()
help(client.prepare_protein)
Help on function prepare_protein in module tengu.provider:

async prepare_protein(*args: [<class 'pathlib.Path'>], target: tengu.graphql_client.enums.ModuleInstanceTarget | None = <ModuleInstanceTarget.NIX_SSH_3: 'NIX_SSH_3'>, resources: tengu.graphql_client.input_types.ModuleInstanceResourcesInput | None = None, tags: list[str] | None = None, restore: bool | None = None) -> [<class 'pathlib.Path'>]
    Prepare a PDB for downstream tasks: protonate, fill missing atoms, etc.
    
    Module version: github:talo/pdb2pqr/6a0c4b2cf6d42f53d1cc889f3926b7e0ab8d1552#prepare_protein_tengu
    
    QDX Type Description:

        input_pdb: @bytes

    ->

        output_pdb: @bytes
    
    
    
    :param input_pdb: An input molecule as a file: one PDB file
    :return output_pdb: An output molecule as a file: one PDB file
# Here we run the function, it will return a Provider.Arg which you can use to fetch the results
# Set restore=True to reuse a previous run of the same function version with the same tags (here we use restore=False to force a fresh run)
(prepared_protein,) = await client.prepare_protein(PROTEIN_PDB_PATH, restore=False)
print(f"{datetime.now().time()} | Running protein prep!")
prepared_protein  # this initially only has the id of your result; we will show how to fetch the actual value later
16:35:45.092612 | Running protein prep!

Arg(id=0e934b29-5999-491e-95c2-f5d56dd58a34, value=None)

1.1.2) Checking results

# This will show the status of all of your runs
await client.status()
{'7a33e171-3843-4a29-8309-80e982ad564a': (<ModuleInstanceStatus.ADMITTED: 'ADMITTED'>,
  'prepare_protein',
  1),
 '340295cb-e874-4ef3-8742-c5e244e010f1': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'prepare_protein',
  1)}
# If any of our runs fail, we can check their logs with
for instance_id, (status, name, count) in (await client.status()).items():
    if status.value == "FAILED":
        async for log_page in client.logs(instance_id, "stderr"):
            for log in log_page:
                print(log)
# this will return the "value" of the output from the function - for files you will receive a url that you can download,
# otherwise you will receive them as python types
await prepared_protein.get()
{'url': 'https://storage.googleapis.com/qdx-store/cc6b99c2-4c24-4b3c-94c4-6a0bef3da01c?x-goog-signature=649e719f1e24ad5a03c61ed59f809138ca708caaa6135d957f5ff5011ce0dece0f657407280111637ae464a94d883dff91278af8b2c95bcc68ff1a66e5238b4c60dd08b824cf2968f3ede9720bdc38d21fe98f5014905ccaa9ae190679392bce692f7458d9847dbf5222fc8227d1a4ba7b59b97d0a4b28e874777186783302ca15efec0cea82e787704e69fe1230f56892b3f376dff62854b48882f47c0280af5883f75417ccbfabf93401ff1a0e7ff88d8b5ff38514382b9cc2b5415d9b871b4635b2530467872c97b25d5a4516d3a4ae1984d6afd61ee74e7a7e968aa5770b0681a9f6d6af05f5d4d7a1bc65270b997f240097566b743c7a9c4fe3c433396b&x-goog-algorithm=GOOG4-RSA-SHA256&x-goog-credential=qdx-store-user%40humming-bird-321603.iam.gserviceaccount.com%2F20231208%2Fasia-southeast1%2Fstorage%2Fgoog4_request&x-goog-date=20231208T083618Z&x-goog-expires=3600&x-goog-signedheaders=host'}
# we provide a utility to download files into your workspace,
# you can either provide a filename, which will be saved in workspace/objects/[filename],
# or you can provide your own filepath which the client will use as-is
try:
    await prepared_protein.download(filename="01_prepared_protein.pdb")
except FileExistsError:
    # an error is raised if you try to overwrite an existing file; you can force an overwrite
    # by passing an absolute filepath instead
    pass
# we can read our prepared protein pdb like this
with open(client.workspace / "objects" / "01_prepared_protein.pdb", "r") as f:
    print(f.readline(), "...")
REMARK   1 PQR file generated by PDB2PQR
 ...

1.1.3) Prep the ligand

Next we will prepare the ligand (using gypsum_dl internally)

# we can check the inputs for prepare_ligand with help()
help(client.prepare_ligand)
Help on function prepare_ligand in module tengu.provider:

async prepare_ligand(*args: [<class 'str'>, <class 'pathlib.Path'>, dict[str, ~T]], target: tengu.graphql_client.enums.ModuleInstanceTarget | None = <ModuleInstanceTarget.NIX_SSH_3: 'NIX_SSH_3'>, resources: tengu.graphql_client.input_types.ModuleInstanceResourcesInput | None = None, tags: list[str] | None = None, restore: bool | None = None) -> [<class 'pathlib.Path'>, <class 'pathlib.Path'>]
    Prepare ligand for sim. or quantum energy calc. using gypsum_dl
    
    Module version: github:talo/gypsum_dl/04acd1852cb3e2c8d0347e15763926fdf9a93a5d#prepare_ligand_tengu
    
    QDX Type Description:

        in: string;
        in: @bytes;
        in: {
            job_manager: string,
            let_tautomers_change_chirality: bool,
            max_ph: f32,
            max_variants_per_compound: i32,
            min_ph: f32,
            num_processors: i32,
            output_folder: string,
            pka_precision: f32,
            separate_output_files: bool,
            skip_adding_hydrogen: bool,
            skip_alternate_ring_conformations: bool,
            skip_enumerate_chiral_mol: bool,
            skip_enumerate_double_bonds: bool,
            skip_making_tautomers: bool,
            skip_optimize_geometry: bool,
            source: string,
            thoroughness: i32,
            use_durrant_lab_filters: bool
        }

    ->

        out: @bytes;
        out: @bytes

    Prepare ligand for sim. or quantum energy calc. using gypsum_dl.
    
    
    Inputs:
        - An input molecule as a SMILES string
        - The same input molecule as a PDB file
        - A json file representing any other options to pass; see gypsum_dl docs for details
    
    
    Outputs:
        - A pdb file containing the prepared version of the molecule, ready for downstream use
ligand_prep_config = {
    "source": "",
    "output_folder": "./",
    "job_manager": "multiprocessing",
    "num_processors": -1,
    "max_variants_per_compound": 1,
    "thoroughness": 3,
    "separate_output_files": True,
    "min_ph": 6.4,
    "max_ph": 8.4,
    "pka_precision": 1.0,
    "skip_optimize_geometry": True,
    "skip_alternate_ring_conformations": True,
    "skip_adding_hydrogen": False,
    "skip_making_tautomers": True,
    "skip_enumerate_chiral_mol": True,
    "skip_enumerate_double_bonds": True,
    "let_tautomers_change_chirality": False,
    "use_durrant_lab_filters": True,
}
(prepared_ligand_pdb, prepared_ligand_sdf) = await client.prepare_ligand(
    LIGAND_SMILES_STR,
    LIGAND_PDB_PATH,
    ligand_prep_config,
    restore=True,
)
print(f"{datetime.now().time()} | Running ligand prep!")
10:23:18.129087 | Running ligand prep!
# we can check the status again
await client.status()
{'e581230b-894f-46a5-bdbb-f07c8f8ceff8': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'qp_collate',
  1),
 '5f6a969b-3a56-48a2-afa1-25939218fc52': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'hermes_energy',
  1),
 '51d050d4-1dbf-4915-b590-9c3dcd475d14': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'qp_gen_inputs',
  1),
 '748d3de2-8984-4c4a-bd8c-090430b8db86': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'gmx_pdb',
  1),
 '746ee040-7450-4917-b60d-e4056c9d850a': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'prepare_ligand',
  1),
 '847ba79d-c6a0-43e5-acce-28928b2f349f': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'prepare_protein',
  1)}
# we can download our outputs
try:
    await prepared_ligand_pdb.download(filename="01_prepped_ligand.pdb")
    await prepared_ligand_sdf.download(filename="01_prepped_ligand.sdf")
except FileExistsError:
    pass

print(f"{datetime.now().time()} | Downloaded prepped ligand!")
10:23:21.141026 | Downloaded prepped ligand!
# we can read our outputs
with open(client.workspace / "objects" / "01_prepped_ligand.sdf", "r") as f:
    print(f.readline(), f.readline(), "...")
untitled_0_molnum_0
      RDKit          3D
 ...

1.2) Run GROMACS (module: gmx_tengu / gmx_tengu_pdb)

Next we will run a molecular dynamics simulation on our protein and ligand, using gromacs (gmx)

help(client.gmx_pdb)
Help on function gmx_pdb in module tengu.provider:

async gmx_pdb(*args: [<class 'pathlib.Path'>, typing.Optional[~T], dict[str, ~T]], target: tengu.graphql_client.enums.ModuleInstanceTarget | None = <ModuleInstanceTarget.NIX_SSH: 'NIX_SSH'>, resources: tengu.graphql_client.input_types.ModuleInstanceResourcesInput | None = None, tags: list[str] | None = None, restore: bool | None = None) -> [<class 'pathlib.Path'>, <class 'pathlib.Path'>, <class 'pathlib.Path'>, typing.Optional[~T], <class 'pathlib.Path'>]
    Runs a molecular dynamics simulation using GROMACS from protein and ligand pdbs as inputs
    
    Module version: github:talo/gmx_tengu_support/a473bc4a302eebebcb5f54a899192be75c0daa91#gmx_tengu_pdb
    
    QDX Type Description:

        protein: @bytes;
        ligand: @bytes?;
        gmx-config: {
            frame_sel: {
                begin_time: u32,
                delta_time: u32,
                end_time: u32
            }?,
            ligand_charge: i8?,
            num_gpus: u8,
            num_replicas: u8?,
            param_overrides: {
                em: [(string, string)],
                ions: [(string, string)],
                md: [(string, string)],
                npt: [(string, string)],
                nvt: [(string, string)]
            }
        }

    ->

        ouput_folder: @bytes;
        dry_frames: @bytes;
        wet_frames: @bytes;
        lig_gro: @bytes?;
        xtcs: @bytes
    
    
    
    :param protein: Protein PDB file
    :param ligand: Ligand PDB file
    :param gmx-config: Configuration record
    :return ouput_folder: tar.gz Compressed full GROMACS output folder, containing dry.xtc, dry frames, wet frames, gro of ligand etc.
    :return dry_frames: tar.gz of dry pdb frames
    :return wet_frames: tar.gz of wet pdb frames
    :return lig_gro: tar.gz of ligand gro
    :return xtcs: tar.gz of xtc files
gmx_config = {
    "param_overrides": {
        "md": [("nsteps", "5000")],
        "em": [("nsteps", "1000")],
        "nvt": [("nsteps", "1000")],
        "npt": [("nsteps", "1000")],
        "ions": [],
    },
    "num_gpus": 0,
    "num_replicas": 1,
    "ligand_charge": None,
    "frame_sel": {
        "begin_time": 1,
        "end_time": 10,
        "delta_time": 2,
    },
}
# we pass the outputs from our prior runs directly, instead of their values, to prevent them from being re-uploaded
(gmx_run_folder_tar, dry_frames_tar, wet_frames_tar, lig_gro_tar, xtcs_tar) = await client.gmx_pdb(
    prepared_protein,
    prepared_ligand_pdb,
    gmx_config,
    resources={"gpus": 0, "storage": 1, "storage_units": "GB", "cpus": 48, "walltime": 60},
    restore=True,
)
print(f"{datetime.now().time()} | Running GROMACS simulation!")
10:23:21.201348 | Running GROMACS simulation!
# we can check the status again
await client.status()
{'e581230b-894f-46a5-bdbb-f07c8f8ceff8': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'qp_collate',
  1),
 '5f6a969b-3a56-48a2-afa1-25939218fc52': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'hermes_energy',
  1),
 '51d050d4-1dbf-4915-b590-9c3dcd475d14': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'qp_gen_inputs',
  1),
 '748d3de2-8984-4c4a-bd8c-090430b8db86': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'gmx_pdb',
  1),
 '746ee040-7450-4917-b60d-e4056c9d850a': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'prepare_ligand',
  1),
 '847ba79d-c6a0-43e5-acce-28928b2f349f': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'prepare_protein',
  1)}
print("Fetching gmx results")
try:
    await dry_frames_tar.download(filename="02_gmx_dry_frames.tar.gz")
    await lig_gro_tar.download(filename="02_gmx_lig_gro.tar.gz")
    await gmx_run_folder_tar.download(filename="02_gmx_run_folder.tar.gz")

except FileExistsError:
    pass

print(f"{datetime.now().time()} | Downloaded GROMACS output!")
Fetching gmx results
10:23:24.315210 | Downloaded GROMACS output!
# Extract the "dry" (i.e. non-solvated) pdb frames we asked for
with tarfile.open(client.workspace / "objects" / "02_gmx_dry_frames.tar.gz", "r") as tf:
    selected_frame_pdbs = [tf.extractfile(member).read() for member in tf if "pdb" in member.name]
    for i, frame in enumerate(selected_frame_pdbs):
        with open(client.workspace / "objects" / f"02_gmx_output_frame_{i}.pdb", "w") as pf:
            print(frame.decode("utf-8"), file=pf)
# Extract the ligand.gro file
with tarfile.open(client.workspace / "objects" / "02_gmx_lig_gro.tar.gz", "r") as tf:
    gro = [tf.extractfile(member).read() for member in tf if "temp" in member.name][0]
    with open(client.workspace / "objects" / f"02_gmx_lig.gro", "w") as pf:
        print(gro.decode("utf-8"), file=pf)
help(client.qp_collate)
Help on function qp_collate in module tengu.provider:

async qp_collate(*args: [<class 'pathlib.Path'>, list[~T]], target: tengu.graphql_client.enums.ModuleInstanceTarget | None = <ModuleInstanceTarget.NIX: 'NIX'>, resources: tengu.graphql_client.input_types.ModuleInstanceResourcesInput | None = None, tags: list[str] | None = None, restore: bool | None = None) -> [<class 'pathlib.Path'>]
    Takes hermes results and amino acid to fragment indexes and outputs interaction energies.
    
    Module version: github:talo/tengu-prelude/b624dbe6f9ccb7ccc417d52cdc3dd251de76b604#qp_collate
    
    QDX Type Description:

        hermes_results: @{
            dimer_energies: DimerEnergies?,
            energy: Energy,
            fragment_basis_functions: [{
                n_occupied_basis_functions: u32,
                n_virtual_basis_functions: u32,
                total_n_basis_functions: u32
            }]?,
            full_system_basis_functions: {
                n_occupied_basis_functions: u32,
                n_virtual_basis_functions: u32,
                total_n_basis_functions: u32
            }?,
            monomer_energies: MonomerEnergies?,
            trimer_energies: TrimerEnergies?
        };
        amino_acid_to_fragment_indexes: [(string, u32)]

    ->

        out: @{
            amino_acid_interaction_energies: [{
                amino_acid_id: string,
                dimer_hf_interaction_e: f64,
                dimer_mp2_os_interaction_e: f64?,
                dimer_mp2_ss_interaction_e: f64?
            }],
            amino_acid_to_fragment_indexes: [(string, u32)],
            dimer_hf_energies: [f64],
            dimer_hf_interaction_energies: [f64],
            dimer_mp2_os_energies: [f64?],
            dimer_mp2_os_interaction_energies: [f64?],
            dimer_mp2_ss_energies: [f64?],
            dimer_mp2_ss_interaction_energies: [f64?],
            dimer_pairs: [[u32]],
            full_lattice_energy: (f64, f64?, f64?)
        }
    
    
    
    :param hermes_results: hermes results
    :param amino_acid_to_fragment_indexes: amino acid to fragment indexes, from qp-gen-inputs

1.3) Run quantum energy calculation (modules: qp_gen_inputs, hermes_energy, qp_collate)

Tengu has “protocols”, which are multiple functions wired together as a single function. In order to get our Quantum Pairwise (QP) energy results, we need to run 3 modules, which we provide as a protocol under tengu.run_qp.

We run the protocol by giving it a provider and the paths to the versions of the 3 modules it requires.

(energy_result, qp_result) = await tengu.run_qp(
    client,
    modules["qp_gen_inputs"],
    modules["hermes_energy"],
    modules["qp_collate"],
    pdb=client.workspace / "objects" / f"02_gmx_output_frame_0.pdb",
    gro=client.workspace / "objects" / f"02_gmx_lig.gro",
    lig=prepared_ligand_sdf,
    lig_type="sdf",
    lig_res_id="UNL",  # The ligand's residue code in the PDB file; this is what our prep uses
    use_new_fragmentation_method=True,
    hermes_target="NIX_SSH_3",
    hermes_resources=tengu.Resources(storage=10, storage_units="MB", gpus=1, walltime=60),
    restore=True,
    tags=["with_resources", "with_new_frag"],
)
print(f"{datetime.now().time()} | Running QP energy calculation!")
launched qp_prep_instance 51d050d4-1dbf-4915-b590-9c3dcd475d14
launched hermes_instance 5f6a969b-3a56-48a2-afa1-25939218fc52
10:23:24.474485 | Running QP energy calculation!
await client.status()
{'e581230b-894f-46a5-bdbb-f07c8f8ceff8': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'qp_collate',
  1),
 '5f6a969b-3a56-48a2-afa1-25939218fc52': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'hermes_energy',
  1),
 '51d050d4-1dbf-4915-b590-9c3dcd475d14': (<ModuleInstanceStatus.FAILED: 'FAILED'>,
  'qp_gen_inputs',
  1),
 '748d3de2-8984-4c4a-bd8c-090430b8db86': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'gmx_pdb',
  1),
 '746ee040-7450-4917-b60d-e4056c9d850a': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'prepare_ligand',
  1),
 '847ba79d-c6a0-43e5-acce-28928b2f349f': (<ModuleInstanceStatus.COMPLETED: 'COMPLETED'>,
  'prepare_protein',
  1)}
await qp_result.get()
print(f"{datetime.now().time()} | Got qp interaction energy!")

1.4) Run MM-PBSA

help(client.gmx_mmpbsa)
Help on function gmx_mmpbsa in module tengu.provider:

async gmx_mmpbsa(*args: [<class 'pathlib.Path'>, dict[str, ~T]], target: tengu.graphql_client.enums.ModuleInstanceTarget | None = <ModuleInstanceTarget.NIX_SSH_3: 'NIX_SSH_3'>, resources: tengu.graphql_client.input_types.ModuleInstanceResourcesInput | None = None, tags: list[str] | None = None, restore: bool | None = None) -> [<class 'pathlib.Path'>]
    Updates the geometry of a conformer runnning GROMACS's energy minimization in solvent
    
    Module version: github:talo/gmx_tengu_support/d4ea797dcdedf9b91de1b76a32f8a95f0cbf21df#gmx_mmpbsa_tengu
    
    QDX Type Description:

        output_tar_gz: @bytes;
        mmpbsa_config: {
            end_frame: u64,
            interaction_entropy: bool?,
            interval: u32?,
            num_cpus: u32,
            rerun_consolidate: bool?,
            start_frame: u64
        }

    ->

        output: @bytes
    
    
    
    :param output_tar_gz: Compressed GROMACS output folder
    :param mmpbsa_config: Configuration record for mmpbsa:
    start_frame: Frame to start with
    end_frame: Frame to end with
    num_cpus: Number of CPUs to use - cannot be larger than the number of frames
    rerun_consolidate: Rerun consolidate if you are unsure of the validity of the dry.xtc
    interaction_entropy: Calculate interaction entropy
    
    :return output: Compressed mmpbsa output folder
mmpbsa_config = {
    "start_frame": 1,
    "end_frame": 2,
    "num_cpus": 1,  # cannot be greater than number of frames
}
(mmpbsa_result_tar,) = await client.gmx_mmpbsa(
    gmx_run_folder_tar,
    mmpbsa_config,
    resources=tengu.Resources(storage=100, storage_units="MB", gpus=0, walltime=60),
    target="GADI",
)
print(f"{datetime.now().time()} | Running GROMACS MM-PBSA calculation!")
10:25:11.820807 | Running GROMACS MM-PBSA calculation!
print("Fetching gmx_mmpbsa results")
try:
    await mmpbsa_result_tar.download(filename="04_gmx_mmpbsa_run_folder.tar.gz")
except FileExistsError:
    pass
print(f"{datetime.now().time()} | Downloaded MM-PBSA results!")
Fetching gmx_mmpbsa results
