# Alphafold 3 - Pytorch (wip)

Implementation of Alphafold 3 in Pytorch
I am getting a fair number of emails about this work. You can chat with me about it here
## Appreciation

- Joseph for contributing the relative positional encoding module!
## Install

```bash
$ pip install alphafold3-pytorch
```
## Usage

```python
import torch
from alphafold3_pytorch import Alphafold3

alphafold3 = Alphafold3(
    dim_atom_inputs = 77,
    dim_additional_residue_feats = 33,
    dim_template_feats = 44
)

# mock inputs

seq_len = 16
atom_seq_len = seq_len * 27  # 27 atoms per residue

atom_inputs = torch.randn(2, atom_seq_len, 77)
atom_mask = torch.ones((2, atom_seq_len)).bool()
atompair_feats = torch.randn(2, atom_seq_len, atom_seq_len, 16)
additional_residue_feats = torch.randn(2, seq_len, 33)

template_feats = torch.randn(2, 2, seq_len, seq_len, 44)
template_mask = torch.ones((2, 2)).bool()

msa = torch.randn(2, 7, seq_len, 64)

# required for training, but omitted on inference

atom_pos = torch.randn(2, atom_seq_len, 3)
distance_labels = torch.randint(0, 37, (2, seq_len, seq_len))

# train

loss = alphafold3(
    num_recycling_steps = 2,
    atom_inputs = atom_inputs,
    atom_mask = atom_mask,
    atompair_feats = atompair_feats,
    additional_residue_feats = additional_residue_feats,
    msa = msa,
    templates = template_feats,
    template_mask = template_mask,
    atom_pos = atom_pos,
    distance_labels = distance_labels
)

loss.backward()

# after much training ...

sampled_atom_pos = alphafold3(
    num_recycling_steps = 4,
    num_sample_steps = 16,
    atom_inputs = atom_inputs,
    atom_mask = atom_mask,
    atompair_feats = atompair_feats,
    additional_residue_feats = additional_residue_feats,
    msa = msa,
    templates = template_feats,
    template_mask = template_mask
)

sampled_atom_pos.shape # (2, 16 * 27, 3)
```
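The training call above returns a scalar loss, so it slots into an ordinary PyTorch optimizer loop. Below is a minimal sketch of that pattern; `ToyModel`, the `Adam` optimizer, and the learning rate are illustrative assumptions (not part of this library's API), with a small stand-in module used in place of `Alphafold3` so the sketch runs on its own:

```python
import torch
from torch import nn
import torch.nn.functional as F

# Stand-in for Alphafold3: like the real model, its forward pass returns
# a scalar loss when ground-truth targets are supplied. (Assumption: a toy
# regression from atom features to coordinates, used only so the loop runs.)
class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(77, 3)  # dim_atom_inputs -> xyz

    def forward(self, atom_inputs, atom_pos):
        pred = self.proj(atom_inputs)
        return F.mse_loss(pred, atom_pos)

model = ToyModel()
opt = torch.optim.Adam(model.parameters(), lr = 1e-3)

atom_inputs = torch.randn(2, 432, 77)  # (batch, atom_seq_len, dim_atom_inputs)
atom_pos = torch.randn(2, 432, 3)      # ground-truth atom coordinates

losses = []
for step in range(3):
    # with the real Alphafold3, pass the full set of keyword inputs shown above
    loss = model(atom_inputs, atom_pos)
    opt.zero_grad()
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

The same structure applies unchanged to `Alphafold3` itself: swap the stand-in for the real module and forward the full keyword arguments from the training example.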
## Citations

```bibtex
@article{Abramson2024-fj,
    title   = "Accurate structure prediction of biomolecular interactions with
               {AlphaFold} 3",
    author  = "Abramson, Josh and Adler, Jonas and Dunger, Jack and Evans,
               Richard and Green, Tim and Pritzel, Alexander and Ronneberger,
               Olaf and Willmore, Lindsay and Ballard, Andrew J and Bambrick,
               Joshua and Bodenstein, Sebastian W and Evans, David A and Hung,
               Chia-Chun and O'Neill, Michael and Reiman, David and
               Tunyasuvunakool, Kathryn and Wu, Zachary and {\v Z}emgulyt{\.e},
               Akvil{\.e} and Arvaniti, Eirini and Beattie, Charles and
               Bertolli, Ottavia and Bridgland, Alex and Cherepanov, Alexey and
               Congreve, Miles and Cowen-Rivers, Alexander I and Cowie, Andrew
               and Figurnov, Michael and Fuchs, Fabian B and Gladman, Hannah and
               Jain, Rishub and Khan, Yousuf A and Low, Caroline M R and Perlin,
               Kuba and Potapenko, Anna and Savy, Pascal and Singh, Sukhdeep and
               Stecula, Adrian and Thillaisundaram, Ashok and Tong, Catherine
               and Yakneen, Sergei and Zhong, Ellen D and Zielinski, Michal and
               {\v Z}{\'\i}dek, Augustin and Bapst, Victor and Kohli, Pushmeet
               and Jaderberg, Max and Hassabis, Demis and Jumper, John M",
    journal = "Nature",
    month   = "May",
    year    = 2024
}
```