
AlphaGenome (wip)

Implementation of AlphaGenome, DeepMind's updated genomic attention model

Appreciation

  • Miquel Anglada-Girotto for contributing the organism, output embedding, loss functions, and all the splicing prediction heads!

Install

$ pip install alphagenome-pytorch

Usage

The main U-Net transformer, without any heads:

import torch
from alphagenome_pytorch import AlphaGenome

model = AlphaGenome()

dna = torch.randint(0, 5, (2, 8192))

# organism_index - 0 for human, 1 for mouse - can be changed with `num_organisms` on `AlphaGenome`

embeds_1bp, embeds_128bp, embeds_pair = model(dna, organism_index = 0) # (2, 8192, 1536), (2, 64, 3072), (2, 4, 4, 128)
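The input `dna` tensor above holds integer tokens in `[0, 5)`, which fits the four bases plus an unknown/N symbol. As a minimal sketch of producing those tokens from a raw sequence string, assuming a plain A/C/G/T/N mapping (the exact base-to-index mapping used by the package is not documented here, so treat `BASE_TO_INDEX` as hypothetical):

```python
# Hypothetical base-to-index mapping: 5 symbols matching the
# torch.randint(0, 5, ...) input range used in the example above.
BASE_TO_INDEX = {'A': 0, 'C': 1, 'G': 2, 'T': 3, 'N': 4}

def tokenize(seq: str) -> list[int]:
    # Map each base to its integer token, treating anything unknown as N
    return [BASE_TO_INDEX.get(base, BASE_TO_INDEX['N']) for base in seq.upper()]

print(tokenize('ACGTN'))  # [0, 1, 2, 3, 4]
```

Wrap the resulting list with `torch.tensor([...])` (and batch it) to match the shape the model expects.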

Adding all types of output heads (thanks to @MiqG)

import torch
from alphagenome_pytorch import AlphaGenome, publication_heads_config

model = AlphaGenome()

model.add_heads(
    'human',
    num_tracks_1bp = 10,
    num_tracks_128bp = 10,
    num_tracks_contacts = 128,
    num_splicing_contexts = 64, # 2 strands x num. CURIE conditions
)

dna = torch.randint(0, 5, (2, 8192))

organism_index = torch.tensor([0, 1]) # the organism that each sequence belongs to
splice_donor_idx = torch.tensor([[10, 100, 34], [24, 546, 870]])
splice_acceptor_idx = torch.tensor([[15, 103, 87], [56, 653, 900]])

# get sequence embeddings

embeddings_1bp, embeddings_128bp, embeddings_pair = model(dna, organism_index, return_embeds = True) # (2, 8192, 1536), (2, 64, 3072), (2, 4, 4, 128)

# get track predictions

out = model(
    dna,
    organism_index,
    splice_donor_idx = splice_donor_idx,
    splice_acceptor_idx = splice_acceptor_idx
)

for organism, outputs in out.items():
    for out_head, out_values in outputs.items():
        print(organism, out_head, out_values.shape)

# human 1bp_tracks torch.Size([2, 8192, 10])
# human 128bp_tracks torch.Size([2, 64, 10])
# human contact_head torch.Size([2, 4, 4, 128])
# human splice_probs torch.Size([2, 8192, 5])
# human splice_usage torch.Size([2, 8192, 64])
# human splice_juncs torch.Size([2, 3, 3, 64])

# initialize published AlphaGenome for human and mouse
model = AlphaGenome()
model.add_heads(**publication_heads_config['human'])
model.add_heads(**publication_heads_config['mouse'])
model.total_parameters # 259,459,534 (vs ~450 million trainable parameters)
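To consume the nested `{organism: {head_name: tensor}}` output dictionary in a training step, one illustrative option is to accumulate a per-head loss over the dict. The sketch below uses random stand-in tensors with the shapes printed above and a Poisson NLL loss (a common choice for non-negative genomic track data); the loss functions actually used in this repo may differ, so this only demonstrates iterating the output structure:

```python
import torch
import torch.nn.functional as F

# Stand-in for the model output above: {organism: {head_name: tensor}}.
# Shapes mirror the printed example; values here are random placeholders.
out = {
    'human': {
        '1bp_tracks': torch.rand(2, 8192, 10),
        '128bp_tracks': torch.rand(2, 64, 10),
    }
}

# Random non-negative targets with matching shapes (real targets would be
# experimental coverage tracks)
targets = {
    organism: {head: torch.rand_like(pred) for head, pred in heads.items()}
    for organism, heads in out.items()
}

total_loss = torch.tensor(0.)
for organism, heads in out.items():
    for head, pred in heads.items():
        # Poisson NLL on raw (non-log) predictions; illustrative only
        total_loss = total_loss + F.poisson_nll_loss(
            pred, targets[organism][head], log_input = False
        )

print(total_loss.item())
```

From there, `total_loss.backward()` and an optimizer step complete the usual training loop.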

Training

To test a minimal architecture:

# loss quickly decreases and stabilizes at around 1349651
# this minimal model (576,444 parameters) can be run with cpu

python train_dummy.py --config_file=configs/dummy.yaml

Contributing

First install locally with the following

$ pip install '.[test]' # or: uv pip install '.[test]'

Then make your changes, add a test to tests/test_alphagenome.py, and run

$ pytest tests

That's it

Vibe coding with some attention network is totally welcome, if it works

Citations

@article {avsec2025alphagenome,
	title = {AlphaGenome: advancing regulatory variant effect prediction with a unified DNA sequence model},
	author = {Avsec, {\v Z}iga and Latysheva, Natasha and Cheng, Jun and Novati, Guido and Taylor, Kyle R. and Ward, Tom and Bycroft, Clare and Nicolaisen, Lauren and Arvaniti, Eirini and Pan, Joshua and Thomas, Raina and Dutordoir, Vincent and Perino, Matteo and De, Soham and Karollus, Alexander and Gayoso, Adam and Sargeant, Toby and Mottram, Anne and Wong, Lai Hong and Drot{\'a}r, Pavol and Kosiorek, Adam and Senior, Andrew and Tanburn, Richard and Applebaum, Taylor and Basu, Souradeep and Hassabis, Demis and Kohli, Pushmeet},
	elocation-id = {2025.06.25.661532},
	year = {2025},
	doi = {10.1101/2025.06.25.661532},
	publisher = {Cold Spring Harbor Laboratory},
	URL = {https://www.biorxiv.org/content/early/2025/06/27/2025.06.25.661532},
	eprint = {https://www.biorxiv.org/content/early/2025/06/27/2025.06.25.661532.full.pdf},
	journal = {bioRxiv}
}
