AntiBERTy
Antibody-specific transformer language model pre-trained on 558M natural antibody sequences.
Usage
from antiberty import AntiBERTy, get_weights

# Download (if necessary) and load the pre-trained model weights
antiberty = AntiBERTy.from_pretrained(get_weights())
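As a rough illustration of what a BERT-style antibody language model consumes, the sketch below shows per-residue tokenization of an antibody sequence. This is a minimal, self-contained sketch only: the vocabulary, special tokens, and token ids are hypothetical stand-ins, not AntiBERTy's actual tokenizer.

```python
# Illustrative sketch only: the token ids below are hypothetical stand-ins
# showing the idea of per-residue tokenization used by BERT-style protein
# language models; AntiBERTy ships its own vocabulary with the package.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

# Hypothetical vocab: special tokens first, then one id per residue type.
VOCAB = {"[CLS]": 0, "[SEP]": 1, "[PAD]": 2}
VOCAB.update({aa: i + 3 for i, aa in enumerate(AMINO_ACIDS)})

def tokenize(sequence: str) -> list[int]:
    """Map an antibody sequence to token ids, one token per residue,
    wrapped in [CLS]/[SEP] markers as BERT-style models expect."""
    return [VOCAB["[CLS]"]] + [VOCAB[aa] for aa in sequence] + [VOCAB["[SEP]"]]

# First residues of a typical heavy-chain framework region
ids = tokenize("EVQLVESGGG")
```

A sequence of N residues becomes N + 2 tokens, one per amino acid plus the two special markers.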
Citing this work
@article{ruffolo2021deciphering,
    title   = {Deciphering antibody affinity maturation with language models and weakly supervised learning},
    author  = {Ruffolo, Jeffrey A and Gray, Jeffrey J and Sulam, Jeremias},
    journal = {arXiv preprint arXiv:2112.07782},
    year    = {2021}
}
Download files
Source Distribution: antiberty-0.0.5.tar.gz (96.6 MB)
Built Distribution: antiberty-0.0.5-py3-none-any.whl (96.6 MB)
Hashes for antiberty-0.0.5-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | efbfa6779794376c72cf0017f8f3b06fd40694335fbd6fa5463a24a965605f6e
MD5 | ba1fc4f8a7fc3d3c854a0d6ed247faac
BLAKE2b-256 | 90e88ecee9e8ffc6b34d32a7c2331c0eafa41d23738db8f33f1f799f751f391f