Project description
AntiBERTy
Antibody-specific transformer language model pre-trained on 558M natural antibody sequences.
Usage
from antiberty import AntiBERTy, get_weights
antiberty = AntiBERTy.from_pretrained(get_weights())
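The loaded model consumes antibody sequences as per-residue token IDs, in the usual BERT style. As background, here is a minimal, self-contained sketch of that kind of preprocessing; the vocabulary layout and special tokens below are illustrative assumptions for demonstration only, not the package's actual tokenizer (AntiBERTy ships its own vocabulary with its weights):

```python
# Illustrative per-residue tokenization for a BERT-style protein
# language model. The vocabulary layout here is an assumption made
# for demonstration; AntiBERTy provides its own tokenizer.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

# Reserve the low IDs for special tokens; residues start at 4
# (an assumed layout, not AntiBERTy's actual one).
VOCAB = {"[PAD]": 0, "[CLS]": 1, "[SEP]": 2, "[MASK]": 3}
VOCAB.update({aa: i + 4 for i, aa in enumerate(AMINO_ACIDS)})

def encode(sequence: str) -> list[int]:
    """Map an amino-acid sequence to token IDs, adding [CLS]/[SEP]."""
    ids = [VOCAB["[CLS]"]]
    for residue in sequence.upper():
        if residue not in VOCAB:
            raise ValueError(f"Unknown residue: {residue!r}")
        ids.append(VOCAB[residue])
    ids.append(VOCAB["[SEP]"])
    return ids

# A short heavy-chain fragment encoded to IDs:
tokens = encode("EVQLVESGGG")
```

In practice the IDs produced by the package's own tokenizer would be batched into tensors and passed to the model returned by `AntiBERTy.from_pretrained(get_weights())`.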
Citing this work
@article{ruffolo2021deciphering,
title = {Deciphering antibody affinity maturation with language models and weakly supervised learning},
author = {Ruffolo, Jeffrey A and Gray, Jeffrey J and Sulam, Jeremias},
journal = {arXiv preprint arXiv:2112.07782},
year = {2021}
}
Download files
Download the file for your platform.
Source Distribution
antiberty-0.0.6.tar.gz (96.6 MB)
Built Distribution
antiberty-0.0.6-py3-none-any.whl (96.6 MB)
Hashes for antiberty-0.0.6-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 4bd0408d45e886272c96eb029e2f2faa0e7bea7af2c13e087c72e370161bab5a
MD5 | f79506847ba40d35c3b049b71a02d5dc
BLAKE2b-256 | 39d672d08493355aba346a132def7fe02030ff7a9339a9117b6ddfac03bf1988