Client for the public Ginkgo AI API
Project description
Ginkgo's AI model API client
Work in progress: this repository was recently made public and we are still working on integrations.
A Python client for Ginkgo's AI model API, to run inference on public and Ginkgo-proprietary models. Learn more in the Model API announcement.
Prerequisites
- Register at https://models.ginkgobioworks.ai/ to get credits and an API key (of the form xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx).
- Store the API key in the GINKGOAI_API_KEY environment variable (see the quick check below).
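For example, a quick sanity check from Python (standard library only, not part of the client) that the key is visible before making requests:

import os

# The client reads the key from this environment variable by default.
if os.environ.get("GINKGOAI_API_KEY") is None:
    raise RuntimeError("Set the GINKGOAI_API_KEY environment variable before using the client.")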
Installation
Install the Python client with pip:
pip install ginkgo-ai-client
Usage
Note: This is an alpha version of the client and its interface may change in the future.
Example: masked inference with Ginkgo's AA0 model
The client requires an API key (it defaults to os.environ.get("GINKGOAI_API_KEY") if none is explicitly provided).
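If you prefer to pass the key explicitly rather than relying on the environment variable, a minimal sketch would look like the following (the api_key keyword name is an assumption here; check the reference docs for the exact constructor signature):

from ginkgo_ai_client import GinkgoAIClient

# NOTE: `api_key` is an assumed parameter name, shown for illustration only.
client = GinkgoAIClient(api_key="xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx")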
from ginkgo_ai_client import GinkgoAIClient, MaskedInferenceQuery
client = GinkgoAIClient()
model = "ginkgo-aa0-650M"
# SINGLE QUERY
query = MaskedInferenceQuery(sequence="MPK<mask><mask>RRL", model=model)
prediction = client.send_request(query)
# prediction.sequence == "MPKRRRRL"
# BATCH QUERY
sequences = ["MPK<mask><mask>RRL", "M<mask>RL", "MLLM<mask><mask>R"]
queries = [MaskedInferenceQuery(sequence=seq, model=model) for seq in sequences]
predictions = client.send_batch_request(queries)
# predictions[0].sequence == "MPKRRRRL"
Changing the model parameter to esm2-650M or esm2-3b in this example will perform masked inference with the ESM2 model.
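For instance, a minimal sketch of the single query above routed to ESM2 (the completed sequence will of course depend on the model):

from ginkgo_ai_client import GinkgoAIClient, MaskedInferenceQuery

client = GinkgoAIClient()
# Same masked-inference query as above, sent to Meta's ESM2 model instead of AA0.
query = MaskedInferenceQuery(sequence="MPK<mask><mask>RRL", model="esm2-650M")
prediction = client.send_request(query)
# prediction.sequence holds the unmasked sequence predicted by ESM2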
Example: embedding computation with Ginkgo's 3'UTR language model
from ginkgo_ai_client import GinkgoAIClient, MeanEmbeddingQuery
client = GinkgoAIClient()
model = "ginkgo-maskedlm-3utr-v1"
# SINGLE QUERY
query = MeanEmbeddingQuery(sequence="ATTGCG", model=model)
prediction = client.send_request(query)
# prediction.embedding == [1.05, -2.34, ...]
# BATCH QUERY
sequences = ["ATTGCG", "CAATGC", "GCGCACATGT"]
queries = [MeanEmbeddingQuery(sequence=seq, model=model) for seq in sequences]
predictions = client.send_batch_request(queries)
# predictions[0].embedding == [1.05, -2.34, ...]
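As a downstream illustration (using numpy, not part of the client), the returned embeddings are plain numeric vectors, so you can for example compare two sequences by cosine similarity:

import numpy as np

# `predictions` comes from client.send_batch_request(queries) above;
# each result exposes a fixed-length mean embedding vector.
embeddings = np.array([p.embedding for p in predictions])

# Cosine similarity between the embeddings of the first two sequences.
a, b = embeddings[0], embeddings[1]
similarity = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(similarity)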
Available models
See the example folder and reference docs for more details on usage and parameters.
| Model | Description | Reference | Supported queries | Versions |
|---|---|---|---|---|
| ESM2 | Large protein language model from Meta | GitHub | Embeddings, masked inference | 3B, 650M |
| AA0 | Ginkgo's proprietary protein language model | Announcement | Embeddings, masked inference | 650M |
| 3UTR | Ginkgo's proprietary 3'UTR language model | Preprint | Embeddings, masked inference | v1 |
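For convenience, the model identifier strings used in the queries in this README map onto the table roughly as follows (collected from the examples above; the reference docs are the authoritative list):

# Model name strings accepted by the `model` parameter, as used in this README:
MODEL_NAMES = {
    "ESM2": ["esm2-650M", "esm2-3b"],
    "AA0": ["ginkgo-aa0-650M"],
    "3UTR": ["ginkgo-maskedlm-3utr-v1"],
}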
License
This project is licensed under the MIT License. See the LICENSE file for details.
Releases
Make sure the changelog is up to date and its top section reads Unreleased, then increment the version with bumpversion and push with tags:
bumpversion patch|minor|major
git push && git push --tags
This should create a release on GitHub and publish to PyPI.
File details
Details for the file ginkgo_ai_client-0.3.1.tar.gz.
File metadata
- Download URL: ginkgo_ai_client-0.3.1.tar.gz
- Upload date:
- Size: 13.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d0dbaab4fcd49857d3b6370ff2b46360e0e111f1d86099af9c86362a77de7e80 |
| MD5 | 2c938e8dac964e79034a6ae6524ec560 |
| BLAKE2b-256 | e3020adcde4f28a59fcb668120506258c988686089f8baf2a88ef16b90daf075 |
Provenance
The following attestation bundles were made for ginkgo_ai_client-0.3.1.tar.gz:
Publisher: publish.yml on ginkgobioworks/ginkgo-ai-client
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ginkgo_ai_client-0.3.1.tar.gz
- Subject digest: d0dbaab4fcd49857d3b6370ff2b46360e0e111f1d86099af9c86362a77de7e80
- Sigstore transparency entry: 149595280
- Sigstore integration time:
File details
Details for the file ginkgo_ai_client-0.3.1-py3-none-any.whl.
File metadata
- Download URL: ginkgo_ai_client-0.3.1-py3-none-any.whl
- Upload date:
- Size: 12.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 964ece647f757242e469a6196661d56a49e9128facb45717e4ebe7de50339153 |
| MD5 | a046fcbabdec93bfe1a8cb376a02e74d |
| BLAKE2b-256 | 3a2a86beb3353ae19ee18183c1e02cecd4e3084d83be768f92721dfe4af055ea |
Provenance
The following attestation bundles were made for ginkgo_ai_client-0.3.1-py3-none-any.whl:
Publisher: publish.yml on ginkgobioworks/ginkgo-ai-client
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: ginkgo_ai_client-0.3.1-py3-none-any.whl
- Subject digest: 964ece647f757242e469a6196661d56a49e9128facb45717e4ebe7de50339153
- Sigstore transparency entry: 149595281
- Sigstore integration time: