
Client for the public Ginkgo AI API

Project description

Ginkgo's AI model API client

Work in progress: this repository was recently made public, and we are still working on integration.

A Python client for Ginkgo's AI model API, used to run inference on public and Ginkgo-proprietary models. Learn more in the Model API announcement.

Prerequisites

Register at https://models.ginkgobioworks.ai/ to get credits and an API key (of the form xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx). Store the API key in the GINKGOAI_API_KEY environment variable.
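You can check from Python that the variable is visible before instantiating the client (a minimal sketch; the client itself performs this same lookup by default):

```python
import os

# The client reads this variable when no key is passed explicitly.
api_key = os.environ.get("GINKGOAI_API_KEY")
if api_key is None:
    print("GINKGOAI_API_KEY is not set; requests will fail to authenticate.")
```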

Installation

Install the Python client with pip:

pip install ginkgo-ai-client

Usage

Note: This is an alpha version of the client, and its interface may change in the future.

Example: masked inference with Ginkgo's AA0 model

The client requires an API key (it defaults to os.environ.get("GINKGOAI_API_KEY") if none is explicitly provided).

from ginkgo_ai_client import GinkgoAIClient, MaskedInferenceQuery

client = GinkgoAIClient()
model = "ginkgo-aa0-650M"

query = MaskedInferenceQuery(sequence="MPK<mask><mask>RRL", model=model)
prediction = client.send_request(query)
# prediction.sequence == "MPKRRRRL"
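Mask tokens can also be inserted programmatically. The helper below is a hypothetical convenience, not part of the client; it replaces the residues at the given 0-based positions with the <mask> token used above:

```python
def mask_positions(sequence: str, positions, mask_token: str = "<mask>") -> str:
    """Replace the residues at the given 0-based positions with mask tokens."""
    targets = set(positions)
    return "".join(mask_token if i in targets else aa
                   for i, aa in enumerate(sequence))

masked = mask_positions("MPKHHRRL", [3, 4])
# masked == "MPK<mask><mask>RRL"
```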

It is also possible to send multiple queries at once; this is recommended in most cases, since the queries are processed in parallel with appropriate scaling on our servers. The send_batch_request method returns a list of results in the same order as the queries:

sequences = ["MPK<mask><mask>RRL", "M<mask>RL", "MLLM<mask><mask>R"]
queries = [MaskedInferenceQuery(sequence=seq, model=model) for seq in sequences]
predictions = client.send_batch_request(queries)
# predictions[0].sequence == "MPKRRRRL"

For large datasets (say, 100,000 queries), you can also send the requests in multiple batches and iterate over the results as they become ready. Note that the results are not guaranteed to be returned in the same order as the queries, so make sure each query has a query_name attribute, which will be used to identify its result.

from ginkgo_ai_client import MeanEmbeddingQuery
queries = MeanEmbeddingQuery.iter_from_fasta("sequences.fasta", model=model)
for batch_results in client.send_requests_by_batches(queries, batch_size=1000):
    for result in batch_results:
        print(result.query_name, result.embedding)
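Since results can arrive out of order, a common pattern is to index them by query_name and then restore the input order. The Result stand-in below is illustrative only; the real result objects come from the client:

```python
from dataclasses import dataclass

@dataclass
class Result:
    # Stand-in for the client's result objects, which carry these attributes.
    query_name: str
    embedding: list

# Results as they might arrive: in completion order, not input order.
arrived = [Result("seq_2", [0.1]), Result("seq_0", [0.3]), Result("seq_1", [0.2])]

# Index by name, then read back in the original query order.
by_name = {r.query_name: r for r in arrived}
ordered = [by_name[name] for name in ["seq_0", "seq_1", "seq_2"]]
# [r.query_name for r in ordered] == ["seq_0", "seq_1", "seq_2"]
```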

Changing the model parameter to esm2-650M or esm2-3b in this example will perform masked inference with the ESM2 model.

Example: embedding computation with Ginkgo's 3'UTR language model

from ginkgo_ai_client import GinkgoAIClient, MeanEmbeddingQuery

client = GinkgoAIClient()
model = "ginkgo-maskedlm-3utr-v1"

# SINGLE QUERY

query = MeanEmbeddingQuery(sequence="ATTGCG", model=model)
prediction = client.send_request(query)
# prediction.embedding == [1.05, -2.34, ...]

# BATCH QUERY

sequences = ["ATTGCG", "CAATGC", "GCGCACATGT"]
queries = [MeanEmbeddingQuery(sequence=seq, model=model) for seq in sequences]
predictions = client.send_batch_request(queries)
# predictions[0].embedding == [1.05, -2.34, ...]
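Mean embeddings are returned as plain lists of floats, so they can be compared directly; for instance, the cosine similarity between two embeddings can be computed with the standard library alone (a sketch, independent of the client):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

cosine_similarity([1.0, 0.0], [1.0, 0.0])  # 1.0 (identical directions)
cosine_similarity([1.0, 0.0], [0.0, 1.0])  # 0.0 (orthogonal)
```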

Available models

See the example folder and reference docs for more details on usage and parameters.

Model         Description                              Reference      Supported queries                  Versions
ESM2          Large protein language model from Meta   GitHub         Embeddings, masked inference       3B, 650M
AA0           Ginkgo's protein language model          Announcement   Embeddings, masked inference       650M
3UTR          Ginkgo's 3'UTR language model            Preprint       Embeddings, masked inference       v1
Promoter-0    Ginkgo's promoter activity model         Coming soon    Promoter activity across tissues   v1
ABdiffusion   Antibody diffusion model                 Coming soon    Unmasking                          v1
LCDNA         Long-context DNA diffusion model         Coming soon    Unmasking                          v1

License

This project is licensed under the MIT License. See the LICENSE file for details.

Releases

To release a new version to PyPI:

  • Make sure the changelog is up to date and the top section reads Unreleased.
  • Increment the version with the bumpversion workflow in Actions; it will update the version everywhere in the repo and create a tag.
  • If all looks good, create a release for the tag; it will automatically be published to PyPI.

Download files


Source Distribution

ginkgo_ai_client-0.8.0.tar.gz (18.6 kB)


Built Distribution


ginkgo_ai_client-0.8.0-py3-none-any.whl (15.8 kB)


File details

Details for the file ginkgo_ai_client-0.8.0.tar.gz.

File metadata

  • Download URL: ginkgo_ai_client-0.8.0.tar.gz
  • Upload date:
  • Size: 18.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.0.1 CPython/3.12.8

File hashes

Hashes for ginkgo_ai_client-0.8.0.tar.gz
Algorithm Hash digest
SHA256 3a132fc52184a3d2d63d2376da79b6f29b5810fe88f61195d62a88b137f314e4
MD5 496cd5baeb04939d9be73881597d3e8d
BLAKE2b-256 e43af1b522461a35c8d7b3d5c6eccb414fa2b2d6fd755df0cd49b06d94d9ab5b


Provenance

The following attestation bundles were made for ginkgo_ai_client-0.8.0.tar.gz:

Publisher: publish.yml on ginkgobioworks/ginkgo-ai-client


File details

Details for the file ginkgo_ai_client-0.8.0-py3-none-any.whl.

File metadata

File hashes

Hashes for ginkgo_ai_client-0.8.0-py3-none-any.whl
Algorithm Hash digest
SHA256 0699e631e5fe128ee289f37628805b97bc9fb24e929e16e2a14eefb0f476b3f5
MD5 7dbc66e252aa29d189010d2c06f19f21
BLAKE2b-256 f71e83a629ea460f75b83df94be55032b81d19c9441a764be9ac90f771d3fb2e


Provenance

The following attestation bundles were made for ginkgo_ai_client-0.8.0-py3-none-any.whl:

Publisher: publish.yml on ginkgobioworks/ginkgo-ai-client

