
Client for the public Ginkgo AI API

Project description

Ginkgo's AI model API client

Work in progress: this repo was just made public and we are still working on integration.

A Python client for Ginkgo's AI model API, to run inference on public and Ginkgo-proprietary models. Learn more in the Model API announcement.

Prerequisites

Register at https://models.ginkgobioworks.ai/ to get credits and an API key (of the form xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx). Store the API key in the GINKGOAI_API_KEY environment variable.
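
A minimal sketch of setting the key from within Python (the key value below is a placeholder; in practice you would export GINKGOAI_API_KEY in your shell profile or CI environment):

import os

# Use the real key issued at https://models.ginkgobioworks.ai/ here;
# setdefault leaves an already-exported variable untouched.
os.environ.setdefault("GINKGOAI_API_KEY", "xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx")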

Installation

Install the Python client with pip:

pip install ginkgo-ai-client

Usage

Note: This is an alpha version of the client and its interface may change in the future.

Example: masked inference with Ginkgo's AA0 model

The client requires an API key (and defaults to os.environ.get("GINKGOAI_API_KEY") if none is explicitly provided).

from ginkgo_ai_client import GinkgoAIClient, MaskedInferenceQuery

client = GinkgoAIClient()
model = "ginkgo-aa0-650M"

# SINGLE QUERY

query = MaskedInferenceQuery(sequence="MPK<mask><mask>RRL", model=model)
prediction = client.send_request(query)
# prediction.sequence == "MPKRRRRL"

# BATCH QUERY

sequences = ["MPK<mask><mask>RRL", "M<mask>RL", "MLLM<mask><mask>R"]
queries = [MaskedInferenceQuery(sequence=seq, model=model) for seq in sequences]
predictions = client.send_batch_request(queries)
# predictions[0].sequence == "MPKRRRRL"
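
Results come back in the same order as the queries (as the comment above suggests), so inputs and outputs can be paired directly; a small usage sketch:

for seq, pred in zip(sequences, predictions):
    print(f"{seq} -> {pred.sequence}")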

Changing the model parameter to esm2-650M or esm2-3b in this example will perform masked inference with the ESM2 model.
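
For instance, reusing the single-query example above with one of those model names:

query = MaskedInferenceQuery(sequence="MPK<mask><mask>RRL", model="esm2-650M")
prediction = client.send_request(query)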

Example: embedding computation with Ginkgo's 3'UTR language model

from ginkgo_ai_client import GinkgoAIClient, MeanEmbeddingQuery

client = GinkgoAIClient()
model = "ginkgo-maskedlm-3utr-v1"

# SINGLE QUERY

query = MeanEmbeddingQuery(sequence="ATTGCG", model=model)
prediction = client.send_request(query)
# prediction.embedding == [1.05, -2.34, ...]

# BATCH QUERY

sequences = ["ATTGCG", "CAATGC", "GCGCACATGT"]
queries = [MeanEmbeddingQuery(sequence=seq, model=model) for seq in sequences]
predictions = client.send_batch_request(queries)
# predictions[0].embedding == [1.05, -2.34, ...]
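
The returned embedding is a plain list of floats, so downstream analysis only needs standard numeric tooling. A small sketch comparing two of the batch results with cosine similarity (the use of numpy here is an assumption, not part of the client):

import numpy as np

a = np.array(predictions[0].embedding)
b = np.array(predictions[1].embedding)
# Cosine similarity between the mean embeddings of the first two sequences
cosine_similarity = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))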

Available models

See the example folder and reference docs for more details on usage and parameters.

Model   Description                                   Reference      Supported queries               Versions
ESM2    Large protein language model from Meta        GitHub         Embeddings, masked inference    3B, 650M
AA0     Ginkgo's proprietary protein language model   Announcement   Embeddings, masked inference    650M
3UTR    Ginkgo's proprietary 3'UTR language model     Preprint       Embeddings, masked inference    v1

License

This project is licensed under the MIT License. See the LICENSE file for details.

Releases

Make sure the changelog is up to date and that its top section reads "Unreleased", then increment the version with bumpversion and push with tags:

bumpversion patch|minor|major
git push && git push --tags

This should create a release on GitHub and publish the package to PyPI.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ginkgo_ai_client-0.3.2.tar.gz (13.5 kB)

Uploaded Source

Built Distribution

ginkgo_ai_client-0.3.2-py3-none-any.whl (12.1 kB)

Uploaded Python 3

File details

Details for the file ginkgo_ai_client-0.3.2.tar.gz.

File metadata

  • Download URL: ginkgo_ai_client-0.3.2.tar.gz
  • Upload date:
  • Size: 13.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for ginkgo_ai_client-0.3.2.tar.gz
Algorithm     Hash digest
SHA256        12d834155ad1ba74f232e730e0b91348c1eefff40909d45bdeb47cac14eefb60
MD5           76c5aa035e278f48a717d8e7aaf01675
BLAKE2b-256   4bb8c0d39f1da6bfe4e97b8983ed698b491630f412559899e84cd4edb7f29f03

See more details on using hashes here.

Provenance

The following attestation bundles were made for ginkgo_ai_client-0.3.2.tar.gz:

Publisher: publish.yml on ginkgobioworks/ginkgo-ai-client

Attestations:

File details

Details for the file ginkgo_ai_client-0.3.2-py3-none-any.whl.

File metadata

File hashes

Hashes for ginkgo_ai_client-0.3.2-py3-none-any.whl
Algorithm     Hash digest
SHA256        54ec465f71ff0a3b43100077574dbac3b7d996dea15775fc62e829318092475a
MD5           da30e63885392c942ad987db5bfffcad
BLAKE2b-256   db68fd96e7997e572ee85e4c50e1a000fe205b4facdc20301c6a631643a05049

See more details on using hashes here.

Provenance

The following attestation bundles were made for ginkgo_ai_client-0.3.2-py3-none-any.whl:

Publisher: publish.yml on ginkgobioworks/ginkgo-ai-client

Attestations:
