GMICloud media provider adapters for genblaze (video, image, audio)
genblaze-gmicloud
GMICloud multi-provider video / image / audio adapters for genblaze — access Seedance, Kling, Veo, Sora, Wan, Seedream, FLUX, Gemini image, ElevenLabs, MiniMax and more through one API with SHA-256 provenance manifests.
genblaze-gmicloud wraps GMICloud's request-queue API, giving you one-call access to a large catalog of video, image, and audio models — including Kling, Veo, Sora, Wan, Seedream, FLUX-Kontext-Pro, Gemini-2.5-Flash-Image, ElevenLabs TTS, MiniMax TTS, and MiniMax Music — via three genblaze provider classes. Compose into multi-step AI pipelines, persist outputs to Backblaze B2 or any S3-compatible store, and emit a tamper-evident provenance manifest on every run.
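The tamper-evident manifest idea is easy to illustrate with the standard library: hash a canonical JSON serialization of the run metadata, so any later mutation of provider, model, prompt, params, or cost changes the digest. This is a minimal sketch of the concept, not genblaze's actual manifest format; the field names are illustrative.

```python
import hashlib
import json

def manifest_digest(record: dict) -> str:
    """SHA-256 over a canonical (sorted-key, compact) JSON encoding."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

record = {
    "provider": "gmicloud",
    "model": "kling-text2video-v1.6-pro",
    "prompt": "A drone shot over a misty valley",
    "params": {"duration": 10, "aspect_ratio": "16:9"},
    "cost_usd": 0.35,
}
digest = manifest_digest(record)

# Any mutation is detectable: the digest no longer matches.
tampered = dict(record, cost_usd=0.01)
assert manifest_digest(tampered) != digest
```

Sorting keys and fixing separators matters: two semantically identical records must serialize to identical bytes, or verification produces false mismatches.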
Why genblaze-gmicloud
- One API, dozens of models — text-to-video (Seedance, Kling, Veo, Sora, Wan), text-to-image (Seedream, FLUX, Gemini, Reve), audio (ElevenLabs, MiniMax TTS/Music).
- LLM access too — a standalone `chat()` wrapper for Llama, DeepSeek, and Qwen over GMICloud's OpenAI-compatible inference endpoint (see below).
- Provenance by default — a SHA-256-verified manifest recording provider, model, prompt, params, and cost.
- Cost tracking — `step.cost_usd` is populated from GMICloud's response.
- Production-ready — retries, timeouts, progress streaming, step caching.
- Durable storage — plug `genblaze-s3` in for Backblaze B2 / AWS S3 / R2 / MinIO persistence.
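Request-queue APIs like GMICloud's are typically polled until the job leaves the queue, which is where the retry and timeout behavior above comes in. The pattern can be sketched in plain Python; `poll` here is a stand-in callable, not genblaze's actual internals.

```python
import time

def poll_until_done(poll, timeout=600.0, base_delay=1.0, max_delay=30.0):
    """Call `poll()` until it returns a terminal status, with capped
    exponential backoff and an overall wall-clock timeout."""
    deadline = time.monotonic() + timeout
    delay = base_delay
    while True:
        status = poll()
        if status in ("succeeded", "failed"):
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError(f"job still {status!r} after {timeout}s")
        time.sleep(delay)
        delay = min(delay * 2, max_delay)  # back off, but cap the interval

# Simulated queue: pending twice, then succeeded.
responses = iter(["queued", "running", "succeeded"])
result = poll_until_done(lambda: next(responses), timeout=10, base_delay=0.01)
```

Capping the backoff keeps long-running video jobs responsive to completion without hammering the queue early on.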
Providers + models
| Provider class | Modality | Example models |
|---|---|---|
| `GMICloudVideoProvider` | video | `kling-text2video-v1.6-pro`, `kling-image2video-v2.1-master`, `veo3`, `wan2.6-t2v`, `seedance-1-0-pro-250528`, `sora-2-pro` |
| `GMICloudImageProvider` | image | `seedream-5.0-lite`, `gemini-2.5-flash-image`, `reve-edit-fast-20251030`, `flux-kontext-pro` |
| `GMICloudAudioProvider` | audio | `ElevenLabs-TTS-v3`, `MiniMax-TTS-Speech-2.6-Turbo`, `MiniMax-Music-2.5` |
Registered via entry points as `gmicloud`, `gmicloud-image`, and `gmicloud-audio`. Any model on GMICloud's queue is supported — pass the exact model slug.
Slug casing — GMICloud's request queue is case-sensitive. Model ids are the lowercase slugs shown above. Pre-0.3 PascalCase ids (e.g. `Seedream-5.0-Lite`, `Veo3`, `Wan-2.6-I2V`) still resolve via `ModelSpec.deprecated_aliases` but emit a `DeprecationWarning` and will be removed in 0.4 — migrate early.
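The alias behavior described above follows a standard Python pattern: map the old id to the new slug and warn on each hit. This sketch is illustrative, not the library's code; only the two example aliases are populated here.

```python
import warnings

# Pre-0.3 PascalCase id -> current lowercase slug (illustrative subset).
DEPRECATED_ALIASES = {
    "Seedream-5.0-Lite": "seedream-5.0-lite",
    "Veo3": "veo3",
}

def resolve_model(model_id: str) -> str:
    """Map deprecated ids to current slugs, warning on each hit."""
    if model_id in DEPRECATED_ALIASES:
        slug = DEPRECATED_ALIASES[model_id]
        warnings.warn(
            f"model id {model_id!r} is deprecated; use {slug!r}",
            DeprecationWarning,
            stacklevel=2,
        )
        return slug
    return model_id
```

Note that `DeprecationWarning` is hidden by default outside of tests; run with `python -W default` (or enable it in your test config) to surface the migration warnings before 0.4.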
Install
```shell
pip install genblaze-gmicloud
```
Quickstart — video (Kling)
```shell
pip install genblaze-core genblaze-gmicloud
export GMI_API_KEY="..."
```

```python
from genblaze_core import Modality, Pipeline
from genblaze_gmicloud import GMICloudVideoProvider

run, manifest = (
    Pipeline("gmicloud-video-demo")
    .step(GMICloudVideoProvider(), model="kling-text2video-v1.6-pro",
          prompt="A drone shot flying over a misty mountain valley at sunrise, cinematic",
          modality=Modality.VIDEO, duration=10, aspect_ratio="16:9")
    .run(timeout=600)
)
print(run.steps[0].assets[0].url, f"${run.steps[0].cost_usd:.3f}")
```
Quickstart — image (Seedream)
```python
from genblaze_core import Modality, Pipeline
from genblaze_gmicloud import GMICloudImageProvider

run, manifest = (
    Pipeline("gmicloud-image-demo")
    .step(GMICloudImageProvider(), model="seedream-5.0-lite",
          prompt="A photorealistic macro shot of morning dew on a spider web, soft bokeh",
          modality=Modality.IMAGE, aspect_ratio="16:9")
    .run(timeout=120)
)
```
Quickstart — audio (ElevenLabs via GMICloud)
```python
from genblaze_core import Modality, Pipeline
from genblaze_gmicloud import GMICloudAudioProvider

run, manifest = (
    Pipeline("gmicloud-audio-demo")
    .step(GMICloudAudioProvider(), model="ElevenLabs-TTS-v3",
          prompt="Welcome to Genblaze — the fastest way to build generative AI pipelines.",
          modality=Modality.AUDIO)
    .run(timeout=120)
)
```
Persist to Backblaze B2
```python
from genblaze_core import KeyStrategy, ObjectStorageSink
from genblaze_s3 import S3StorageBackend

storage = ObjectStorageSink(
    S3StorageBackend.for_backblaze("my-bucket"),
    key_strategy=KeyStrategy.HIERARCHICAL,
)
# pass sink=storage to .run(...)
```
Backblaze B2 is the recommended default sink — cost-efficient, S3-compatible, Object Lock for immutable manifests.
LLM access — standalone chat()
For callers driving a media pipeline from an LLM — caption expansion, prompt rewriting, scene description — genblaze-gmicloud ships a chat() callable over GMICloud's OpenAI-compatible inference endpoint. It sits outside the Pipeline / Step machinery (text generation doesn't benefit from the polling / manifest / asset machinery built for media).
```python
from genblaze_gmicloud import chat

resp = chat("deepseek-ai/DeepSeek-V3", prompt="A cinematic sunset over Tokyo")
print(resp.text, resp.tokens_out)
```
Any GMICloud-hosted chat model is accepted — model ids pass through to the inference endpoint verbatim, so you can use models the connector hasn't been updated for. `cost_usd` is always `None` for this connector; compute cost from `tokens_in` / `tokens_out` yourself if needed.
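Deriving cost from token counts is a one-liner once you have per-model rates. The per-million-token rates below are placeholders, not GMICloud's actual pricing; look up the real figures for the models you use.

```python
# Hypothetical $/1M-token rates; substitute real pricing for your models.
RATES = {
    "deepseek-ai/DeepSeek-V3": {"in": 0.27, "out": 1.10},
}

def chat_cost_usd(model: str, tokens_in: int, tokens_out: int) -> float:
    """Cost in USD from prompt/completion token counts and a rate table."""
    r = RATES[model]
    return (tokens_in * r["in"] + tokens_out * r["out"]) / 1_000_000

cost = chat_cost_usd("deepseek-ai/DeepSeek-V3", tokens_in=1_000, tokens_out=500)
```

In practice you would feed `resp.tokens_in` and `resp.tokens_out` from the `ChatResponse` straight into a helper like this.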
Full signature and ChatResponse shape: docs/features/llm-calls.md.
Credentials
Only API-key auth is supported. Set GMI_API_KEY (obtain from https://console.gmicloud.ai/) or pass api_key= to any provider ctor or to chat().
Configuring the endpoint (staging, proxies, VPC)
All three provider classes and chat() accept a base_url= ctor kwarg (or GMI_BASE_URL env var) to override the default endpoint, and an http_client= kwarg for injecting a pre-built httpx.Client — useful for shared connection pools across multi-modality pipelines or for mocking in tests.
```python
import httpx

from genblaze_gmicloud import GMICloudVideoProvider, GMICloudImageProvider

key = "..."  # your GMI API key
shared = httpx.Client(
    base_url="https://my-vpc-proxy.example/gmi",
    headers={"Authorization": f"Bearer {key}"},
    timeout=120,
)
video = GMICloudVideoProvider(http_client=shared)
image = GMICloudImageProvider(http_client=shared)
# Caller owns `shared` — providers never close externally-supplied clients.
```
Naming reference
GMICloud surfaces five related names; they look interchangeable but come from different namespaces:
| Surface | Value |
|---|---|
| PyPI package | genblaze-gmicloud |
| Python import | import genblaze_gmicloud |
| Provider class prefix | GMICloud* (e.g. GMICloudVideoProvider) |
| Entry-point slug | gmicloud, gmicloud-image, gmicloud-audio |
| Env vars | GMI_API_KEY, GMI_BASE_URL |
The GMI_ env prefix is short on purpose; the class / import / PyPI names use the full gmicloud for precision and to leave room for future genblaze-gmi* packages if needed.
Reading outputs safely
step.assets[0] is only valid when the step succeeded. Always check step.status first — especially in fan-out runs where one step may fail and others succeed:
```python
for step in run.steps:
    if step.status == "succeeded" and step.assets:
        print(step.assets[0].url)
    elif step.status == "failed":
        print(f"failed ({step.error_code}): {step.error}")
```
Documentation
- Main repo: https://github.com/backblaze-labs/genblaze
- Examples: `gmicloud_video_pipeline.py` · `gmicloud_image_pipeline.py` · `gmicloud_audio_pipeline.py`
Related packages
- `genblaze-core` — the pipeline SDK
- `genblaze-s3` — durable storage on Backblaze B2 and other S3-compatible backends
License
MIT