Celeste AI
The primitive layer for multi-modal AI
All modalities. All providers. One interface.
Primitives, not frameworks.
🚀 This is the v1 Beta release. We're validating the new architecture before the stable v1.0 release. Feedback welcome!
Type-safe, modality/provider-agnostic primitives.
- Unified Interface: One API for OpenAI, Anthropic, Gemini, Mistral, and 14+ others.
- True Multi-Modal: Text, Image, Audio, Video, Embeddings, Search — all first-class citizens.
- Type-Safe by Design: Full Pydantic validation and IDE autocomplete.
- Zero Lock-In: Switch providers instantly by changing a single config string.
- Primitives, Not Frameworks: No agents, no chains, no magic. Just clean I/O.
- Lightweight Architecture: No vendor SDKs. Pure, fast HTTP.
🚀 Quick Start
```python
import celeste

# "We need a catchy slogan for our new eco-friendly sneaker."
slogan = await celeste.text.generate(
    "Write a slogan for an eco-friendly sneaker.",
    model="gpt-5",
)
print(slogan.content)
```
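These snippets use top-level `await`, which works in notebooks and async REPLs; in a plain script, wrap the call in a coroutine and run it with `asyncio.run`. A minimal sketch (the returned string is a stand-in for the model's response, not real output):

```python
import asyncio

async def main() -> str:
    # Stand-in for `await celeste.text.generate(...)`; a real script
    # would await the API call here instead of returning a canned slogan.
    return "Step light. Tread green."

print(asyncio.run(main()))
```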
🎨 Multimodal example
```python
import celeste
from pydantic import BaseModel

class ProductCampaign(BaseModel):
    visual_prompt: str
    audio_script: str

# 2. Extract campaign assets (Anthropic)
campaign_output = await celeste.text.generate(
    f"Create campaign assets for slogan: {slogan.content}",
    model="claude-opus-4-1",
    output_schema=ProductCampaign,
)
campaign = campaign_output.content

# 3. Generate ad visual (Flux)
image_output = await celeste.images.generate(
    campaign.visual_prompt,
    model="flux-2-flex",
    aspect_ratio="1:1",
)
image = image_output.content

# 4. Generate radio spot (ElevenLabs)
speech_output = await celeste.audio.speak(
    campaign.audio_script,
    model="eleven_v3",
    voice="adam",
)
speech = speech_output.content
```
No special cases. No separate libraries. One consistent interface.
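The image and audio outputs above are binary. Assuming `.content` for those modalities is raw bytes (an assumption for illustration, not a documented guarantee), persisting them is a small pathlib helper:

```python
from pathlib import Path

def save_asset(content: bytes, path: str) -> Path:
    """Write binary model output (e.g. image or audio bytes) to disk."""
    target = Path(path)
    target.write_bytes(content)
    return target

# e.g. save_asset(image, "campaign.png"); save_asset(speech, "spot.mp3")
```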
🔄 Switch providers in one line
```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

# Model IDs
anthropic_model_id = "claude-4-5-sonnet"
google_model_id = "gemini-2.5-flash"

# ❌ Anthropic way
from anthropic import Anthropic
import json

client = Anthropic()
response = client.messages.create(
    model=anthropic_model_id,
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Extract user info: John is 30"}
    ],
    output_format={
        "type": "json_schema",
        "schema": User.model_json_schema(),
    },
)
user_data = json.loads(response.content[0].text)

# ❌ Google Gemini way
from google import genai
from google.genai import types

client = genai.Client()
response = await client.aio.models.generate_content(
    model=google_model_id,
    contents="Extract user info: John is 30",
    config=types.GenerateContentConfig(
        response_mime_type="application/json",
        response_schema=User,
    ),
)
user = response.parsed

# ✅ Celeste way
import celeste

response = await celeste.text.generate(
    "Extract user info: John is 30",
    model=google_model_id,  # <-- choose any model from any provider
    output_schema=User,     # <-- unified parameter across all providers
)
user = response.content  # already parsed as a User instance
```
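To see what the unified `output_schema` saves you, here is the manual parse-and-check step the raw-JSON routes leave to the caller; a stdlib-only sketch with the `User` fields hard-coded for illustration:

```python
import json

def parse_user(raw: str) -> dict:
    # The hand-rolled validation that `output_schema=User` replaces:
    # parse the JSON, then check each field's presence and type yourself.
    data = json.loads(raw)
    if not isinstance(data.get("name"), str) or not isinstance(data.get("age"), int):
        raise ValueError("response does not match the User schema")
    return data
```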
🧭 Namespace API (recommended)
Namespaces are domain-first: start from the resource you want to work with (e.g., videos) even if the input is text. Under the hood, Celeste maps (domain, operation) to the output modality (e.g., celeste.images.analyze(...) routes to the text modality because analysis returns text).
```python
import celeste

# Async (default)
result = await celeste.images.analyze(
    image=img,
    prompt="Describe this image",
    model="gpt-4o",
)

# Sync
result = celeste.images.sync.analyze(
    image=img,
    prompt="Describe this image",
    model="gpt-4o",
)

# Async streaming
async for chunk in celeste.text.stream.generate("Hello", model="gpt-4o"):
    print(chunk.content, end="")

# Sync streaming
for chunk in celeste.text.sync.stream.generate("Hello", model="gpt-4o"):
    print(chunk.content, end="")
```
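The domain-to-modality routing described above can be sketched as a lookup table; the table and function below are illustrative, not Celeste internals:

```python
# Hypothetical (domain, operation) -> output-modality table.
ROUTES = {
    ("text", "generate"): "text",
    ("images", "generate"): "image",
    ("images", "analyze"): "text",   # analysis returns text
    ("audio", "speak"): "audio",
    ("audio", "analyze"): "text",
}

def resolve_modality(domain: str, operation: str) -> str:
    """Map a domain-first call to the modality that serves it."""
    try:
        return ROUTES[(domain, operation)]
    except KeyError:
        raise ValueError(f"no route for {domain}.{operation}") from None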
⚙️ Advanced: create_client
For explicit configuration or client reuse, use create_client with modality + operation. This is modality-first: you choose the output type and operation explicitly.
```python
from celeste import create_client, Modality, Operation

client = create_client(
    modality=Modality.TEXT,
    operation=Operation.GENERATE,
    model=google_model_id,
)
response = await client.generate("Extract user info: John is 30", output_schema=User)
```
`capability` is still supported but deprecated. Prefer `modality` + `operation`.
🪶 Install
```bash
pip install celeste-ai
# or
uv add celeste-ai
```
🔁 Behavior changes since v0.3.9
- Capabilities → modalities + operations.
- Namespace API is now the default entry point.
- `create_client` now uses `modality` + `operation`; `capability` is deprecated.
- `analyze` for image/audio/video routes through the text modality.
- Namespaces are domain-first (the resource you work with); `create_client` is modality-first (the output type). Domain + operation maps to modality.
- `extra_body` allows provider-specific parameters without first-class mapping.
- Single-package install (no extras).
🔧 Type-Safe by Design
```python
# Full IDE autocomplete
import celeste

response = await celeste.text.generate(
    "Explain AI",
    model="gpt-4o-mini",
    temperature=0.7,  # ✅ Validated (0.0-2.0)
    max_tokens=100,   # ✅ Validated (int)
)

# Typed response
print(response.content)             # str (IDE knows the type)
print(response.usage.input_tokens)  # int
print(response.metadata["model"])   # str
```
Catch errors before production.
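The range checks above are the kind a typed client can perform locally, before any request is sent. A stdlib sketch of one such validator (the bounds mirror the `temperature` comment above; the function name is illustrative):

```python
def validate_temperature(value: float) -> float:
    # Reject out-of-range sampling temperatures before hitting the API.
    if not 0.0 <= value <= 2.0:
        raise ValueError(f"temperature must be in [0.0, 2.0], got {value}")
    return value
```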
🤝 Contributing
We welcome contributions! See CONTRIBUTING.md.
- Request a provider: GitHub Issues
- Report bugs: GitHub Issues
📄 License
MIT license – see LICENSE for details.
Get Started • Documentation • GitHub
Made with ❤️ by developers tired of framework lock-in
Download files
Source Distribution
Built Distribution
File details
Details for the file celeste_ai-0.9.0.tar.gz.
File metadata
- Download URL: celeste_ai-0.9.0.tar.gz
- Upload date:
- Size: 186.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `ed6506bd4d835831e59432807556654b1509fd66effe5b9eb91e3ab19eabe3af` |
| MD5 | `20ab1c78df7ae1c5f39e12071bd15d52` |
| BLAKE2b-256 | `715ad37117b2630aef1f6feaaf441c48f6aba439a368f2c378ad284fca874f83` |
Provenance
The following attestation bundles were made for celeste_ai-0.9.0.tar.gz:
Publisher: publish.yml on withceleste/celeste-python

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: celeste_ai-0.9.0.tar.gz
- Subject digest: ed6506bd4d835831e59432807556654b1509fd66effe5b9eb91e3ab19eabe3af
- Sigstore transparency entry: 829218880
- Permalink: withceleste/celeste-python@b84927627a2593814654f6021d17d8daad97212d
- Branch / Tag: refs/tags/v0.9.0
- Owner: https://github.com/withceleste
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@b84927627a2593814654f6021d17d8daad97212d
- Trigger Event: push
File details
Details for the file celeste_ai-0.9.0-py3-none-any.whl.
File metadata
- Download URL: celeste_ai-0.9.0-py3-none-any.whl
- Upload date:
- Size: 247.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `f3219fd8ca35cee2575a4500302396017fd70d4d6b2f2fd162a9d8d7e463459a` |
| MD5 | `a65e57047d8d17f5abf6f4508efdba22` |
| BLAKE2b-256 | `e6691869e2a9c50162d4df2d9cfc257ad557e7daa6496cb4518f7a8a86686cd4` |
Provenance
The following attestation bundles were made for celeste_ai-0.9.0-py3-none-any.whl:
Publisher: publish.yml on withceleste/celeste-python

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: celeste_ai-0.9.0-py3-none-any.whl
- Subject digest: f3219fd8ca35cee2575a4500302396017fd70d4d6b2f2fd162a9d8d7e463459a
- Sigstore transparency entry: 829218886
- Permalink: withceleste/celeste-python@b84927627a2593814654f6021d17d8daad97212d
- Branch / Tag: refs/tags/v0.9.0
- Owner: https://github.com/withceleste
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@b84927627a2593814654f6021d17d8daad97212d
- Trigger Event: push