# Celeste AI

**The primitive layer for multi-modal AI.** All capabilities. All providers. One interface. Primitives, not frameworks.

Open-source, type-safe, provider-agnostic primitives for multi-modal AI.
- Unified Interface: One API for OpenAI, Anthropic, Gemini, Mistral, and 14+ others.
- True Multi-Modal: Text, Image, Audio, Video, Embeddings, Search: all first-class citizens.
- Type-Safe by Design: Full Pydantic validation and IDE autocomplete.
- Zero Lock-In: Switch providers instantly by changing a single config string.
- Primitives, Not Frameworks: No agents, no chains, no magic. Just clean I/O.
- Lightweight Architecture: No vendor SDKs. Pure, fast HTTP.
## 🚀 Quick Start

```python
from celeste import create_client

# Goal: a catchy slogan for a new eco-friendly sneaker.
client = create_client(
    capability="text-generation",
    model="gpt-5",
)

slogan = await client.generate("Write a slogan for an eco-friendly sneaker.")
print(slogan.content)
```
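Since `generate` is a coroutine, a standalone script needs an event loop to drive it. A minimal sketch using `asyncio.run` with a stand-in client (the `EchoClient` below is a placeholder invented for illustration, not part of Celeste):

```python
import asyncio
from dataclasses import dataclass


@dataclass
class Response:
    content: str


class EchoClient:
    """Stand-in for a Celeste client: same awaitable generate() shape."""

    async def generate(self, prompt: str) -> Response:
        return Response(content=f"slogan for: {prompt}")


async def main() -> str:
    client = EchoClient()  # in real code: create_client(...)
    result = await client.generate("eco-friendly sneaker")
    return result.content


print(asyncio.run(main()))  # slogan for: eco-friendly sneaker
```

In a Jupyter notebook, which already runs an event loop, the bare `await client.generate(...)` shown above works as-is.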
## 🎨 Multimodal example

```python
from pydantic import BaseModel

from celeste import Capability, create_client


class ProductCampaign(BaseModel):
    visual_prompt: str
    audio_script: str


# 1. Generate the slogan (see Quick Start above).

# 2. Extract campaign assets (Anthropic)
extract_client = create_client(Capability.TEXT_GENERATION, model="claude-opus-4-1")
campaign_output = await extract_client.generate(
    f"Create campaign assets for slogan: {slogan.content}",
    output_schema=ProductCampaign,
)
campaign = campaign_output.content

# 3. Generate the ad visual (Flux)
image_client = create_client(Capability.IMAGE_GENERATION, model="flux-2-flex")
image_output = await image_client.generate(
    campaign.visual_prompt,
    aspect_ratio="1:1",
)
image = image_output.content

# 4. Generate the radio spot (ElevenLabs)
speech_client = create_client(Capability.SPEECH_GENERATION, model="eleven_v3")
speech_output = await speech_client.generate(
    campaign.audio_script,
    voice="adam",
)
speech = speech_output.content
```

No special cases. No separate libraries. One consistent interface.
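The "one consistent interface" claim boils down to every capability client exposing the same awaitable `generate` returning an object with a `.content` attribute. A hedged sketch of that contract using `typing.Protocol` (the names here are illustrative, not Celeste's actual types):

```python
import asyncio
from typing import Any, Protocol


class Output:
    def __init__(self, content: Any) -> None:
        self.content = content


class GenerationClient(Protocol):
    """The shape every capability client shares: one awaitable generate()."""

    async def generate(self, prompt: str, **kwargs: Any) -> Output: ...


class FakeTextClient:
    async def generate(self, prompt: str, **kwargs: Any) -> Output:
        return Output(content=prompt.upper())


class FakeImageClient:
    async def generate(self, prompt: str, **kwargs: Any) -> Output:
        return Output(content=b"\x89PNG...")  # bytes payload instead of str


async def run(client: GenerationClient, prompt: str) -> Any:
    # Works for any capability because the call shape never changes.
    output = await client.generate(prompt)
    return output.content


print(asyncio.run(run(FakeTextClient(), "hello")))  # HELLO
```

Only the type of `.content` varies by capability (text, image bytes, audio bytes), which is why the pipeline above chains steps without any adapter code.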
## 🔄 Switch providers in one line

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


# Model IDs
anthropic_model_id = "claude-4-5-sonnet"
google_model_id = "gemini-2.5-flash"

# ❌ The Anthropic way
import json

from anthropic import Anthropic

client = Anthropic()
response = client.messages.create(
    model=anthropic_model_id,
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Extract user info: John is 30"}
    ],
    output_format={
        "type": "json_schema",
        "schema": User.model_json_schema(),
    },
)
user_data = json.loads(response.content[0].text)

# ❌ The Google Gemini way
from google import genai
from google.genai import types

client = genai.Client()
response = await client.aio.models.generate_content(
    model=google_model_id,
    contents="Extract user info: John is 30",
    config=types.GenerateContentConfig(
        response_mime_type="application/json",
        response_schema=User,
    ),
)
user = response.parsed

# ✅ The Celeste way
from celeste import Capability, create_client

client = create_client(
    Capability.TEXT_GENERATION,
    model=google_model_id,  # <--- Choose any model from any provider
)
response = await client.generate(
    prompt="Extract user info: John is 30",
    output_schema=User,  # <--- One parameter that works across all providers
)
user = response.content  # Already parsed as a User instance
```
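Under the hood, a factory like `create_client` is typically a registry keyed by capability and model name. This is a minimal sketch of that pattern, assuming nothing about Celeste's actual internals (`_REGISTRY`, `MockOpenAI`, and `MockGemini` are invented for illustration):

```python
from enum import Enum


class Capability(Enum):
    TEXT_GENERATION = "text-generation"


class MockOpenAI:
    def __init__(self, model: str) -> None:
        self.model = model


class MockGemini:
    def __init__(self, model: str) -> None:
        self.model = model


# Map model-name prefixes to client classes, per capability.
_REGISTRY = {
    Capability.TEXT_GENERATION: {
        "gpt-": MockOpenAI,
        "gemini-": MockGemini,
    },
}


def create_client(capability: Capability, model: str):
    """Pick the provider client from the model name alone."""
    for prefix, cls in _REGISTRY[capability].items():
        if model.startswith(prefix):
            return cls(model)
    raise ValueError(f"No provider registered for model {model!r}")


client = create_client(Capability.TEXT_GENERATION, model="gemini-2.5-flash")
print(type(client).__name__)  # MockGemini
```

Because dispatch happens on the model string, swapping `"gemini-2.5-flash"` for `"gpt-5"` is the only change the caller ever makes.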
## 🪶 Install what you need

```shell
uv add "celeste-ai[text-generation]"   # Text only
uv add "celeste-ai[image-generation]"  # Image generation
uv add "celeste-ai[all]"               # Everything
```
## 🔧 Type-Safe by Design

```python
# Full IDE autocomplete
response = await client.generate(
    prompt="Explain AI",
    temperature=0.7,  # ✅ Validated (0.0-2.0)
    max_tokens=100,   # ✅ Validated (int)
)

# Typed response
print(response.content)             # str (IDE knows the type)
print(response.usage.input_tokens)  # int
print(response.metadata["model"])   # str
```

Catch errors before production.
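The kind of pre-flight validation described above can be sketched with a plain dataclass; Pydantic does the same with richer error reporting. The class name and the exact ranges below are illustrative assumptions, not Celeste's actual validation rules:

```python
from dataclasses import dataclass


@dataclass
class GenerationParams:
    temperature: float = 1.0
    max_tokens: int = 256

    def __post_init__(self) -> None:
        # Reject out-of-range values at construction time,
        # instead of paying for a failed API call later.
        if not 0.0 <= self.temperature <= 2.0:
            raise ValueError(
                f"temperature must be in [0.0, 2.0], got {self.temperature}"
            )
        if self.max_tokens < 1:
            raise ValueError(f"max_tokens must be >= 1, got {self.max_tokens}")


params = GenerationParams(temperature=0.7, max_tokens=100)
print(params.temperature)  # 0.7

try:
    GenerationParams(temperature=3.5)
except ValueError as e:
    print(e)  # temperature must be in [0.0, 2.0], got 3.5
```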
## 🤝 Contributing

We welcome contributions! See CONTRIBUTING.md.

- Request a provider: GitHub Issues
- Report bugs: GitHub Issues
## 📄 License

MIT license – see LICENSE for details.

Get Started • Documentation • GitHub

Made with ❤️ by developers tired of framework lock-in
## Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
## File details

Details for the file `celeste_ai-0.3.1.tar.gz`.

### File metadata

- Download URL: celeste_ai-0.3.1.tar.gz
- Size: 136.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 02e5ec2f75e8c8af3bda63b7112b9ea395d2482ea43ce4465c67cf94a564941c |
| MD5 | d970a739607e55611a3aaef39909f6fa |
| BLAKE2b-256 | a29be452c8390b3f4581a22a8c6ef3cfd7888930823af1524196a6e821b6200a |
### Provenance

The following attestation bundle was made for `celeste_ai-0.3.1.tar.gz`:

Publisher: publish.yml on withceleste/celeste-python

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: celeste_ai-0.3.1.tar.gz
- Subject digest: 02e5ec2f75e8c8af3bda63b7112b9ea395d2482ea43ce4465c67cf94a564941c
- Sigstore transparency entry: 773050417
- Permalink: withceleste/celeste-python@5ee60ba930aea533f50b5b9b6f9c7761f2c69c9d
- Branch / Tag: refs/tags/v0.3.1
- Owner: https://github.com/withceleste
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5ee60ba930aea533f50b5b9b6f9c7761f2c69c9d
- Trigger Event: push
## File details

Details for the file `celeste_ai-0.3.1-py3-none-any.whl`.

### File metadata

- Download URL: celeste_ai-0.3.1-py3-none-any.whl
- Size: 29.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 9733aeb24ccc3c3cbd0135f384047444df5e50c596232f478f967e64a093cc7f |
| MD5 | 2c8a6bd1c1b0ec287ffefab9c562047e |
| BLAKE2b-256 | 382aaeff9991b3a675020c6d4419a50212759e00a4af27c1eccd804a2bba52f6 |
### Provenance

The following attestation bundle was made for `celeste_ai-0.3.1-py3-none-any.whl`:

Publisher: publish.yml on withceleste/celeste-python

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: celeste_ai-0.3.1-py3-none-any.whl
- Subject digest: 9733aeb24ccc3c3cbd0135f384047444df5e50c596232f478f967e64a093cc7f
- Sigstore transparency entry: 773050882
- Permalink: withceleste/celeste-python@5ee60ba930aea533f50b5b9b6f9c7761f2c69c9d
- Branch / Tag: refs/tags/v0.3.1
- Owner: https://github.com/withceleste
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5ee60ba930aea533f50b5b9b6f9c7761f2c69c9d
- Trigger Event: push