Open source, type-safe primitives for multi-modal AI. All capabilities, all providers, one interface
Celeste AI
The primitive layer for multi-modal AI
All modalities. All providers. One interface.
Primitives, not frameworks.
🚀 This is the v1 Beta release. We're validating the new architecture before the stable v1.0 release. Feedback welcome!
Type-safe, modality/provider-agnostic primitives.
- Unified Interface: One API for OpenAI, Anthropic, Gemini, Mistral, and 14+ others.
- True Multi-Modal: Text, Image, Audio, Video, Embeddings, Search — all first-class citizens.
- Type-Safe by Design: Full Pydantic validation and IDE autocomplete.
- Zero Lock-In: Switch providers instantly by changing a single config string.
- Primitives, Not Frameworks: No agents, no chains, no magic. Just clean I/O.
- Lightweight Architecture: No vendor SDKs. Pure, fast HTTP.
🚀 Quick Start
import celeste
# One SDK. Every modality. Any provider.
text = await celeste.text.generate("Explain quantum computing", model="claude-opus-4-5")
image = await celeste.images.generate("A serene mountain lake at dawn", model="flux-2-pro")
speech = await celeste.audio.speak("Welcome to the future", model="eleven_v3")
video = await celeste.videos.analyze(video_file, prompt="Summarize this clip", model="gemini-3-pro")
embeddings = await celeste.text.embed(["lorem ipsum", "dolor sit amet"], model="gemini-embedding-001")
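All of these calls return awaitables, so in a standalone script you would drive them with an event loop. A minimal sketch (the model ID is just one of the examples above):
import asyncio
import celeste

async def main() -> None:
    # Any provider's model ID can be passed here; this one is illustrative.
    response = await celeste.text.generate("Explain quantum computing", model="claude-opus-4-5")
    print(response.content)

asyncio.run(main())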
Operations by Domain
| Action | Text | Images | Audio | Video |
|---|---|---|---|---|
| Generate | ✓ | ✓ | ○ | ✓ |
| Edit | — | ✓ | — | — |
| Analyze | — | ✓ | ✓ | ✓ |
| Upscale | — | ○ | — | ○ |
| Speak | — | — | ✓ | — |
| Transcribe | — | — | ✓ | — |
| Embed | ✓ | ○ | — | ○ |
✓ Available · ○ Planned
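As a sketch of how the checked operations map onto the namespaces from the Quick Start: the method names below (edit, transcribe, analyze on images/audio) are assumptions extrapolated from that naming pattern, not confirmed signatures, and the file variables are placeholders.
# Hypothetical sketch: namespace = modality, method = table row.
edited = await celeste.images.edit(image_file, prompt="Make the sky overcast", model="flux-2-pro")
transcript = await celeste.audio.transcribe(audio_file, model="gemini-2.5-flash")
caption = await celeste.images.analyze(image_file, prompt="Describe this photo", model="gemini-2.5-flash")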
🔄 Switch providers in one line
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int
# Model IDs
anthropic_model_id = "claude-4-5-sonnet"
google_model_id = "gemini-2.5-flash"
# ❌ Anthropic Way
from anthropic import Anthropic
import json

client = Anthropic()
response = client.messages.create(
    model=anthropic_model_id,
    max_tokens=1024,  # required by the Messages API
    messages=[
        {"role": "user", "content": "Extract user info: John is 30"}
    ],
    output_format={
        "type": "json_schema",
        "schema": User.model_json_schema(),
    },
)
user_data = json.loads(response.content[0].text)
# ❌ Google Gemini Way
from google import genai
from google.genai import types

client = genai.Client()
response = await client.aio.models.generate_content(
    model=google_model_id,
    contents="Extract user info: John is 30",
    config=types.GenerateContentConfig(
        response_mime_type="application/json",
        response_schema=User,
    ),
)
user = response.parsed
# ✅ Celeste Way
import celeste

response = await celeste.text.generate(
    "Extract user info: John is 30",
    model=google_model_id,   # <--- Choose any model from any provider
    output_schema=User,      # <--- Unified parameter working across all providers
)
user = response.content  # Already parsed as User instance
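Switching to the Anthropic model is, as promised, a one-line change. A sketch reusing the variables defined above:
# Same call, same output_schema; only the model string changes.
response = await celeste.text.generate(
    "Extract user info: John is 30",
    model=anthropic_model_id,
    output_schema=User,
)
user = response.content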
⚙️ Advanced: Create Client
For explicit configuration or client reuse, use create_client with modality + operation. This is modality-first: you choose the output type and operation explicitly.
from celeste import create_client, Modality, Operation, Provider

client = create_client(
    modality=Modality.TEXT,
    operation=Operation.GENERATE,
    provider=Provider.OLLAMA,
    model="llama3.2",
)
response = await client.generate("Extract user info: John is 30", output_schema=User)
`capability` is still supported but deprecated. Prefer `modality` + `operation`.
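Because the client already holds the provider and model configuration, it can be reused across calls. A minimal sketch (the prompts are illustrative):
import asyncio

# Reuse one configured client for several prompts instead of
# re-specifying provider/model on every call.
prompts = ["Summarize RFC 2616 in one line", "Name three sorting algorithms"]
responses = await asyncio.gather(*(client.generate(p) for p in prompts))
for r in responses:
    print(r.content)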
🪶 Install
uv add celeste-ai
# or
pip install celeste-ai
🔧 Type-Safe by Design
# Full IDE autocomplete
import celeste
response = await celeste.text.generate(
"Explain AI",
model="gpt-4o-mini",
temperature=0.7, # ✅ Validated (0.0-2.0)
max_tokens=100, # ✅ Validated (int)
)
# Typed response
print(response.content) # str (IDE knows the type)
print(response.usage.input_tokens) # int
print(response.metadata["model"]) # str
Catch errors before production.
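For example, an out-of-range parameter should fail at call time rather than at the provider. A sketch of the idea, assuming Pydantic's ValidationError is what surfaces (the exact exception type is an assumption):
from pydantic import ValidationError

try:
    await celeste.text.generate("Explain AI", model="gpt-4o-mini", temperature=5.0)
except ValidationError as exc:
    # temperature is checked against the documented 0.0-2.0 range before any request is sent
    print(exc)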
🤝 Contributing
We welcome contributions! See CONTRIBUTING.md.
- Request a provider: GitHub Issues
- Report bugs: GitHub Issues
📄 License
MIT license – see LICENSE for details.
Get Started • Documentation • GitHub
Made with ❤️ by developers tired of framework lock-in
Download files
File details
Details for the file celeste_ai-0.10.1.tar.gz.
File metadata
- Download URL: celeste_ai-0.10.1.tar.gz
- Size: 196.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 272a0b35abfe3700de947eb0e56bc35c7fc5fdb5ab8f520d0af76a13543f0df7 |
| MD5 | 1c0205b781408e5d1218d10f927d765f |
| BLAKE2b-256 | 3c529312eb85096781bd99bd2ae5ec3d4b45915ed61c7798764da2ed83351f0e |
Provenance
The following attestation bundles were made for celeste_ai-0.10.1.tar.gz:
Publisher: publish.yml on withceleste/celeste-python
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: celeste_ai-0.10.1.tar.gz
- Subject digest: 272a0b35abfe3700de947eb0e56bc35c7fc5fdb5ab8f520d0af76a13543f0df7
- Sigstore transparency entry: 985326263
- Permalink: withceleste/celeste-python@47780591efc3f32ffebb2ba46b000ff8ed489332
- Branch / Tag: refs/tags/v0.10.1
- Owner: https://github.com/withceleste
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@47780591efc3f32ffebb2ba46b000ff8ed489332
- Trigger Event: push
File details
Details for the file celeste_ai-0.10.1-py3-none-any.whl.
File metadata
- Download URL: celeste_ai-0.10.1-py3-none-any.whl
- Size: 261.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 02b84ea9be39bc95dd9430d40d10c74b1e454f50497ea42511cd8266f9057367 |
| MD5 | 3265794f54cbf5b9c62cb2adb85bb650 |
| BLAKE2b-256 | 524b07b2b063b90811bd475900c149051585907f020d99e8902e4a2891238318 |
Provenance
The following attestation bundles were made for celeste_ai-0.10.1-py3-none-any.whl:
Publisher: publish.yml on withceleste/celeste-python
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: celeste_ai-0.10.1-py3-none-any.whl
- Subject digest: 02b84ea9be39bc95dd9430d40d10c74b1e454f50497ea42511cd8266f9057367
- Sigstore transparency entry: 985326287
- Permalink: withceleste/celeste-python@47780591efc3f32ffebb2ba46b000ff8ed489332
- Branch / Tag: refs/tags/v0.10.1
- Owner: https://github.com/withceleste
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@47780591efc3f32ffebb2ba46b000ff8ed489332
- Trigger Event: push