
Open-source, type-safe primitives for multi-modal AI. All capabilities, all providers, one interface.


Celeste AI


The primitive layer for multi-modal AI

All modalities. All providers. One interface.

Primitives, not frameworks.


Follow @withceleste on LinkedIn

Quick Start · Request Provider

🚀 This is the v1 Beta release. We're validating the new architecture before the stable v1.0 release. Feedback welcome!

Type-safe, modality/provider-agnostic primitives.

  • Unified Interface: One API for OpenAI, Anthropic, Gemini, Mistral, and 14+ others.
  • True Multi-Modal: Text, Image, Audio, Video, Embeddings, Search – all first-class citizens.
  • Type-Safe by Design: Full Pydantic validation and IDE autocomplete.
  • Zero Lock-In: Switch providers instantly by changing a single config string.
  • Primitives, Not Frameworks: No agents, no chains, no magic. Just clean I/O.
  • Lightweight Architecture: No vendor SDKs. Pure, fast HTTP.

🚀 Quick Start

import celeste

# "We need a catchy slogan for our new eco-friendly sneaker."
slogan = await celeste.text.generate(
    "Write a slogan for an eco-friendly sneaker.",
    model="gpt-5",
)
print(slogan.content)

🎨 Multi-Modal Example

import celeste
from pydantic import BaseModel

# 1. Reuse `slogan` from the Quick Start above
# -----------------------------------------------------
class ProductCampaign(BaseModel):
    visual_prompt: str
    audio_script: str

# 2. Extract Campaign Assets (Anthropic)
# -----------------------------------------------------
campaign_output = await celeste.text.generate(
    f"Create campaign assets for slogan: {slogan.content}",
    model="claude-opus-4-1",
    output_schema=ProductCampaign,
)
campaign = campaign_output.content

# 3. Generate Ad Visual (Flux)
# -----------------------------------------------------
image_output = await celeste.images.generate(
    campaign.visual_prompt,
    model="flux-2-flex",
    aspect_ratio="1:1"
)
image = image_output.content

# 4. Generate Radio Spot (ElevenLabs)
# -----------------------------------------------------
speech_output = await celeste.audio.speak(
    campaign.audio_script,
    model="eleven_v3",
    voice="adam"
)
speech = speech_output.content

No special cases. No separate libraries. One consistent interface.
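Every call above returns the same envelope: a typed object whose .content carries the modality-specific payload. A minimal stdlib sketch of that idea (illustrative only; this is not Celeste's actual response class):

```python
from dataclasses import dataclass, field
from typing import Generic, TypeVar

T = TypeVar("T")

@dataclass
class Response(Generic[T]):
    """One envelope for every modality; only the type of `content`
    changes (str for text, bytes for image/audio, ...)."""
    content: T
    metadata: dict = field(default_factory=dict)

# Same shape regardless of modality:
text_resp = Response[str](content="Step lightly. Live brightly.")
audio_resp = Response[bytes](content=b"RIFF...")
```

Because the envelope is uniform, downstream code (logging, caching, retries) can treat all modalities identically and only branch on the payload type.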


15+ providers. Zero lock-in.

Google Anthropic OpenAI Mistral Cohere xAI DeepSeek Groq Perplexity Ollama Hugging Face Replicate Stability AI Runway ElevenLabs

and many more

Missing a provider? Request it – ⚡ we ship fast.


🔄 Switch providers in one line

from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

# Model IDs
anthropic_model_id = "claude-sonnet-4-5"
google_model_id = "gemini-2.5-flash"

# ❌ Anthropic Way
from anthropic import Anthropic
import json

client = Anthropic()
response = client.messages.create(
    model=anthropic_model_id,
    messages=[
        {"role": "user",
         "content": "Extract user info: John is 30"}
    ],
    output_format={
        "type": "json_schema",
        "schema": User.model_json_schema()
    }
)
user_data = json.loads(response.content[0].text)

# ❌ Google Gemini Way
from google import genai
from google.genai import types

client = genai.Client()
response = await client.aio.models.generate_content(
    model=google_model_id,
    contents="Extract user info: John is 30",
    config=types.GenerateContentConfig(
        response_mime_type="application/json",
        response_schema=User
    )
)
user = response.parsed

# ✅ Celeste Way
import celeste

response = await celeste.text.generate(
    "Extract user info: John is 30",
    model=google_model_id,  # <--- Choose any model from any provider
    output_schema=User,  # <--- Unified parameter working across all providers
)
user = response.content  # Already parsed as User instance

🧭 Namespace API (recommended)

Namespaces are domain-first: start from the resource you want to work with (e.g., videos) even if the input is text. Under the hood, Celeste maps (domain, operation) to the output modality (e.g., celeste.images.analyze(...) routes to the text modality because analysis returns text).

import celeste

# Async (default)
result = await celeste.images.analyze(
    image=img,
    prompt="Describe this image",
    model="gpt-4o"
)

# Sync
result = celeste.images.sync.analyze(
    image=img,
    prompt="Describe this image",
    model="gpt-4o"
)

# Async streaming
async for chunk in celeste.text.stream.generate("Hello", model="gpt-4o"):
    print(chunk.content, end="")

# Sync streaming
for chunk in celeste.text.sync.stream.generate("Hello", model="gpt-4o"):
    print(chunk.content, end="")
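The domain-first routing described above can be pictured as a lookup table from (domain, operation) to output modality. A minimal sketch of the concept; the table entries are inferred from the examples in this README and are not Celeste's internals:

```python
# Hypothetical routing table: (domain, operation) -> output modality.
ROUTES = {
    ("text", "generate"): "text",
    ("images", "generate"): "image",
    ("images", "analyze"): "text",   # analysis returns text
    ("audio", "speak"): "audio",
    ("videos", "analyze"): "text",
}

def resolve_modality(domain: str, operation: str) -> str:
    """Map a domain-first call to the modality of its output."""
    try:
        return ROUTES[(domain, operation)]
    except KeyError:
        raise ValueError(f"No route for {domain}.{operation}") from None
```

This is why celeste.images.analyze(...) behaves like a text call: the domain names the resource you hand in, while the route decides what comes back.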

⚙️ Advanced: create_client

For explicit configuration or client reuse, use create_client with modality + operation. This is modality-first: you choose the output type and operation explicitly.

from celeste import create_client, Modality, Operation

client = create_client(
    modality=Modality.TEXT,
    operation=Operation.GENERATE,
    model=google_model_id,
)
response = await client.generate("Extract user info: John is 30", output_schema=User)

The capability parameter is still supported but deprecated. Prefer modality + operation.


🪶 Install

pip install celeste-ai
# or
uv add celeste-ai

🔁 Behavior changes since v0.3.9

  • Capabilities → modalities + operations.
  • Namespace API is now the default entry point.
  • create_client now uses modality + operation; capability is deprecated.
  • analyze for image/audio/video routes through the text modality.
  • Namespaces are domain-first (resource you work with); create_client is modality-first (output type). Domain + operation maps to modality.
  • extra_body allows provider-specific parameters without first-class mapping.
  • Single-package install (no extras).

🔧 Type-Safe by Design

# Full IDE autocomplete
import celeste

response = await celeste.text.generate(
    "Explain AI",
    model="gpt-4o-mini",
    temperature=0.7,    # ✅ Validated (0.0-2.0)
    max_tokens=100,     # ✅ Validated (int)
)

# Typed response
print(response.content)              # str (IDE knows the type)
print(response.usage.input_tokens)   # int
print(response.metadata["model"])    # str

Catch errors before production.


🤝 Contributing

We welcome contributions! See CONTRIBUTING.md.

Request a provider: GitHub Issues
Report bugs: GitHub Issues


📄 License

MIT license – see LICENSE for details.


Get Started · Documentation · GitHub

Made with ❤️ by developers tired of framework lock-in



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

celeste_ai-0.9.1.tar.gz (191.1 kB)


Built Distribution


celeste_ai-0.9.1-py3-none-any.whl (260.7 kB)


File details

Details for the file celeste_ai-0.9.1.tar.gz.

File metadata

  • Download URL: celeste_ai-0.9.1.tar.gz
  • Upload date:
  • Size: 191.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for celeste_ai-0.9.1.tar.gz
Algorithm Hash digest
SHA256 f0db477f12707d166a0d616534473569e87091f87851c6d7597b09c42de6c374
MD5 89c26dc2111907d71876304714a28153
BLAKE2b-256 e47d7415fc15f6e4c19e7216a0f3d1052d381e9a9d729364ad818ac3c79cb2cb


Provenance

The following attestation bundles were made for celeste_ai-0.9.1.tar.gz:

Publisher: publish.yml on withceleste/celeste-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file celeste_ai-0.9.1-py3-none-any.whl.

File metadata

  • Download URL: celeste_ai-0.9.1-py3-none-any.whl
  • Upload date:
  • Size: 260.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for celeste_ai-0.9.1-py3-none-any.whl
Algorithm Hash digest
SHA256 5dfee21a655a68287adbb2de2bd20947408dfc9c3703c713534916651f45fa6e
MD5 d4e879c37c8c9dd4a552994657f2e4a2
BLAKE2b-256 38b222eca17bfdb67e590a1b532701304d3e5a4b9ea676ff8e207cbda4b06c99


Provenance

The following attestation bundles were made for celeste_ai-0.9.1-py3-none-any.whl:

Publisher: publish.yml on withceleste/celeste-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
