
Celeste AI


The primitive layer for multi-modal AI

All capabilities. All providers. One interface.

Primitives, not frameworks.


Follow @withceleste on LinkedIn



Type-safe, capability- and provider-agnostic primitives for multi-modal AI.

  • Unified Interface: One API for OpenAI, Anthropic, Gemini, Mistral, and 14+ others.
  • True Multi-Modal: Text, Image, Audio, Video, Embeddings, Search, all first-class citizens.
  • Type-Safe by Design: Full Pydantic validation and IDE autocomplete.
  • Zero Lock-In: Switch providers instantly by changing a single config string.
  • Primitives, Not Frameworks: No agents, no chains, no magic. Just clean I/O.
  • Lightweight Architecture: No vendor SDKs. Pure, fast HTTP.

🚀 Quick Start

from celeste import create_client


# 1. Generate a Slogan (OpenAI)
# -----------------------------------------------------
client = create_client(
    capability="text-generation",
    model="gpt-5"
)
slogan = await client.generate("Write a slogan for an eco-friendly sneaker.")
print(slogan.content)

🎨 Multimodal Example

Continuing from the Quick Start above, the `slogan` response feeds a full campaign pipeline:

from celeste import create_client, Capability
from pydantic import BaseModel

class ProductCampaign(BaseModel):
    visual_prompt: str
    audio_script: str

# 2. Extract Campaign Assets (Anthropic)
# -----------------------------------------------------
extract_client = create_client(Capability.TEXT_GENERATION, model="claude-opus-4-1")
campaign_output = await extract_client.generate(
    f"Create campaign assets for slogan: {slogan.content}",
    output_schema=ProductCampaign
)
campaign = campaign_output.content

# 3. Generate Ad Visual (Flux)
# -----------------------------------------------------
image_client = create_client(Capability.IMAGE_GENERATION, model="flux-2-flex")
image_output = await image_client.generate(
    campaign.visual_prompt,
    aspect_ratio="1:1"
)
image = image_output.content

# 4. Generate Radio Spot (ElevenLabs)
# -----------------------------------------------------
speech_client = create_client(Capability.SPEECH_GENERATION, model="eleven_v3")
speech_output = await speech_client.generate(
    campaign.audio_script,
    voice="adam"
)
speech = speech_output.content

No special cases. No separate libraries. One consistent interface.


15+ providers. Zero lock-in.

Google Anthropic OpenAI Mistral Cohere xAI DeepSeek Groq Perplexity Ollama Hugging Face Replicate Stability AI Runway ElevenLabs

and many more

Missing a provider? Request it – ⚡ we ship fast.


🔄 Switch providers in one line

from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

# Model IDs
anthropic_model_id = "claude-4-5-sonnet"
google_model_id = "gemini-2.5-flash"
# ❌ Anthropic Way
from anthropic import Anthropic
import json

client = Anthropic()
response = client.messages.create(
    model=anthropic_model_id,
    messages=[
        {"role": "user",
         "content": "Extract user info: John is 30"}
    ],
    output_format={
        "type": "json_schema",
        "schema": User.model_json_schema()
    }
)
user_data = json.loads(response.content[0].text)
# ❌ Google Gemini Way
from google import genai
from google.genai import types

client = genai.Client()
response = await client.aio.models.generate_content(
    model=google_model_id,
    contents="Extract user info: John is 30",
    config=types.GenerateContentConfig(
        response_mime_type="application/json",
        response_schema=User
    )
)
user = response.parsed
# ✅ Celeste Way
from celeste import create_client, Capability


client = create_client(
    Capability.TEXT_GENERATION,
    model=google_model_id  # <--- Choose any model from any provider
)

response = await client.generate(
    prompt="Extract user info: John is 30",
    output_schema=User  # <--- Unified parameter working across all providers
)
user = response.content  # Already parsed as User instance

🪶 Install what you need

uv add "celeste-ai[text-generation]"  # Text only
uv add "celeste-ai[image-generation]" # Image generation
uv add "celeste-ai[all]"              # Everything

🔧 Type-Safe by Design

# Full IDE autocomplete
response = await client.generate(
    prompt="Explain AI",
    temperature=0.7,    # ✅ Validated (0.0-2.0)
    max_tokens=100,     # ✅ Validated (int)
)

# Typed response
print(response.content)              # str (IDE knows the type)
print(response.usage.input_tokens)   # int
print(response.metadata["model"])     # str

Catch errors before production.


🤝 Contributing

We welcome contributions! See CONTRIBUTING.md.

  • Request a provider: GitHub Issues
  • Report bugs: GitHub Issues


📄 License

MIT license – see LICENSE for details.


Get Started · Documentation · GitHub

Made with ❤️ by developers tired of framework lock-in
