
🌟 Pycelest — AI Agent Framework

Build · Orchestrate · Defend

A production-ready framework for building safe, capable, and observable AI agents.

PyPI version · Python 3.11+ · License: MIT · Code style: ruff


What is Pycelest?

Pycelest is a Python framework for building AI agents that are safe, capable, and observable — designed from day one for production deployment, not just research.

Unlike frameworks that bolt guardrails on as an afterthought, Pycelest puts security, traceability, and reliability at the center of its architecture.


Features

🔁 ReAct Loop: Native Reasoning + Acting loop at the core
🌐 Multi-Provider: OpenAI, Anthropic, Google, Mistral, DeepSeek, Grok — one interface
🏠 Local Models: Ollama, LM Studio, vLLM — no API key needed
🧠 Memory: STM, Scratchpad, RAG, and automatic compression
🛡️ Guardrails: Built-in Tool Firewall and Execution Budget
👥 Multi-Agent: Native agent collaboration via AgentBus
📡 Streaming: First-class async streaming from all providers
🔭 Observability: OpenTelemetry traces, metrics, and structured logs
🔌 Plugin System: Lifecycle hooks for extending behavior
⚙️ YAML Config: Code or YAML — your choice
🔧 MCP Support: Connect external tool servers via Model Context Protocol
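Conceptually, a ReAct loop alternates between a reasoning step (a model call) and an acting step (a tool call) until the model produces a final answer. A minimal, framework-free sketch of the idea (the names and the stubbed model here are illustrative, not Pycelest's API):

```python
# Minimal ReAct (Reason + Act) loop sketch. The "model" is a stub that
# either requests a tool call or produces a final answer.

def react_loop(task, model, tools, max_iterations=8):
    """Alternate between reasoning (model call) and acting (tool call)."""
    history = [("task", task)]
    for _ in range(max_iterations):
        step = model(history)               # reasoning step
        if step["type"] == "final":
            return step["answer"]
        tool = tools[step["tool"]]          # acting step
        observation = tool(step["input"])
        history.append(("observation", observation))
    raise RuntimeError("max_iterations exceeded")

# Stub model: request the calculator once, then answer with the observation.
def stub_model(history):
    if history[-1][0] == "observation":
        return {"type": "final", "answer": history[-1][1]}
    return {"type": "tool", "tool": "add", "input": (2, 3)}

result = react_loop("what is 2 + 3?", stub_model, {"add": lambda p: p[0] + p[1]})
# result == 5
```

The `max_iterations` cap is the same idea as the `max_iterations` field on `SessionConfig` below: it bounds how many reason/act cycles an agent may take before giving up.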

Installation

pip install pycelest

# With specific provider
pip install pycelest[openai]      # OpenAI, Google, DeepSeek, Grok, Mistral
pip install pycelest[anthropic]   # Anthropic Claude

# Local models — no API key needed
# Install Ollama from https://ollama.com, then:
# ollama pull llama3.2

# Everything
pip install pycelest[all]

Quick Start

from celest import SessionManager, SessionConfig
from celest.providers import OpenAIAdapter

config = SessionConfig(
    system_prompt="You are a helpful, careful AI agent",
    max_iterations=8,
    max_tool_executions=10,
    token_budget=12_000,
)

session = SessionManager(
    config=config,
    provider=OpenAIAdapter(model="gpt-4o"),
)

# inside an async function (session.run is a coroutine)
result = await session.run("Plan a 3-day trip to Kyoto")
print(result.response)

Local Models (no API key)

from celest.providers import OllamaAdapter

session = SessionManager(
    config=config,
    provider=OllamaAdapter(model="llama3.2"),  # or mistral, phi3, qwen2.5...
)
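Ollama serves a local HTTP API on port 11434, which is presumably what an adapter like OllamaAdapter talks to under the hood. A sketch that builds (but does not send) such a request with only the standard library; the endpoint and payload shape follow Ollama's public chat API, and sending it would require a running Ollama server:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's local chat endpoint

def build_ollama_request(model, messages):
    """Build (but do not send) a chat request for a local Ollama server."""
    payload = {"model": model, "messages": messages, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_ollama_request("llama3.2", [{"role": "user", "content": "Hello"}])
# Sending it: urllib.request.urlopen(req), which needs Ollama running locally.
```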

With Tools

from celest.tools import FunctionTool

@FunctionTool.register(description="Search the web for current information")
async def web_search(query: str) -> str:
    # your implementation
    ...

session = SessionManager(config=config, provider=provider, tools=[web_search])
result = await session.run("What are the latest AI news?")
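Decorator-based tool registration generally works by recording the function, its description, and its call signature in a registry the agent can hand to the model. A hypothetical stdlib-only sketch of that pattern (not Pycelest's actual internals):

```python
import inspect

TOOL_REGISTRY = {}  # name -> {"fn": callable, "description": str, "params": [...]}

def register_tool(description):
    """Register a function as a tool, recording its signature for the model."""
    def decorator(fn):
        sig = inspect.signature(fn)
        TOOL_REGISTRY[fn.__name__] = {
            "fn": fn,
            "description": description,
            "params": list(sig.parameters),
        }
        return fn
    return decorator

@register_tool(description="Add two integers")
def add(a: int, b: int) -> int:
    return a + b

# The agent can now describe and invoke the tool by name:
spec = TOOL_REGISTRY["add"]
# spec["params"] == ["a", "b"]; spec["fn"](2, 3) == 5
```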

With YAML Config

# celest.yaml
system_prompt: "You are a helpful AI agent"
max_iterations: 8
provider: openai      # openai | anthropic | ollama | lmstudio | deepseek | grok | mistral
model: gpt-4o
guardrails:
  tool_firewall: ask  # accept | deny | ask

from celest import SessionManager

session = SessionManager.from_yaml("celest.yaml")
result = await session.run("Your task here")
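Under the hood, from_yaml presumably parses the file and validates the result into a typed config object. A sketch of the validation half using a dataclass whose fields mirror the YAML keys above (YAML parsing itself is omitted; AgentConfig and config_from_dict are illustrative names, not Pycelest's API):

```python
from dataclasses import dataclass, field

@dataclass
class AgentConfig:
    system_prompt: str
    max_iterations: int = 8
    provider: str = "openai"
    model: str = "gpt-4o"
    guardrails: dict = field(default_factory=dict)

def config_from_dict(raw: dict) -> AgentConfig:
    """Validate a parsed YAML mapping into a typed config object."""
    known = set(AgentConfig.__dataclass_fields__)
    unknown = set(raw) - known
    if unknown:
        raise ValueError(f"unknown config keys: {sorted(unknown)}")
    return AgentConfig(**raw)

cfg = config_from_dict({
    "system_prompt": "You are a helpful AI agent",
    "max_iterations": 8,
    "guardrails": {"tool_firewall": "ask"},
})
# Unspecified keys fall back to defaults, e.g. cfg.provider == "openai".
```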

CLI

celest init                          # Generate a starter celest.yaml
celest run celest.yaml "Your prompt" # Run an agent from config

Multi-Agent

from celest.multi import AgentBus

bus = AgentBus()
researcher = SessionManager(config=research_config, provider=provider, bus=bus)
writer = SessionManager(config=write_config, provider=provider, bus=bus)

result = await researcher.run("Research and write a report on AI trends")
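An agent bus is, at its core, a shared channel over which one agent can hand results to another. A hypothetical pub/sub sketch of that idea (Bus and the topic names here are illustrative, not the real AgentBus implementation):

```python
from collections import defaultdict

class Bus:
    """Minimal topic-based message bus for agent collaboration."""
    def __init__(self):
        self.handlers = defaultdict(list)  # topic -> [callback]

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, message):
        return [handler(message) for handler in self.handlers[topic]]

bus = Bus()
# A "writer" agent subscribes to research results and turns them into prose.
bus.subscribe("research.done", lambda notes: f"Report based on: {notes}")
# A "researcher" agent publishes its findings when finished.
reports = bus.publish("research.done", "3 key AI trends")
# reports == ["Report based on: 3 key AI trends"]
```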

Architecture

User Input
    │
    ▼
[Optional: PlanningModule] ──► Goal decomposition + SkillRegistry
    │
    ▼
┌─────────────────────────────────────┐
│         SessionManager              │
│         (ReAct Loop)                │
│                                     │
│  ConversationHistory  MemoryManager │
│  ToolRegistry         RAGAdapter    │
│  PlanningModule       Logger        │
│  Compression          ExecBudget    │
└──────────────┬──────────────────────┘
               │
       ┌───────┴────────┐
       ▼                ▼
  ProviderAdapter   ToolFirewall
  (LLM API)         (Guardrails)
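The ToolFirewall box above gates every tool execution by policy (accept | deny | ask, matching the guardrails: options in the YAML example). A sketch of such a gate; the class and callback shown are illustrative, not Pycelest's actual implementation:

```python
class ToolFirewall:
    """Gate tool executions by per-tool policy: accept, deny, or ask."""
    def __init__(self, policies, default="ask", confirm=lambda name: False):
        self.policies = policies    # tool name -> "accept" | "deny" | "ask"
        self.default = default
        self.confirm = confirm      # callback used for the "ask" policy

    def allow(self, tool_name):
        policy = self.policies.get(tool_name, self.default)
        if policy == "accept":
            return True
        if policy == "deny":
            return False
        return self.confirm(tool_name)  # "ask": defer to a human / callback

fw = ToolFirewall({"web_search": "accept", "shell": "deny"},
                  confirm=lambda name: name == "send_email")
# web_search is accepted outright, shell is always blocked, and anything
# else (the "ask" default) is allowed only if the confirm callback says so.
```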

Roadmap

  • Project scaffold & specification
  • Phase 1 — Core: SessionManager, ReAct loop, ProviderAdapters, FunctionTool, Guardrails
  • Phase 2 — Memory: STM, Scratchpad, RAG, Compression, Streaming, OpenTelemetry
  • Phase 3 — Advanced: Plugin system, AgentBus, PlanningModule, MCP, CLI, Local models

Contributing

Contributions are welcome! Please open an issue or submit a PR on GitHub.


License

MIT © Celestin

