
🌟 Pycelest — AI Agent Framework

Build · Orchestrate · Defend
A production-ready framework for building safe, capable, and observable AI agents.

PyPI version Python 3.11+ License: MIT Code style: ruff


What is Pycelest?

Pycelest is a Python framework for building AI agents that are safe, capable, and observable — designed from day one for production deployment, not just research.

Unlike frameworks that bolt guardrails on as an afterthought, Pycelest puts security, traceability, and reliability at the center of its architecture.


Features

Feature Description
🔁 ReAct Loop Native Reasoning + Acting loop at the core
🌐 Multi-Provider OpenAI, Anthropic, Google, Mistral, DeepSeek, Grok — one interface
🏠 Local Models Ollama, LM Studio, vLLM — no API key needed
🧠 Memory STM, Scratchpad, RAG, and automatic compression
🛡️ Guardrails Built-in Tool Firewall and Execution Budget
👥 Multi-Agent Native agent collaboration via AgentBus
📡 Streaming First-class async streaming from all providers
🔭 Observability OpenTelemetry traces, metrics, and structured logs
🔌 Plugin System Lifecycle hooks for extending behavior
⚙️ YAML Config Code or YAML — your choice
🔧 MCP Support Connect external tool servers via Model Context Protocol
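
The Reasoning + Acting loop named in the first row can be sketched in plain Python. This is a simplified illustration of the pattern, not Pycelest's actual implementation; `stub_model` and the `lookup` tool are stand-ins:

```python
# Minimal ReAct-style loop: the model alternates between reasoning and
# tool calls until it produces a final answer or hits the iteration cap.

def react_loop(model, tools, task, max_iterations=8):
    history = [("task", task)]
    for _ in range(max_iterations):
        step = model(history)                  # reason: thought + optional action
        if step["action"] is None:             # no tool requested: final answer
            return step["answer"]
        observation = tools[step["action"]](step["input"])  # act
        history.append(("observation", observation))        # observe
    return None  # iteration budget exhausted

# Stub model: look something up once, then answer with the observation.
def stub_model(history):
    if not any(kind == "observation" for kind, _ in history):
        return {"action": "lookup", "input": "kyoto", "answer": None}
    return {"action": None, "answer": history[-1][1]}

result = react_loop(stub_model, {"lookup": lambda q: f"notes on {q}"}, "plan a trip")
# result == "notes on kyoto"
```

The iteration cap plays the same role as `max_iterations` in `SessionConfig` below: it bounds how many reason/act cycles a runaway agent can perform.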

Installation

pip install pycelest

# With specific provider
pip install "pycelest[openai]"    # OpenAI, Google, DeepSeek, Grok, Mistral
pip install "pycelest[anthropic]" # Anthropic Claude

# Local models — no API key needed
# Install Ollama from https://ollama.com, then:
# ollama pull llama3.2

# Everything
pip install "pycelest[all]"

Quick Start

from celest import SessionManager, SessionConfig
from celest.providers import OpenAIAdapter

config = SessionConfig(
    system_prompt="You are a helpful, careful AI agent",
    max_iterations=8,
    max_tool_executions=10,
    token_budget=12_000,
)

session = SessionManager(
    config=config,
    provider=OpenAIAdapter(model="gpt-4o"),
)

result = await session.run("Plan a 3-day trip to Kyoto")  # inside an async function
print(result.response)
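
Because `session.run` is a coroutine, a script without an existing event loop would drive it with `asyncio.run`. A generic, self-contained sketch of that pattern, with a stub coroutine standing in for the real `session.run`:

```python
import asyncio

# Stand-in for an awaitable agent call such as session.run(...).
async def run_agent(prompt: str) -> str:
    await asyncio.sleep(0)          # placeholder for LLM and tool round-trips
    return f"response to: {prompt}"

# Top-level entry point: asyncio.run creates the event loop and
# drives the coroutine to completion.
answer = asyncio.run(run_agent("Plan a 3-day trip to Kyoto"))
```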

Local Models (no API key)

from celest.providers import OllamaAdapter

session = SessionManager(
    config=config,
    provider=OllamaAdapter(model="llama3.2"),  # or mistral, phi3, qwen2.5...
)

With Tools

from celest.tools import FunctionTool

@FunctionTool.register(description="Search the web for current information")
async def web_search(query: str) -> str:
    # your implementation
    ...

session = SessionManager(config=config, provider=provider, tools=[web_search])
result = await session.run("What are the latest AI news?")
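
A decorator-based registry like `FunctionTool.register` typically captures the function together with the metadata the LLM needs to decide when to call it. A stdlib-only sketch of that idea (not Pycelest's code; `REGISTRY` is a stand-in for its internal tool table):

```python
import inspect

REGISTRY = {}

def register(description: str):
    """Record a callable plus metadata so the agent can advertise it to the LLM."""
    def wrap(fn):
        REGISTRY[fn.__name__] = {
            "fn": fn,
            "description": description,
            "signature": str(inspect.signature(fn)),  # e.g. "(query: str) -> str"
        }
        return fn  # function stays usable as a normal callable
    return wrap

@register(description="Search the web for current information")
def web_search(query: str) -> str:
    return f"results for {query!r}"
```

The signature string is what lets a framework build a tool schema for the model automatically, so type annotations on tool functions matter.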

With YAML Config

# celest.yaml
system_prompt: "You are a helpful AI agent"
max_iterations: 8
provider: openai      # openai | anthropic | ollama | lmstudio | deepseek | grok | mistral
model: gpt-4o
guardrails:
  tool_firewall: ask  # accept | deny | ask

from celest import SessionManager

session = SessionManager.from_yaml("celest.yaml")
result = await session.run("Your task here")
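
Under the hood, `from_yaml` presumably maps the parsed document onto a config object. A sketch of that mapping, with a plain dict standing in for the parsed YAML and a simplified `SessionConfig` so the example stays self-contained (PyYAML and Pycelest's real class are not imported):

```python
from dataclasses import dataclass, field

@dataclass
class SessionConfig:          # simplified stand-in for Pycelest's config object
    system_prompt: str
    max_iterations: int = 8
    provider: str = "openai"
    model: str = "gpt-4o"
    guardrails: dict = field(default_factory=dict)

# What a YAML parse of the celest.yaml above would yield.
doc = {
    "system_prompt": "You are a helpful AI agent",
    "max_iterations": 8,
    "provider": "openai",
    "model": "gpt-4o",
    "guardrails": {"tool_firewall": "ask"},
}

# Keys map one-to-one onto config fields.
config = SessionConfig(**doc)
```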

CLI

celest init                          # Generate a starter celest.yaml
celest run celest.yaml "Your prompt" # Run an agent from config

Multi-Agent

from celest.multi import AgentBus

bus = AgentBus()
researcher = SessionManager(config=research_config, provider=provider, bus=bus)
writer = SessionManager(config=write_config, provider=provider, bus=bus)

result = await researcher.run("Research and write a report on AI trends")
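
A bus like `AgentBus` is, at its core, named inboxes that agents post to and read from. A minimal asyncio sketch of that pub/sub idea (illustrative only; `Bus` is a hypothetical stand-in, not Pycelest's class):

```python
import asyncio

class Bus:
    """Tiny message bus: each agent gets an inbox; anyone can post to it by name."""
    def __init__(self):
        self.inboxes = {}

    def join(self, name):
        self.inboxes[name] = asyncio.Queue()

    async def send(self, to, message):
        await self.inboxes[to].put(message)

    async def recv(self, name):
        return await self.inboxes[name].get()

async def main():
    bus = Bus()
    bus.join("researcher")
    bus.join("writer")
    # The researcher hands its findings to the writer over the bus.
    await bus.send("writer", "findings: AI agent adoption is growing")
    return await bus.recv("writer")

handoff = asyncio.run(main())
```

Queues decouple the agents: the researcher never calls the writer directly, which is what makes the collaboration composable.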

Architecture

User Input
    │
    ▼
[Optional: PlanningModule] ──► Goal decomposition + SkillRegistry
    │
    ▼
┌─────────────────────────────────────┐
│         SessionManager              │
│         (ReAct Loop)                │
│                                     │
│  ConversationHistory  MemoryManager │
│  ToolRegistry         RAGAdapter    │
│  PlanningModule       Logger        │
│  Compression          ExecBudget    │
└──────────────┬──────────────────────┘
               │
       ┌───────┴────────┐
       ▼                ▼
  ProviderAdapter   ToolFirewall
  (LLM API)         (Guardrails)
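
The ToolFirewall box gates every tool call under the `accept | deny | ask` policy shown in the YAML section. A rough sketch of such a gate (illustrative only; the `ask` branch consults a callback rather than Pycelest's real confirmation flow):

```python
def firewall(policy, tool_name, ask_user=None):
    """Decide whether a tool call may run under an accept/deny/ask policy."""
    if policy == "accept":
        return True                         # allow every call
    if policy == "deny":
        return False                        # block every call
    if policy == "ask":
        return bool(ask_user(tool_name))    # defer to a human/callback per call
    raise ValueError(f"unknown policy: {policy}")

# "ask" defers to the callback; "deny" blocks unconditionally.
allowed = firewall("ask", "web_search", ask_user=lambda name: name == "web_search")
blocked = firewall("deny", "shell_exec")
```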

Roadmap

  • Project scaffold & specification
  • Phase 1 — Core: SessionManager, ReAct loop, ProviderAdapters, FunctionTool, Guardrails
  • Phase 2 — Memory: STM, Scratchpad, RAG, Compression, Streaming, OpenTelemetry
  • Phase 3 — Advanced: Plugin system, AgentBus, PlanningModule, MCP, CLI, Local models

Contributing

Contributions are welcome! Please open an issue or submit a PR on GitHub.


License

MIT © Celestin
