# Pycelest – AI Agent Framework

Build · Orchestrate · Defend

A production-ready framework for building safe, capable, and observable AI agents.
## What is Pycelest?

Pycelest is a Python framework for building AI agents that are safe, capable, and observable – designed from day one for production deployment, not just research.

Unlike frameworks that bolt guardrails on as an afterthought, Pycelest puts security, traceability, and reliability at the center of its architecture.
## Features

| Feature | Description |
|---|---|
| ReAct Loop | Native Reasoning + Acting loop at the core |
| Multi-Provider | OpenAI, Anthropic, Google, Mistral, DeepSeek, Grok – one interface |
| Local Models | Ollama, LM Studio, vLLM – no API key needed |
| Memory | STM, Scratchpad, RAG, and automatic compression |
| Guardrails | Built-in Tool Firewall and Execution Budget |
| Multi-Agent | Native agent collaboration via AgentBus |
| Streaming | First-class async streaming from all providers |
| Observability | OpenTelemetry traces, metrics, and structured logs |
| Plugin System | Lifecycle hooks for extending behavior |
| YAML Config | Code or YAML – your choice |
| MCP Support | Connect external tool servers via Model Context Protocol |
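The ReAct loop named above alternates a reasoning step with an acting step (a tool call) until the model emits a final answer. The following is a minimal, library-free sketch of that pattern; `FakeLLM`, the tool table, and all names here are illustrative stand-ins, not Pycelest APIs:

```python
# Minimal ReAct-style loop: think -> act -> observe, until a final answer.
# FakeLLM and the tool table are illustrative stand-ins, not Pycelest APIs.

def calculator(expression: str) -> str:
    # Toy tool: evaluate a simple arithmetic expression.
    return str(eval(expression, {"__builtins__": {}}))

TOOLS = {"calculator": calculator}

class FakeLLM:
    """Scripted model: first asks for a tool, then answers."""
    def __init__(self):
        self.turn = 0

    def complete(self, transcript: list[str]) -> dict:
        self.turn += 1
        if self.turn == 1:
            return {"thought": "I need math.", "action": "calculator", "input": "6*7"}
        return {"thought": "Done.", "answer": f"The result is {transcript[-1]}"}

def react_loop(llm, task: str, max_iterations: int = 8) -> str:
    transcript = [task]
    for _ in range(max_iterations):
        step = llm.complete(transcript)
        if "answer" in step:                 # model decided to finish
            return step["answer"]
        observation = TOOLS[step["action"]](step["input"])  # act
        transcript.append(observation)       # observe
    raise RuntimeError("max_iterations exhausted")

print(react_loop(FakeLLM(), "What is 6 times 7?"))  # -> The result is 42
```

The `max_iterations` cap is the same safety idea the Execution Budget feature describes: a runaway loop terminates instead of spinning forever.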
## Installation

```bash
pip install pycelest

# With a specific provider
pip install pycelest[openai]     # OpenAI, Google, DeepSeek, Grok, Mistral
pip install pycelest[anthropic]  # Anthropic Claude

# Local models – no API key needed
# Install Ollama from https://ollama.com, then:
#   ollama pull llama3.2

# Everything
pip install pycelest[all]
```
## Quick Start

```python
from celest import SessionManager, SessionConfig
from celest.providers import OpenAIAdapter

config = SessionConfig(
    system_prompt="You are a helpful, careful AI agent",
    max_iterations=8,
    max_tool_executions=10,
    token_budget=12_000,
)

session = SessionManager(
    config=config,
    provider=OpenAIAdapter(model="gpt-4o"),
)

# Inside an async context:
result = await session.run("Plan a 3-day trip to Kyoto")
print(result.response)
```
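The `max_iterations`, `max_tool_executions`, and `token_budget` fields bound how much work a single run may do. A standalone sketch of how such an execution budget can be enforced (the class and method names below are assumptions for illustration, not Pycelest's internal `ExecBudget`):

```python
from dataclasses import dataclass

class BudgetExceeded(RuntimeError):
    pass

@dataclass
class ExecutionBudget:
    # Mirrors the SessionConfig limits above; names are illustrative.
    max_iterations: int = 8
    max_tool_executions: int = 10
    token_budget: int = 12_000
    iterations: int = 0
    tool_executions: int = 0
    tokens_used: int = 0

    def charge_iteration(self) -> None:
        self.iterations += 1
        if self.iterations > self.max_iterations:
            raise BudgetExceeded("max_iterations exceeded")

    def charge_tool(self) -> None:
        self.tool_executions += 1
        if self.tool_executions > self.max_tool_executions:
            raise BudgetExceeded("max_tool_executions exceeded")

    def charge_tokens(self, n: int) -> None:
        self.tokens_used += n
        if self.tokens_used > self.token_budget:
            raise BudgetExceeded("token_budget exceeded")

budget = ExecutionBudget(max_iterations=2)
budget.charge_iteration()
budget.charge_iteration()
try:
    budget.charge_iteration()  # third call trips the limit
except BudgetExceeded as exc:
    print(exc)  # -> max_iterations exceeded
```

Raising on overrun (rather than silently truncating) lets the caller decide whether to abort, summarize, or ask the user for a bigger budget.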
## Local Models (no API key)

```python
from celest.providers import OllamaAdapter

session = SessionManager(
    config=config,
    provider=OllamaAdapter(model="llama3.2"),  # or mistral, phi3, qwen2.5...
)
```
## With Tools

```python
from celest.tools import FunctionTool

@FunctionTool.register(description="Search the web for current information")
async def web_search(query: str) -> str:
    # your implementation
    ...

session = SessionManager(config=config, provider=provider, tools=[web_search])
result = await session.run("What's the latest AI news?")
```
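A registration decorator like this typically captures the function's name, signature, and description so a tool schema can be shown to the model. A minimal stand-in demonstrating the pattern (this is not the library's actual implementation; `register_tool` and `TOOL_REGISTRY` are invented for the sketch):

```python
import inspect

TOOL_REGISTRY: dict[str, dict] = {}

def register_tool(description: str):
    """Toy version of a FunctionTool.register-style decorator."""
    def decorator(fn):
        sig = inspect.signature(fn)
        TOOL_REGISTRY[fn.__name__] = {
            "fn": fn,
            "description": description,
            # Map each parameter to its annotated type name, for the tool schema.
            "parameters": {
                name: (p.annotation.__name__
                       if p.annotation is not inspect.Parameter.empty else "any")
                for name, p in sig.parameters.items()
            },
        }
        return fn  # the function itself is unchanged
    return decorator

@register_tool(description="Add two numbers")
def add(a: int, b: int) -> int:
    return a + b

print(TOOL_REGISTRY["add"]["parameters"])  # -> {'a': 'int', 'b': 'int'}
```

Because the decorator returns the function untouched, registered tools remain ordinary callables you can test directly.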
## With YAML Config

```yaml
# celest.yaml
system_prompt: "You are a helpful AI agent"
max_iterations: 8
provider: openai  # openai | anthropic | ollama | lmstudio | deepseek | grok | mistral
model: gpt-4o
guardrails:
  tool_firewall: ask  # accept | deny | ask
```

```python
from celest import SessionManager

session = SessionManager.from_yaml("celest.yaml")
result = await session.run("Your task here")
```
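The `tool_firewall` setting (`accept | deny | ask`) decides what happens before each tool call is executed. A self-contained sketch of such a policy gate (all names here are assumptions for illustration; Pycelest's actual Tool Firewall API may differ):

```python
from enum import Enum

class FirewallPolicy(Enum):
    ACCEPT = "accept"  # run every tool call
    DENY = "deny"      # block every tool call
    ASK = "ask"        # defer each call to a human-in-the-loop callback

def firewall_check(policy: FirewallPolicy, tool_name: str, ask_user=None) -> bool:
    """Return True if the tool call may proceed."""
    if policy is FirewallPolicy.ACCEPT:
        return True
    if policy is FirewallPolicy.DENY:
        return False
    # ASK: delegate the decision, e.g. to a CLI prompt shown to the operator.
    return bool(ask_user(tool_name))

# With policy "ask", the callback decides per call:
approve_all = lambda name: True
print(firewall_check(FirewallPolicy.ASK, "web_search", approve_all))  # -> True
```

`ask` is the safe default for development: every tool invocation is surfaced before it runs, which is exactly where prompt-injection attempts become visible.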
## CLI

```bash
celest init                           # Generate a starter celest.yaml
celest run celest.yaml "Your prompt"  # Run an agent from config
```
## Multi-Agent

```python
from celest.multi import AgentBus

bus = AgentBus()
researcher = SessionManager(config=research_config, provider=provider, bus=bus)
writer = SessionManager(config=write_config, provider=provider, bus=bus)

result = await researcher.run("Research and write a report on AI trends")
```
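An agent bus is, at its core, a shared message channel that agents publish to and subscribe on. A minimal asyncio sketch of the idea (illustrative only; `ToyAgentBus` is not the real `AgentBus` API):

```python
import asyncio
from collections import defaultdict

class ToyAgentBus:
    """Topic-based pub/sub over asyncio queues."""
    def __init__(self):
        self._queues: dict[str, list[asyncio.Queue]] = defaultdict(list)

    def subscribe(self, topic: str) -> asyncio.Queue:
        # Each subscriber gets its own queue for the topic.
        q: asyncio.Queue = asyncio.Queue()
        self._queues[topic].append(q)
        return q

    async def publish(self, topic: str, message: str) -> None:
        # Fan out the message to every subscriber of the topic.
        for q in self._queues[topic]:
            await q.put(message)

async def demo() -> str:
    bus = ToyAgentBus()
    inbox = bus.subscribe("research.done")  # the writer agent listens here
    await bus.publish("research.done", "3 key AI trends found")
    return await inbox.get()                # the writer picks up the finding

print(asyncio.run(demo()))  # -> 3 key AI trends found
```

Routing by topic rather than by agent name keeps the researcher and writer decoupled: either can be swapped out as long as the topics stay the same.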
## Architecture

```
User Input
    │
    ▼
[Optional: PlanningModule] ──▶ Goal decomposition + SkillRegistry
    │
    ▼
┌──────────────────────────────────────┐
│            SessionManager            │
│             (ReAct Loop)             │
│                                      │
│  ConversationHistory  MemoryManager  │
│  ToolRegistry         RAGAdapter     │
│  PlanningModule       Logger         │
│  Compression          ExecBudget     │
└──────────────────┬───────────────────┘
                   │
          ┌────────┴────────┐
          ▼                 ▼
   ProviderAdapter     ToolFirewall
      (LLM API)        (Guardrails)
```
## Roadmap

- Project scaffold & specification
- Phase 1 – Core: SessionManager, ReAct loop, ProviderAdapters, FunctionTool, Guardrails
- Phase 2 – Memory: STM, Scratchpad, RAG, Compression, Streaming, OpenTelemetry
- Phase 3 – Advanced: Plugin system, AgentBus, PlanningModule, MCP, CLI, Local models
## Contributing

Contributions are welcome! Please open an issue or submit a PR on GitHub.

## License

MIT © Celestin