
Fastest way to build and deploy long-running AI agents—with durability, observability, and security.

Docs | Examples

JavaScript/TypeScript SDK: Gatekeeper SDK docs

Features

| Feature | Description | Docs |
| --- | --- | --- |
| 🚀 MCP & tool security | The only fully FastAPI-compatible MCP server with a decorator API | Link |
| 🦾 Agent-to-agent | Multi-agent communication | Link |
| ☁️ Deployment | Fast serverless deployment | Link |
| 📊 Observability | Agent tracing and monitoring | Link |
| 🔍 Tool Search API | Reduced tool context bloat | Link |

🚅 Quick Start

Installation

The recommended method of installing agentor is with pip from PyPI.

pip install agentor
More ways...

You can also install the latest development version of agentor (which may be unstable) directly from GitHub:

pip install git+https://github.com/celestoai/agentor@main

Build and Deploy an Agent

Build an Agent, connect external tools or an MCP Server, and serve it as an API in just a few lines of code:

from agentor.tools import GetWeatherTool
from agentor import Agentor

agent = Agentor(
    name="Weather Agent",
    model="gpt-5-mini",  # Use any LLM provider - gemini/gemini-2.5-pro or anthropic/claude-3.5
    tools=[GetWeatherTool()]
)
result = agent.run("What is the weather in London?")  # Run the Agent
print(result)

# Serve Agent with a single line of code
agent.serve()

Run the following command to query the Agent server:

curl -X 'POST' \
  'http://localhost:8000/chat' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "input": "What is the weather in London?"
}'
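
The same request can be made from Python using only the standard library. A minimal sketch — the `/chat` endpoint and `input` field are taken from the curl example above:

```python
import json
import urllib.request

def build_chat_request(question: str, url: str = "http://localhost:8000/chat") -> urllib.request.Request:
    """Build the JSON POST request that the agent's /chat endpoint expects."""
    payload = json.dumps({"input": question}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json", "accept": "application/json"},
        method="POST",
    )

# Sending it (requires a running agent.serve() on port 8000):
# with urllib.request.urlopen(build_chat_request("What is the weather in London?")) as resp:
#     print(resp.read().decode("utf-8"))
```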

Celesto AI provides a developer-first platform for deploying Agents, MCP Servers, and other LLM applications. The celesto CLI is installed automatically with agentor.

To deploy using Celesto, run:

celesto deploy

Once deployed, your agent will be accessible via a REST endpoint, for example:

https://api.celesto.ai/deploy/apps/<app-name>

Agent Skills

Skills are folders of instructions, scripts, and resources that an agent loads dynamically to improve performance on specialized tasks.

Agent Skills help agents pull just the right context from simple Markdown files. The agent first sees only a skill’s name and short description. When the task matches, it loads the rest of SKILL.md, follows the steps, and can call a shell environment to run the commands the skill points to.

  • Starts light: discover skills by name/description only
  • Loads on demand: pull full instructions from SKILL.md when relevant
  • Executes safely: run skill-driven commands in an isolated shell

Skill layout example:

example-skill/
├── SKILL.md        # required instructions + metadata
├── scripts/        # optional helpers the agent can call
├── assets/         # optional templates/resources
└── references/     # optional docs or checklists
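
A SKILL.md typically opens with the name and short description the agent sees during discovery, followed by the full instructions loaded on demand. A hypothetical example — the field names, script path, and steps below are illustrative, not the exact schema:

```markdown
---
name: slack-gif-creator
description: Create short GIFs suitable for posting to Slack.
---

## Steps
1. Generate the animation frames (for example with a helper in scripts/).
2. Assemble the frames into a GIF and verify it is under Slack's size limit.
```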

Using a skill to create a GIF:

from agentor.tools import ShellTool
from agentor import Agentor

agent = Agentor(
    name="Assistant",
    model="gemini/gemini-3-flash-preview",
    instructions="Your job is to create GIFs. Lean on the shell tool and any available skills.",
    skills=[".skills/slack-gif-creator"],
    tools=[ShellTool()],
)

async for chunk in await agent.chat("produce a cat gif", stream=True):
    print(chunk)

Create an Agent from Markdown

Bootstrap an Agent directly from a markdown file with metadata for name, tools, model, and temperature:

---
name: WeatherBot
tools: [get_weather]
model: gpt-4o-mini
temperature: 0.3
---
You are a concise weather assistant.

Load it with:

from agentor import Agentor

agent = Agentor.from_md("agent.md")
result = agent.run("Weather in Paris?")
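
Conceptually, `from_md` has to split the `---`-delimited frontmatter from the markdown body that becomes the system prompt. A rough stdlib sketch of that parsing step — this is a hypothetical illustration, not Agentor's actual implementation (real YAML values such as `tools: [get_weather]` would need a proper YAML parser):

```python
def split_frontmatter(text: str) -> tuple[dict, str]:
    """Split '---'-delimited frontmatter metadata from the markdown body."""
    if not text.startswith("---"):
        return {}, text
    _, header, body = text.split("---", 2)
    meta = {}
    for line in header.strip().splitlines():
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()  # naive: values stay plain strings
    return meta, body.strip()
```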

Build a custom MCP Server with LiteMCP

Agentor enables you to build a custom MCP Server using LiteMCP. You can run it inside a FastAPI application or as a standalone MCP server.

from agentor.mcp import LiteMCP, get_token

mcp = LiteMCP(name="my-server", version="1.0.0")

@mcp.tool(description="Get weather for a given location")
def get_weather(location: str) -> str:
    # Control authentication via the caller's token
    token = get_token()
    if token != "SOME_SECRET":
        return "Not authorized"
    return f"Weather in {location}: Sunny, 72°F"

mcp.serve()

LiteMCP vs FastMCP

Key Difference: LiteMCP is a native ASGI app that integrates directly with FastAPI using standard patterns. FastMCP requires mounting as a sub-application, diverging from standard FastAPI primitives.

| Feature | LiteMCP | FastMCP |
| --- | --- | --- |
| Integration | Native ASGI | Requires mounting |
| FastAPI Patterns | ✅ Standard | ⚠️ Diverges |
| Built-in CORS | | |
| Custom Methods | ✅ Full | ⚠️ Limited |
| With Existing Backend | ✅ Easy | ⚠️ Complex |

📖 Learn more

Agent-to-Agent (A2A) Protocol

The A2A Protocol defines standard specifications for agent communication and message formatting, enabling seamless interoperability between different AI agents.

Key Features:

  • Standard Communication: JSON-RPC based messaging with support for both streaming and non-streaming responses
  • Agent Discovery: Automatic agent card generation at /.well-known/agent-card.json describing agent capabilities, skills, and endpoints
  • Rich Interactions: Built-in support for tasks, status updates, and artifact sharing between agents

Agentor makes it easy to serve any agent over the A2A protocol.

from agentor import Agentor

agent = Agentor(
    name="Weather Agent",
    model="gpt-5-mini",
    tools=["get_weather"],
)

# Serve agent with A2A protocol enabled automatically
agent.serve(port=8000)
# Agent card available at: http://localhost:8000/.well-known/agent-card.json

Any agent served with agent.serve() automatically becomes A2A-compatible with standardized endpoints for message sending, streaming, and task management.
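
For a sense of what the discovery endpoint returns, here is a hedged sketch of an agent card. The exact schema is defined by the A2A specification, not by Agentor, and the field values below are illustrative:

```json
{
  "name": "Weather Agent",
  "description": "Answers weather questions",
  "url": "http://localhost:8000",
  "capabilities": { "streaming": true },
  "skills": [
    { "id": "get_weather", "name": "Get Weather" }
  ]
}
```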

📖 Learn more

🤝 Contributing

We'd love your help making Agentor even better! Please read our Contributing Guidelines and Code of Conduct.

📄 License

Apache 2.0 License - see LICENSE for details.
