Ceylon Python Bindings
Python bindings for Ceylon, a Rust-based agent mesh framework for building local and distributed AI agent systems.
Overview
Ceylon provides a unified API for creating agent-based systems that work seamlessly in both local (in-memory) and distributed (network-based) scenarios. The Python bindings allow you to build sophisticated agent systems using clean Python code while leveraging Rust's performance and safety.
Features
- 🤖 Custom Agents: Create agents with synchronous message handlers
- 🧠 LLM Integration: Built-in support for LLM agents (Ollama, OpenAI, etc.)
- ⚡ Async Support: Concurrent LLM operations with `send_message_async()`
- 🛠️ Actions/Tools: Define custom actions with automatic schema generation
- 🌐 Mesh Architecture: Local and distributed agent communication
- 📊 Metrics & Monitoring: Built-in metrics for performance, costs, and errors
- 🐍 Pythonic API: Fluent builder patterns and decorators
Installation
```bash
cd bindings/python
pip install -e .
```
Quick Start
Simple Agent
```python
from ceylon import Agent, PyLocalMesh

class EchoAgent(Agent):
    def on_message(self, message, context=None):
        print(f"Received: {message}")
        return f"Echo: {message}"

# Create mesh and agent
mesh = PyLocalMesh("my_mesh")
agent = EchoAgent("echo")
mesh.add_agent(agent)

# Send message
mesh.send_to("echo", "Hello!")
```
LLM Agent (Synchronous)
```python
from ceylon import LlmAgent

# Create and configure
agent = LlmAgent("assistant", "ollama::gemma3:latest")
agent.with_system_prompt("You are a helpful assistant.")
agent.with_temperature(0.7)
agent.with_max_tokens(100)
agent.build()

# Send message
response = agent.send_message("What is 2+2?")
print(response)
```
LLM Agent (Async)
```python
import asyncio
from ceylon import LlmAgent

async def main():
    agent = LlmAgent("assistant", "ollama::gemma3:latest")
    agent.build()

    # Concurrent queries
    tasks = [
        agent.send_message_async("What is 2+2?"),
        agent.send_message_async("What is 3+3?"),
        agent.send_message_async("What is 5+5?"),
    ]
    responses = await asyncio.gather(*tasks)
    for response in responses:
        print(response)

asyncio.run(main())
```
Custom Actions
```python
from ceylon import Agent

class CalculatorAgent(Agent):
    def __init__(self, name):
        super().__init__(name)

    @Agent.action(name="add")
    def add(self, a: int, b: int) -> int:
        """Add two numbers"""
        return a + b

    @Agent.action(name="multiply")
    def multiply(self, a: int, b: int) -> int:
        """Multiply two numbers"""
        return a * b

# Create agent
agent = CalculatorAgent("calc")

# Invoke actions
result = agent.tool_invoker.invoke("add", '{"a": 5, "b": 3}')
print(result)  # 8
```
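The `@Agent.action` decorator derives a JSON schema for each action from the handler's type hints. Ceylon's exact schema output is not documented here, but the idea can be sketched in plain Python (the `schema_for` helper below is hypothetical, not part of the bindings):

```python
import inspect
import json

# Hypothetical sketch of schema generation from type hints;
# Ceylon's actual output may differ in structure and field names.
PY_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}

def schema_for(func) -> str:
    """Derive a minimal JSON schema string from a function's annotations."""
    sig = inspect.signature(func)
    props = {
        name: {"type": PY_TO_JSON.get(param.annotation, "string")}
        for name, param in sig.parameters.items()
        if name != "self"
    }
    return json.dumps({
        "type": "object",
        "properties": props,
        "required": list(props),
    })

def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

print(schema_for(add))
# {"type": "object", "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}}, "required": ["a", "b"]}
```

This is why the `tool_invoker.invoke()` call above accepts a JSON string: the argument payload is validated against the generated schema before the handler runs.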
Metrics and Monitoring
Ceylon includes built-in metrics collection for monitoring performance, costs, and errors:
```python
import ceylonai_next as ceylon

# Run your agents...
# mesh.send_to("agent", "message")

# Get metrics snapshot
metrics = ceylon.get_metrics()

# Available metrics
print(f"Messages processed: {metrics['message_throughput']}")
print(f"Avg latency: {metrics['avg_message_latency_us'] / 1000:.2f} ms")
print(f"LLM tokens used: {metrics['total_llm_tokens']}")
print(f"LLM cost: ${metrics['total_llm_cost_us'] / 1_000_000:.4f}")
hit_rate = metrics['memory_hits'] / (metrics['memory_hits'] + metrics['memory_misses']) * 100
print(f"Memory hit rate: {hit_rate:.1f}%")
print(f"Errors: {metrics['errors']}")
```
Key Metrics:
- `message_throughput` - Total messages processed
- `avg_message_latency_us` - Average message latency (microseconds)
- `avg_agent_execution_time_us` - Average agent execution time (microseconds)
- `total_llm_tokens` - Total LLM tokens consumed
- `avg_llm_latency_us` - Average LLM API latency (microseconds)
- `total_llm_cost_us` - Total LLM cost in micro-dollars ($1 = 1,000,000 μ$)
- `memory_hits` / `memory_misses` / `memory_writes` - Memory operation counts
- `errors` - Dictionary of error types and counts
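Because latencies are reported in microseconds and costs in micro-dollars, it can help to keep the unit conversions in one place. The helpers below are a plain-Python sketch, not part of the Ceylon API:

```python
def latency_ms(latency_us: float) -> float:
    """Convert a latency in microseconds to milliseconds."""
    return latency_us / 1_000

def cost_dollars(cost_us: int) -> float:
    """Convert a cost in micro-dollars ($1 = 1,000,000 μ$) to dollars."""
    return cost_us / 1_000_000

def hit_rate(hits: int, misses: int) -> float:
    """Memory cache hit rate as a percentage; 0.0 when there were no lookups."""
    total = hits + misses
    return (hits / total) * 100 if total else 0.0

print(latency_ms(2_500))     # 2.5 (ms)
print(cost_dollars(12_345))  # 0.012345 (dollars)
print(hit_rate(90, 10))      # 90.0 (%)
```

Guarding the hit-rate division also avoids a `ZeroDivisionError` when metrics are read before any memory operations have run.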
See examples/README_METRICS.md for detailed examples.
Examples
Example scripts are located in the examples/ directory, and tests are in the tests/ directory.
Basic Examples
- `examples/demo_simple_agent.py` - Basic agent with synchronous message handling: `python examples/demo_simple_agent.py`
- `examples/demo_conversation.py` - LLM agent conversation (synchronous): `python examples/demo_conversation.py`
Async Examples
- `examples/demo_async_llm.py` ⭐ NEW - Concurrent LLM operations (recommended): `python examples/demo_async_llm.py`
  Demonstrates:
  - Concurrent queries with `asyncio.gather()`
  - Streaming responses with `asyncio.as_completed()`
  - Batch processing with concurrency control
  - Error handling in async contexts
- `examples/demo_async_agent.py` ✨ NEW - Async message handlers and actions: `python examples/demo_async_agent.py`
  Demonstrates:
  - Async `on_message()` handlers
  - Async action execution
  - Thread-local event loop handling
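The batch-processing-with-concurrency-control pattern can be sketched with a plain `asyncio.Semaphore`. Here `query()` is a stand-in for `agent.send_message_async()` so the sketch runs without an LLM backend:

```python
import asyncio

async def query(prompt: str) -> str:
    """Stand-in for agent.send_message_async(prompt)."""
    await asyncio.sleep(0.01)  # simulate LLM latency
    return f"answer to {prompt!r}"

async def run_batch(prompts, max_concurrent=3):
    """Run all prompts, but never more than max_concurrent at once."""
    sem = asyncio.Semaphore(max_concurrent)

    async def limited(prompt):
        async with sem:  # waits while max_concurrent queries are in flight
            return await query(prompt)

    return await asyncio.gather(*(limited(p) for p in prompts))

results = asyncio.run(run_batch([f"q{i}" for i in range(10)], max_concurrent=3))
print(results[0])  # answer to 'q0'
```

Capping concurrency this way keeps a large batch from overwhelming a local Ollama server while still overlapping requests.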
Metrics Examples
- `examples/metrics_quickstart.py` ⚡ NEW - Quick start guide for metrics: `python examples/metrics_quickstart.py`
  Demonstrates:
  - Basic metrics collection with `get_metrics()`
  - Retrieving and displaying metrics snapshots
- `examples/metrics_demo.py` 📊 NEW - Comprehensive metrics demo: `python examples/metrics_demo.py`
  Demonstrates:
  - Message throughput and latency tracking
  - Memory cache hit rate monitoring
  - Error tracking and reporting
  - Continuous monitoring patterns
See examples/README_METRICS.md for complete metrics documentation.
Test Files
All test files are located in the tests/ directory:
- `tests/test_actions.py` - Action system tests
- `tests/test_agent_messages.py` - Agent messaging tests
- `tests/test_async_agent.py` - Async functionality tests
- `tests/test_advanced_features.py` - Advanced features
- `tests/test_bindings.py` - Basic bindings tests
- `tests/test_decorator.py` - Action decorator tests
- `tests/test_llm_agent.py` - LLM agent tests
- `tests/test_mesh.py` - Mesh operations tests
- `tests/test_ollama_simple.py` - Ollama connectivity tests
- `tests/test_response.py` - Response handling tests
API Reference
Core Classes
Agent
Base class for creating custom agents.
```python
class MyAgent(Agent):
    def on_message(self, message: str, context=None) -> str:
        """Handle incoming messages (synchronous)"""
        return "response"

    @Agent.action(name="my_action")
    def custom_action(self, param: str) -> str:
        """Custom action callable by other agents"""
        return f"Processed: {param}"
```
Methods:
- `name() -> str` - Get agent name
- `send_message(target: str, message: str)` - Send message to another agent
- `on_message(message: str, context=None)` - Override to handle messages

Decorators:
- `@Agent.action(name="action_name")` - Register a custom action
LlmAgent
LLM-powered agent with fluent builder API.
```python
agent = LlmAgent("name", "ollama::model_name")
agent.with_system_prompt("...")
agent.with_temperature(0.7)
agent.with_max_tokens(100)
agent.build()
```
Builder Methods:
- `with_system_prompt(prompt: str)` - Set system prompt
- `with_temperature(temp: float)` - Set temperature (0.0-1.0)
- `with_max_tokens(max: int)` - Set max tokens
- `build()` - Finalize configuration

Message Methods:
- `send_message(message: str) -> str` - Synchronous LLM call
- `send_message_async(message: str) -> Awaitable[str]` - Async LLM call ✅
PyLocalMesh
Local in-memory mesh for agent communication.
```python
mesh = PyLocalMesh("mesh_name")
mesh.add_agent(agent)
mesh.send_to("agent_name", "message")
```
Methods:
- `add_agent(agent: Agent)` - Register an agent
- `send_to(target: str, payload: str)` - Send message to agent
PyAction
Custom action definition with schema generation.
```python
from ceylon import PyAction

action = PyAction(
    name="my_action",
    description="Action description",
    schema='{"type": "object", ...}'
)
```
PyToolInvoker
Execute registered actions.
```python
invoker = agent.tool_invoker
result = invoker.invoke("action_name", '{"param": "value"}')
```
Async Support
✅ Fully Supported Async Features
1. send_message_async() on LlmAgent
- Fully functional and production-ready
- Supports concurrent execution with asyncio
- Proper error propagation
```python
async def example():
    agent = LlmAgent("agent", "ollama::model")
    agent.build()

    # Concurrent queries
    tasks = [agent.send_message_async(q) for q in queries]
    results = await asyncio.gather(*tasks)
```
2. Async on_message() handlers ✨ NEW
- Now fully supported with thread-local event loops
- Can use async/await in custom agent message handlers
- Supports async actions as well
```python
class MyAgent(Agent):
    async def on_message(self, message, context=None):
        await asyncio.sleep(0.1)  # Async operations work!
        return f"Processed: {message}"
```
For detailed async examples, see ASYNC_EXAMPLES.md and ASYNC_STATUS.md.
Documentation
- ASYNC_EXAMPLES.md - Comprehensive async examples guide
- ASYNC_STATUS.md - Current status of async features
- examples/README_METRICS.md - Metrics collection and monitoring guide
- Ceylon Docs - Full framework documentation
Requirements
- Python 3.8+
- Rust toolchain (for building from source)
- Ollama (for LLM examples)
Installing Ollama
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
# Start Ollama
ollama serve
# Pull a model
ollama pull gemma3:latest
Development
Building from Source
cd bindings/python
cargo build --release
pip install -e .
Running Tests
cd bindings/python
python -m pytest tests/
Or run individual tests:
python tests/test_actions.py
python tests/test_agent_messages.py
python tests/test_llm_agent.py
Architecture
Ceylon uses a mesh architecture where agents communicate through a unified mesh abstraction:
```
┌─────────────────────────────────────┐
│          Application Code           │
│            (Python/Rust)            │
└──────────────────┬──────────────────┘
                   │
                   ▼
┌─────────────────────────────────────┐
│          Agent Mesh (Rust)          │
│  ┌──────┐  ┌──────┐  ┌──────┐       │
│  │Agent1│  │Agent2│  │Agent3│       │
│  └──┬───┘  └──┬───┘  └──┬───┘       │
│     └─────────┴─────────┘           │
│     Message Routing & Delivery      │
└─────────────────────────────────────┘
                   │
                   ▼
┌─────────────────────────────────────┐
│  Local (In-Memory) or Distributed   │
│      (Network) Communication        │
└─────────────────────────────────────┘
```
Key Concepts:
- Agents: Autonomous entities that process messages and execute actions
- Mesh: Communication layer that routes messages between agents
- Actions: Callable functions/tools that agents can invoke
- Messages: Data exchanged between agents
Contributing
Contributions are welcome! Please:
- Check existing issues or create a new one
- Fork the repository
- Create a feature branch
- Make your changes with tests
- Submit a pull request
License
See the main Ceylon repository for license information.
Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Docs: Ceylon Documentation
Roadmap
- Full async/await support for message handlers
- Additional LLM provider integrations
- Distributed mesh implementation
- Agent lifecycle hooks
- Advanced debugging tools
- Performance monitoring
Status: Alpha - API may change
For more information about Ceylon, visit the main repository.