Orchestra
Cognitive Architectures for Multi-Agent Teams.
Overview
Mainframe-Orchestra is a lightweight, open-source agentic framework for building LLM-based pipelines and multi-agent teams. It implements a unique approach to agent orchestration that goes beyond simple routing, enabling complex workflows.
Key Features
- Modularity: Modular architecture for easy building, extension, and integration
- Agent Orchestration: Agents can act as both executors and conductors, enabling dynamic task decomposition and coordination among agents
- Phased Task Execution: Reduces cognitive load on LLMs through structured thinking patterns
- Tool Integration: Simple docstring-based tool definitions without complex JSON schemas
- Streaming Support: Real-time output streaming with both sync and async support
- Built-in Fallbacks: Graceful handling of LLM failures with configurable fallback chains
- Unified LLM Interface: Powered by LiteLLM for integration with 100+ language models from all major providers
- Universal Function Calling: Orchestra's Task system enables function calling for any model, including open-source models that don't natively support function calling
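The idea behind universal function calling can be illustrated framework-independently: a model that can emit structured JSON can "call" tools even without native function-calling support, by having the host parse and dispatch the call. The sketch below is not Orchestra's actual implementation, just the general pattern; `dispatch_tool_call` and the `add` tool are hypothetical.

```python
import json

def dispatch_tool_call(model_output: str, tools: dict):
    """Parse a JSON tool call emitted by a model and invoke the matching tool."""
    call = json.loads(model_output)
    func = tools[call["tool"]]
    return func(**call.get("arguments", {}))

# Simulated model output: the model was prompted to respond with JSON
# naming a tool and its arguments.
tools = {"add": lambda a, b: a + b}
output = '{"tool": "add", "arguments": {"a": 2, "b": 3}}'
result = dispatch_tool_call(output, tools)
print(result)  # 5
```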
Installation
Install Orchestra using pip:
```shell
pip install mainframe-orchestra
```
Quick Start
Here's a simple example to get you started:
```python
from mainframe_orchestra import Agent, Task, OpenaiModels, WebTools, set_verbosity

set_verbosity(1)

research_agent = Agent(
    agent_id="research_assistant_1",
    role="research assistant",
    goal="answer user queries",
    llm=OpenaiModels.gpt_4_1,
    tools={WebTools.exa_search}
)

def research_task(topic):
    return Task.create(
        agent=research_agent,
        instruction=f"Use your exa search tool to research {topic} and explain it in a way that is easy to understand."
    )

result = research_task("quantum computing")
print(result)
```
Core Components
Tasks: Discrete units of work
Agents: Personas that perform tasks and can be assigned tools
Tools: Wrappers around external services or specific functionalities
Language Model Interfaces: Consistent interface for various LLM providers
Supported Language Models and Providers
Orchestra uses LiteLLM to support a wide range of language models from multiple providers:
OpenAI
GPT-4.5-preview, GPT-4o, GPT-4o Mini, & Custom defined models
Orchestra supports customizing the OpenAI base URL, allowing you to connect to OpenAI-compatible APIs or proxies:
```python
import os

# Method 1: Set via environment variable
os.environ["OPENAI_BASE_URL"] = "https://your-custom-endpoint.com/v1"

# Method 2: Set globally for all OpenAI requests
from mainframe_orchestra.llm import OpenaiModels
OpenaiModels.set_base_url("https://your-custom-endpoint.com/v1")

# Method 3: Set for a specific request
response, error = await OpenaiModels.gpt_4_1(
    messages=[{"role": "user", "content": "Hello"}],
    base_url="https://your-custom-endpoint.com/v1"
)
```
Anthropic
Claude 3 Haiku, Claude 3 Sonnet, Claude 3 Opus, Claude 3.5 Sonnet, Claude 3.7 Sonnet, & Custom defined models
Openrouter
GPT-4 Turbo, Claude 3 Opus, Mixtral 8x7B, Llama 3.1 405B, & Custom defined models
Ollama
Mistral, Mixtral, Llama 3.1, Qwen, Gemma, & Custom defined models
Groq
Mixtral 8x7B, Llama 3, Llama 3.1, Gemma, & Custom defined models
TogetherAI
Custom defined models
Gemini
Gemini 1.5 Flash, Gemini 1.5 Flash 8B, Gemini 1.5 Pro, & Custom defined models
Deepseek
Deepseek Reasoner, Deepseek Chat, & Custom defined models
Each provider is accessible through a dedicated class (e.g., OpenaiModels, AnthropicModels, etc.) with methods corresponding to specific models. This structure allows for switching between models and providers, enabling users to leverage the most suitable LLM for their tasks.
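The provider methods share a common calling convention: as the base-URL example above shows, they are awaited and return a `(response, error)` pair. The sketch below mimics that convention with a stand-in model so the pattern can be seen in isolation; `echo_model` is purely illustrative and not part of Orchestra.

```python
import asyncio
from typing import Optional, Tuple

# echo_model stands in for a provider method such as OpenaiModels.gpt_4_1;
# it mimics the (response, error) return convention without any API call.
async def echo_model(messages: list) -> Tuple[Optional[str], Optional[str]]:
    try:
        return messages[-1]["content"].upper(), None
    except Exception as e:
        return None, str(e)

response, error = asyncio.run(echo_model([{"role": "user", "content": "hello"}]))
print(response, error)  # HELLO None
```

Because every provider method follows this shape, swapping the model an agent uses is just a matter of passing a different method as the `llm` argument.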
Tools
Mainframe-Orchestra comes with a comprehensive set of built-in tools that provide various functionalities for your agents. Here's an overview of the available tool categories:
Built-in Tools
Data & File Operations
- FileTools: Read and write CSV, JSON, XML, and other file formats
- TextSplitters: Tools for chunking and splitting text documents
- EmbeddingsTools: Generate embeddings for text content
- FaissTools: Vector storage and similarity search operations
- PineconeTools: Vector database operations with Pinecone
Web & API Integration
- WebTools: Web scraping, searches, and data retrieval (Serper, Exa, etc.)
- WikipediaTools: Search and retrieve Wikipedia content
- AmadeusTools: Flight information and travel data
- GitHubTools: GitHub repository operations and content access
- LinearTools: Linear API-based tools for creating, updating, and retrieving tasks
Financial & Data Analysis
- FredTools: Federal Reserve Economic Data access
- CalculatorTools: Date, time, and mathematical calculations
- MatplotlibTools: Data visualization and plotting
Note:
YahooFinanceTools was deprecated in v1.0.0 due to upstream API instability. For financial data, consider using FredTools for economic data or web tools.
Media & Content
- AudioTools: Audio processing and manipulation
- TextToSpeechTools: Text-to-speech conversion using ElevenLabs and OpenAI APIs
- WhisperTools: Audio transcription and translation using OpenAI's Whisper API
Integration Tools
- LangchainTools: Wrapper for accessing the Langchain tools ecosystem
Custom Tools
Mainframe-Orchestra supports creating custom tools to extend functionality beyond the built-in tools. Custom tools can be implemented either as static methods or as class instance methods for more complex operations. Here's a basic example:
```python
import numpy as np
from typing import List, Union

class NumpyTools:
    @staticmethod
    def array_mean(arr: Union[List[float], np.ndarray]) -> Union[float, str]:
        """
        Calculate the mean of a given array.

        Args:
            arr (Union[List[float], np.ndarray]): Input array or list of numbers.

        Returns:
            Union[float, str]: The mean of the input array as a float, or an error message as a string.
        """
        try:
            arr = np.array(arr, dtype=float)
            if arr.size == 0:
                return "Error: Input array is empty."
            return float(np.mean(arr))
        except TypeError as e:
            return f"Error: Invalid input type. Expected a list or numpy array of numbers. Details: {e}"
        except Exception as e:
            return f"Error: An unexpected error occurred: {e}"
```
Tools can be assigned to agents during initialization:
```python
agent = Agent(
    agent_id="my_agent",
    tools={NumpyTools.array_mean, WebTools.exa_search}
)
```
For detailed documentation on creating custom tools, including best practices for error handling and API integration, visit our Custom Tools Documentation.
Multi-Agent Teams
Mainframe-Orchestra allows you to create multi-agent teams that can use tools to complete a series of tasks. Here's an example of a GitHub/Linear integration team that automatically manages issue tracking:
```python
from mainframe_orchestra import Task, Agent, Conduct, OpenaiModels, GitHubTools, LinearTools

# Initialize tool instances
linear_tools = LinearTools()

# Create specialized agents
github_agent = Agent(
    agent_id="github_agent",
    role="GitHub Issue Analyzer",
    goal="Analyze GitHub issues, pull requests, and repository activity",
    attributes="You have expertise in analyzing code repositories and GitHub workflows.",
    llm=OpenaiModels.gpt_4_1,
    tools=[GitHubTools.get_issue, GitHubTools.list_issues, GitHubTools.get_pull_request]
)

linear_agent = Agent(
    agent_id="linear_agent",
    role="Linear Project Manager",
    goal="Manage Linear tickets and workflow states",
    attributes="You have expertise in project management and Linear workflows.",
    llm=OpenaiModels.gpt_4_1,
    tools=[
        linear_tools.get_team_issues,
        linear_tools.search_issues,
        linear_tools.create_issue,
        linear_tools.update_issue_status
    ]
)

coordinator_agent = Agent(
    agent_id="coordinator_agent",
    role="Integration Coordinator",
    goal="Coordinate between GitHub and Linear for seamless issue tracking",
    attributes="You have expertise in coordinating development workflows across platforms.",
    llm=OpenaiModels.gpt_4_1,
    tools=[Conduct.conduct_tool(github_agent, linear_agent)]
)

def integration_task(github_issue_url):
    return Task.create(
        agent=coordinator_agent,
        instruction=f"Analyze the GitHub issue at {github_issue_url} and create or update the corresponding Linear ticket with relevant details and status."
    )

# Example usage
result = integration_task("https://github.com/owner/repo/issues/123")
print(result)
```
Note: this example requires GitHub and Linear API keys to be set in your environment variables.
Conduct and Compose
The Conduct and Compose tools orchestrate teams of agents. Conduct instructs and coordinates the team and is required for orchestration to work. Compose is optional: used alongside Conduct, it enriches the orchestration process with additional complexity as a preprocessing step.
By combining agents, tasks, tools, and language models, you can create a wide range of workflows, from simple pipelines to complex multi-agent teams.
MCP Integration
- MCPOrchestra: Adapter for integrating with Model Context Protocol (MCP) servers, allowing agents to use any MCP-compatible toolkits / servers
- Connect to FastMCP, Playwright, Slack, Filesystem, and other MCP-compatible servers
- List available tools from an MCP server
- Convert external tools into Orchestra-compatible callables for agents to use
For documentation on MCP integration, visit our MCP Integration Guide.
Streaming Support
Orchestra supports streaming of LLM responses. When using streaming, you need to use an async approach:
```python
import asyncio
from mainframe_orchestra import Agent, Task, OpenaiModels, WebTools, set_verbosity

set_verbosity(1)

research_agent = Agent(
    agent_id="research_assistant_1",
    role="research assistant",
    goal="answer user queries",
    llm=OpenaiModels.gpt_4_1,
    tools={WebTools.exa_search}
)

async def research_task_streaming():
    # Create the task and await it
    task = await Task.create(
        agent=research_agent,
        instruction="Use your exa search tool to research quantum computing and explain it in a way that is easy to understand.",
        stream=True
    )

    # Process the streaming output
    async for chunk in task:
        print(chunk, end="", flush=True)
    print()  # Add a newline at the end

# Run the async function
if __name__ == "__main__":
    asyncio.run(research_task_streaming())
```
The key points for streaming:
- Make your function async
- Set `stream=True` in the Task.create call
- Await the Task.create() call to get the streaming task
- Use `async for` to process the streaming chunks
- Run the async function with asyncio.run()
Documentation
For more detailed information, tutorials, and advanced usage, visit our documentation.
Contributing
Mainframe-Orchestra welcomes and depends on community contributions! Please review the contribution guidelines and submit a pull request if you'd like to contribute.
License
Mainframe-Orchestra is released under the Apache License 2.0. See the LICENSE file for details.
Acknowledgments
Orchestra is a fork and further development of TaskflowAI.
Support
For issues or questions, please file an issue on our GitHub repository issues page.
⭐️ If you find Mainframe-Orchestra helpful, consider giving it a star!
Happy building!