A high-level orchestration language for AI models and system commands
Project description
Mairex
An experimental orchestration language for AI models and system commands
Mairex is an orchestration language that allows you to coordinate AI models, shell commands, and data flows using a JSON-based syntax with specialized operators. It's designed for developers who want to prototype AI workflows and automation scripts.
⚠️ Alpha Status: Mairex is in early development (v0.9.3). Expect bugs, missing features, and potential breaking changes in future versions.
Current Limitations
- Sequential execution only (parallel execution syntax exists but runs sequentially)
- Basic error handling
- Limited debugging capabilities
- Many planned features not yet implemented
- Documentation may be ahead of implementation in some areas
What It Does
Mairex lets you write scripts that combine shell commands and AI model calls in a declarative way. For example, you can download websites, process them with AI models, and save outputs to files - all coordinated through a single .jsom file.
✨ Features
- 🤖 Native AI Integration - Call Ollama, OpenAI, Anthropic, Gemini, and XAI models directly
- 🔄 Parallel Execution - Syntax for running multiple shells and AI models concurrently (execution is currently sequential; see Current Limitations)
- 🔗 Chainable Operations - Flow data between commands, files, and AI models
- 📦 Variable Scoping - Custom and AI-specific variable management
- 🎯 JSON-Based Syntax - Familiar structure with powerful extensions
- 🛠️ Shell Integration - Execute any terminal command with persistent shell sessions
- 🧩 Script-Level Arguments - Pass runtime parameters to scripts from the command line
🚀 Quick Start
Installation
pip install mairex
Your First Mairex Script
Create a file hello.jsom:
{
  "greeting": {
    "set_input": [
      "~| A&I <&¤S- 'World' |~"
    ],
    "set_prompt": [
      "~| A&P <&¤S- 'Say hello to the input' |~"
    ],
    "call_ai": [
      "~| A&O -$S> |>echo '<$>'<| |~"
    ]
  }
}
Run it:
mairex hello.jsom
What this does:
- Sets AI input to "World"
- Sets AI prompt to "Say hello to the input"
- Calls the AI model and echoes the response
📚 Core Concepts
JSOM Files
Mairex scripts use .jsom files (JSON + Mairex). They follow standard JSON syntax with one additional rule:
All leaf nodes must be arrays:
{
  "step": {
    "action": ["value"]
  }
}
NOT:
{
  "step": {
    "action": "value"
  }
}
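The leaf-node rule above is easy to check mechanically. The following is a minimal sketch (not part of Mairex itself) of a validator that walks a parsed JSOM document and confirms every leaf value is an array:

```python
import json

def leaves_are_arrays(node):
    """Recursively check the JSOM rule: every leaf value must be a JSON array."""
    if isinstance(node, dict):
        return all(leaves_are_arrays(v) for v in node.values())
    # Anything that is not a nested object must be a list (array) to be valid JSOM.
    return isinstance(node, list)

valid = json.loads('{"step": {"action": ["value"]}}')
invalid = json.loads('{"step": {"action": "value"}}')
print(leaves_are_arrays(valid))    # True
print(leaves_are_arrays(invalid))  # False
```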
Instructions
Instructions are declared between ~| |~ specifiers:
["~| |>echo 'Hello'<| |~"]
Shell commands go between |> <|:
["~| |>ls -la<| |~"]
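To make the delimiters concrete: an instruction lives between `~|` and `|~`, and a shell command inside it between `|>` and `<|`. This hypothetical helper (an illustration of the syntax above, not Mairex's actual parser) extracts the shell command from an instruction string:

```python
import re

# Regexes for the delimiters described above: ~| ... |~ wraps an
# instruction, |> ... <| wraps a shell command inside it.
INSTRUCTION = re.compile(r"~\|\s*(.*?)\s*\|~", re.DOTALL)
SHELL_CMD = re.compile(r"\|>(.*?)<\|", re.DOTALL)

def extract_shell_command(raw):
    """Return the shell command embedded in a Mairex instruction, or None."""
    body = INSTRUCTION.search(raw)
    if not body:
        return None
    cmd = SHELL_CMD.search(body.group(1))
    return cmd.group(1) if cmd else None

print(extract_shell_command("~| |>ls -la<| |~"))  # ls -la
```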
Variables
Custom Variables (shared across shells, scoped to function):
["~| VAR&V <&¤S- 'my value' |~"]
AI Variables (shell-specific, persistent across tree levels):
["~| A&I <&¤S- 'AI input' |~"]
Data Flow
Left to right:
["~| |>echo 'output'<| -&#> FILE&V -€S> result.txt |~"]
Right to left:
["~| FILE&V <&€- result.txt <&#- |>cat file.txt<| |~"]
Parallel Execution
Separate shells (parallel):
{
  "parallel_tasks": [
    "~| |>echo 'Shell 1'<| |~",
    "~| |>echo 'Shell 2'<| |~",
    "~| |>echo 'Shell 3'<| |~"
  ]
}
Each array element runs in its own independent shell session.
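Conceptually, "each element in its own shell session" means state such as `cd` or environment variables cannot leak between array elements. A rough Python sketch of this behavior (an assumption about the model, not Mairex's source code), running the three commands above sequentially in separate shells:

```python
import subprocess

# Each command gets a fresh shell process, so no state is shared between them.
# Note: per the Current Limitations, Mairex also runs these sequentially today.
commands = ["echo 'Shell 1'", "echo 'Shell 2'", "echo 'Shell 3'"]
outputs = [
    subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout.strip()
    for cmd in commands
]
print(outputs)
```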
Script-Level Arguments
Pass values from the command line into your script using <ł[N]T> placeholders:
mairex analyze.jsom report.txt llama3
{
  "inputs": {
    "file": ["<ł[0]S>"],
    "model": ["<ł[1]S>"]
  },
  "analyze": {
    "setup": [
      "~| MODEL_NAME&V <S- inputs.model[0].&= |~",
      "~| A&M <S- MODEL_NAME&V |~"
    ],
    "load": [
      "~| FILE_NAME&V <S- inputs.file[0].&= |~",
      "~| FILE_NAME&V -$S> |>cat '<$>'<| -&#> A&I |~"
    ],
    "save": [
      "~| A&O -€S> summary.txt |~"
    ]
  }
}
Place placeholders as normal strings inside the JSON block, then access them inside instructions using standard JSON value access (.&=). Supports S (String), I (Integer), and L (List) types.
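To illustrate the placeholder format, here is a sketch of how `<ł[N]T>` markers could be resolved against command-line arguments. This is an illustration of the described syntax only; Mairex's actual resolver may behave differently:

```python
import re

def resolve_placeholders(text, args):
    """Substitute <ł[N]T> placeholders with positional args (T in S, I, L)."""
    def repl(m):
        index, type_code = int(m.group(1)), m.group(2)
        value = args[index]
        # For the Integer type code, validate that the argument parses as int.
        return str(int(value)) if type_code == "I" else str(value)
    return re.sub(r"<ł\[(\d+)\]([SIL])>", repl, text)

# e.g. mairex analyze.jsom report.txt llama3
argv = ["report.txt", "llama3"]
print(resolve_placeholders('"file": ["<ł[0]S>"]', argv))  # "file": ["report.txt"]
```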
🎓 Learn More
- Syntax Reference - Complete language specification
- Tutorial - Step-by-step guide
- Examples - Real-world use cases
🔧 Requirements
- Python 3.8+
- Dependencies (auto-installed):
- ollama - Local AI model support
- litellm - Multi-provider AI API support
- lizard - Code analysis for function extraction
- whats_that_code - Programming language detection
🌐 AI Provider Setup
Using Ollama (Local Models)
- Install Ollama: https://ollama.ai
- Pull a model: ollama pull llama3

No API keys needed - works out of the box!
Using Cloud AI Providers
Create API_keys.json in your working directory:
{
"openai": "sk-your-key-here",
"anthropic": "sk-ant-your-key-here",
"gemini": "your-gemini-key",
"xai": "your-xai-key"
}
Set the provider in your JSOM file:
["~| A&S <&¤S- 'openai' |~"]
["~| A&M <&¤S- 'gpt-4o' |~"]
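A quick way to sanity-check your setup is to read the keys file yourself before running a script. This is a standalone sketch based on the API_keys.json layout shown above, not part of Mairex:

```python
import json
from pathlib import Path

def load_api_key(provider, path="API_keys.json"):
    """Read API_keys.json from the working directory and return one provider's key."""
    keys = json.loads(Path(path).read_text())
    if provider not in keys:
        raise KeyError(f"No API key configured for provider '{provider}'")
    return keys[provider]
```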
📄 License
MIT License
Development Status
This is an early alpha release. The project is not currently accepting outside contributions. Bug reports and feedback are welcome via GitHub issues.
An experimental tool for AI orchestration
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file mairex-0.9.3.tar.gz.
File metadata
- Download URL: mairex-0.9.3.tar.gz
- Upload date:
- Size: 30.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0e54d91707fc21bcb0c8710858f255f5fd142e4a595d921ed24e5aa9d80594e2 |
| MD5 | 28034eaae4a59c8746ab565b63e8d3ef |
| BLAKE2b-256 | ac8438fd584abfee3b132cf520117bd63a913bcf0dcc4d9ed2f02736200f6ad4 |
File details
Details for the file mairex-0.9.3-py3-none-any.whl.
File metadata
- Download URL: mairex-0.9.3-py3-none-any.whl
- Upload date:
- Size: 17.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0a68c8bf94c022a1c99794bdea33bb121dac8ead813d5c58ad6e76bae271cb52 |
| MD5 | 581a93e392078dff5988a9492241a5c2 |
| BLAKE2b-256 | 1c26e5141046da7cae66eefd269677f3cf872a746907ff5036ed48b5eee19792 |