Mairex

An experimental orchestration language for AI models and system commands

Mairex is an orchestration language that allows you to coordinate AI models, shell commands, and data flows using a JSON-based syntax with specialized operators. It's designed for developers who want to prototype AI workflows and automation scripts.

⚠️ Alpha Status: Mairex is in early development (v0.9.5). Expect bugs, missing features, and potential breaking changes in future versions.

Current Limitations

  • Sequential execution only (parallel execution syntax exists but runs sequentially)
  • Basic error handling
  • Limited debugging capabilities
  • Many planned features not yet implemented
  • Documentation may be ahead of implementation in some areas

What It Does

Mairex lets you write scripts that combine shell commands and AI model calls in a declarative way. For example, you can download websites, process them with AI models, and save outputs to files - all coordinated through a single .jsom file.

✨ Features

  • 🤖 Native AI Integration - Call Ollama, OpenAI, Anthropic, Gemini, and XAI models directly
  • 🔄 Parallel Execution - Syntax for running multiple shells and AI models concurrently (currently executes sequentially; see Current Limitations)
  • 🔗 Chainable Operations - Flow data between commands, files, and AI models
  • 📦 Variable Scoping - Custom and AI-specific variable management
  • 🎯 JSON-Based Syntax - Familiar structure with powerful extensions
  • 🛠️ Shell Integration - Execute any terminal command with persistent shell sessions
  • 🧩 Script-Level Arguments - Pass runtime parameters to scripts from the command line

🚀 Quick Start

Installation

pip install mairex

Your First Mairex Script

Create a file hello.jsom:

{
  "greeting": {
    "set_input": [
      "~| A&I <&¤S- 'World' |~"
    ],
    "set_prompt": [
      "~| A&P <&¤S- 'Say hello to the input' |~"
    ],
    "call_ai": [
      "~| A&O -$S> |>echo '<$>'<| |~"
    ]
  }
}

Run it:

mairex hello.jsom

What this does:

  1. Sets AI input to "World"
  2. Sets AI prompt to "Say hello to the input"
  3. Calls the AI model and echoes the response

📚 Core Concepts

JSOM Files

Mairex scripts use .jsom files (JSON + Mairex). They follow standard JSON syntax with one rule:

All leaf nodes must be arrays:

{
  "step": {
    "action": ["value"]
  }
}

NOT:

{
  "step": {
    "action": "value"
  }
}
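Because this rule is pure JSON structure, it is easy to lint before running a script. A minimal standalone validator sketch (plain Python, not part of Mairex itself):

```python
import json

def check_leaves_are_arrays(node, path="$"):
    """Recursively collect paths of leaf values that are not arrays."""
    bad = []
    if isinstance(node, dict):
        for key, value in node.items():
            bad += check_leaves_are_arrays(value, f"{path}.{key}")
    elif not isinstance(node, list):
        # A value that is neither an object nor an array violates the rule.
        bad.append(path)
    return bad

good = json.loads('{"step": {"action": ["value"]}}')
bad = json.loads('{"step": {"action": "value"}}')

print(check_leaves_are_arrays(good))  # []
print(check_leaves_are_arrays(bad))   # ['$.step.action']
```

Running this over a .jsom file before invoking mairex catches the most common structural mistake early.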

Instructions

Instructions are declared between ~| |~ specifiers:

["~| |>echo 'Hello'<| |~"]

Shell commands go between |> <|:

["~| |>ls -la<| |~"]

Variables

Custom Variables (shared across shells, scoped to function):

["~| VAR&V <&¤S- 'my value' |~"]

AI Variables (shell-specific, persistent across tree levels):

["~| A&I <&¤S- 'AI input' |~"]

Data Flow

Left to right:

["~| |>echo 'output'<| -&#> FILE&V -€S> result.txt |~"]

Right to left:

["~| FILE&V <&€- result.txt <&#- |>cat file.txt<| |~"]

Parallel Execution

Separate shells (parallel):

{
  "parallel_tasks": [
    "~| |>echo 'Shell 1'<| |~",
    "~| |>echo 'Shell 2'<| |~",
    "~| |>echo 'Shell 3'<| |~"
  ]
}

Each array element runs in its own independent shell session. In the current alpha, however, these shells execute one after another rather than concurrently (see Current Limitations).

Script-Level Arguments

Pass values from the command line into your script using <ł[N]T> placeholders:

mairex analyze.jsom report.txt llama3

{
  "inputs": {
    "file": ["<ł[0]S>"],
    "model": ["<ł[1]S>"]
  },
  "analyze": {
    "setup": [
      "~| MODEL_NAME&V <S- inputs.model[0].&= |~",
      "~| A&M <S- MODEL_NAME&V |~"
    ],
    "load": [
      "~| FILE_NAME&V <S- inputs.file[0].&= |~",
      "~| FILE_NAME&V -$S> |>cat '<$>'<| -&#> A&I |~"
    ],
    "save": [
      "~| A&O -€S> summary.txt |~"
    ]
  }
}

Place placeholders as normal strings inside the JSON block, then access them inside instructions using standard JSON value access (.&=). Supports S (String), I (Integer), and L (List) types.
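Mairex resolves these placeholders internally before the script runs. As a rough mental model only (plain Python, not Mairex's actual implementation; handles the S and I types for brevity), the substitution works roughly like a textual replace of each placeholder with the corresponding positional argument:

```python
import json
import re

def substitute_args(script_text, args):
    """Replace <ł[N]S> / <ł[N]I> placeholders with positional CLI arguments.

    Hypothetical sketch of the mechanism: S keeps the argument as a string,
    I validates it as an integer.
    """
    def repl(match):
        index, type_code = int(match.group(1)), match.group(2)
        value = args[index]
        return value if type_code == "S" else str(int(value))

    # Placeholders sit inside JSON string literals, e.g. "<ł[0]S>".
    return re.sub(r"<ł\[(\d+)\]([SI])>", repl, script_text)

script = '{"inputs": {"file": ["<ł[0]S>"], "model": ["<ł[1]S>"]}}'
resolved = substitute_args(script, ["report.txt", "llama3"])
print(json.loads(resolved)["inputs"]["file"])  # ['report.txt']
```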


🔧 Requirements

  • Python 3.8+
  • Dependencies (auto-installed):
    • ollama - Local AI model support
    • litellm - Multi-provider AI API support
    • lizard - Code analysis for function extraction
    • whats_that_code - Programming language detection

🌐 AI Provider Setup

Using Ollama (Local Models)

  1. Install Ollama: https://ollama.ai
  2. Pull a model: ollama pull llama3
  3. No API keys needed - works out of the box!

Using Cloud AI Providers

Create API_keys.json in your working directory:

{
  "openai": "sk-your-key-here",
  "anthropic": "sk-ant-your-key-here",
  "gemini": "your-gemini-key",
  "xai": "your-xai-key"
}
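A provider wrapper might read this file along the following lines. This is an illustrative sketch only (the `load_api_key` helper is hypothetical, not Mairex's API); Mairex's own key-loading logic may differ:

```python
import json
from pathlib import Path

def load_api_key(provider, path="API_keys.json"):
    """Return the API key for `provider` from a JSON key file.

    Hypothetical helper for illustration: reads the working-directory
    key file and fails loudly when the provider entry is missing.
    """
    keys = json.loads(Path(path).read_text())
    try:
        return keys[provider]
    except KeyError:
        raise KeyError(f"No key for provider '{provider}' in {path}") from None
```

Keeping keys in a file outside version control (and out of the .jsom script itself) avoids leaking credentials when sharing scripts.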

Set the provider in your JSOM file:

["~| A&S <&¤S- 'openai' |~"]
["~| A&M <&¤S- 'gpt-4o' |~"]

📄 License

MIT License

Development Status

This is an early alpha release. The project is not currently accepting outside contributions. Bug reports and feedback are welcome via GitHub issues.


An experimental tool for AI orchestration
