Mairex

An experimental orchestration language for AI models and system commands

Mairex is an orchestration language that allows you to coordinate AI models, shell commands, and data flows using a JSON-based syntax with specialized operators. It's designed for developers who want to prototype AI workflows and automation scripts.

⚠️ Alpha Status: Mairex is in early development (v0.9.4). Expect bugs, missing features, and potential breaking changes in future versions.

Current Limitations

  • Sequential execution only (parallel execution syntax exists but runs sequentially)
  • Basic error handling
  • Limited debugging capabilities
  • Many planned features not yet implemented
  • Documentation may be ahead of implementation in some areas

What It Does

Mairex lets you write scripts that combine shell commands and AI model calls in a declarative way. For example, you can download websites, process them with AI models, and save outputs to files - all coordinated through a single .jsom file.

✨ Features

  • 🤖 Native AI Integration - Call Ollama, OpenAI, Anthropic, Gemini, and XAI models directly
  • 🔄 Parallel Execution - Declare multiple shells and AI models to run concurrently (currently executed sequentially; see Current Limitations)
  • 🔗 Chainable Operations - Flow data between commands, files, and AI models
  • 📦 Variable Scoping - Custom and AI-specific variable management
  • 🎯 JSON-Based Syntax - Familiar structure with powerful extensions
  • 🛠️ Shell Integration - Execute any terminal command with persistent shell sessions
  • 🧩 Script-Level Arguments - Pass runtime parameters to scripts from the command line

🚀 Quick Start

Installation

pip install mairex

Your First Mairex Script

Create a file hello.jsom:

{
  "greeting": {
    "set_input": [
      "~| A&I <&¤S- 'World' |~"
    ],
    "set_prompt": [
      "~| A&P <&¤S- 'Say hello to the input' |~"
    ],
    "call_ai": [
      "~| A&O -$S> |>echo '<$>'<| |~"
    ]
  }
}

Run it:

mairex hello.jsom

What this does:

  1. Sets AI input to "World"
  2. Sets AI prompt to "Say hello to the input"
  3. Calls the AI model and echoes the response
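
In plain-Python terms, the three steps are roughly equivalent to the sketch below. This is a hypothetical analogy, not Mairex's implementation, and the model call is stubbed out since the real reply depends on your configured provider:

```python
# Hypothetical plain-Python equivalent of hello.jsom; the model call is stubbed.
ai_input = "World"
ai_prompt = "Say hello to the input"  # sent to the model together with ai_input
ai_output = f"Hello, {ai_input}!"     # stand-in for the model's actual reply
print(ai_output)                      # the |>echo '<$>'<| step echoes the response
```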

📚 Core Concepts

JSOM Files

Mairex scripts use .jsom files (JSON + Mairex). They follow standard JSON syntax with one rule:

All leaf nodes must be arrays:

{
  "step": {
    "action": ["value"]
  }
}

NOT:

{
  "step": {
    "action": "value"
  }
}
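
The leaf-node rule is easy to check mechanically before running a script. Here is a minimal validator sketch in Python; `validate_jsom` is a hypothetical helper, not part of Mairex:

```python
import json

def validate_jsom(node, path="$"):
    """Recursively check that every leaf value in a JSOM document is an array."""
    errors = []
    if isinstance(node, dict):
        for key, value in node.items():
            errors.extend(validate_jsom(value, f"{path}.{key}"))
    elif not isinstance(node, list):
        # Any non-dict, non-list value is a bare leaf -- not allowed in JSOM.
        errors.append(f"{path}: leaf value must be an array, got {type(node).__name__}")
    return errors

good = json.loads('{"step": {"action": ["value"]}}')
bad = json.loads('{"step": {"action": "value"}}')
print(validate_jsom(good))  # []
print(validate_jsom(bad))   # ['$.step.action: leaf value must be an array, got str']
```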

Instructions

Instructions are declared between ~| |~ specifiers:

["~| |>echo 'Hello'<| |~"]

Shell commands go between |> <|:

["~| |>ls -la<| |~"]

Variables

Custom Variables (shared across shells, scoped to function):

["~| VAR&V <&¤S- 'my value' |~"]

AI Variables (shell-specific, persistent across tree levels):

["~| A&I <&¤S- 'AI input' |~"]
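
One way to picture the two scoping rules is a shared table per function versus a private table per shell. This is a conceptual model only (the dictionary names are made up; the interpreter's internals are not documented here):

```python
# Conceptual model only: how the two variable kinds are scoped (not the real internals).
function_scope = {}                             # custom variables: shared by all shells in a function
shell_scopes = {"shell_1": {}, "shell_2": {}}   # AI variables: one private table per shell

function_scope["VAR"] = "my value"              # VAR&V <&¤S- 'my value'
shell_scopes["shell_1"]["A&I"] = "AI input"     # A&I <&¤S- 'AI input' (inside shell 1)

# Every shell sees VAR; only shell 1 sees its A&I.
```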

Data Flow

Left to right:

["~| |>echo 'output'<| -&#> FILE&V -€S> result.txt |~"]

Right to left:

["~| FILE&V <&€- result.txt <&#- |>cat file.txt<| |~"]
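
Both forms describe the same kind of pipeline, only read in opposite directions. As a plain-Python analogy (an illustrative sketch, not how Mairex executes these operators):

```python
import subprocess, pathlib

# Left to right: |>echo 'output'<| -&#> FILE&V -€S> result.txt
# (command output flows into a variable, then into a file)
FILE = subprocess.run(["echo", "output"], capture_output=True, text=True).stdout.strip()
pathlib.Path("result.txt").write_text(FILE)

# Right to left: FILE&V <&€- result.txt
# (the same assignment written sink-first: the variable receives the file's contents)
FILE = pathlib.Path("result.txt").read_text()
```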

Parallel Execution

Separate shells (parallel):

{
  "parallel_tasks": [
    "~| |>echo 'Shell 1'<| |~",
    "~| |>echo 'Shell 2'<| |~",
    "~| |>echo 'Shell 3'<| |~"
  ]
}

Each array element runs in its own independent shell session. Note that in the current alpha these sessions still execute sequentially (see Current Limitations).
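
As a Python analogy, each array element maps onto its own shell session, roughly like this (a hypothetical sketch; Mairex's engine is not implemented this way):

```python
import subprocess

commands = ["echo 'Shell 1'", "echo 'Shell 2'", "echo 'Shell 3'"]

# Start one independent shell session per array element.
shells = [subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, text=True)
          for cmd in commands]
outputs = [proc.communicate()[0].strip() for proc in shells]
print(outputs)
```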

Script-Level Arguments

Pass values from the command line into your script using <ł[N]T> placeholders:

mairex analyze.jsom report.txt llama3

analyze.jsom:

{
  "inputs": {
    "file": ["<ł[0]S>"],
    "model": ["<ł[1]S>"]
  },
  "analyze": {
    "setup": [
      "~| MODEL_NAME&V <S- inputs.model[0].&= |~",
      "~| A&M <S- MODEL_NAME&V |~"
    ],
    "load": [
      "~| FILE_NAME&V <S- inputs.file[0].&= |~",
      "~| FILE_NAME&V -$S> |>cat '<$>'<| -&#> A&I |~"
    ],
    "save": [
      "~| A&O -€S> summary.txt |~"
    ]
  }
}

Place placeholders as normal strings inside the JSON block, then access them inside instructions using standard JSON value access (.&=). Supports S (String), I (Integer), and L (List) types.
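
The placeholder mechanics can be pictured as a simple textual substitution pass over the script before execution. This is an illustrative sketch, not the actual implementation, and `fill_args` is a hypothetical helper:

```python
import re

def fill_args(script_text, argv):
    """Replace <ł[N]T> placeholders with the N-th command-line argument."""
    def sub(match):
        return argv[int(match.group(1))]
    # T is the declared type letter: S (String), I (Integer), or L (List).
    return re.sub(r"<ł\[(\d+)\][SIL]>", sub, script_text)

script = '{"inputs": {"file": ["<ł[0]S>"], "model": ["<ł[1]S>"]}}'
print(fill_args(script, ["report.txt", "llama3"]))
# {"inputs": {"file": ["report.txt"], "model": ["llama3"]}}
```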


🔧 Requirements

  • Python 3.8+
  • Dependencies (auto-installed):
    • ollama - Local AI model support
    • litellm - Multi-provider AI API support
    • lizard - Code analysis for function extraction
    • whats_that_code - Programming language detection

🌐 AI Provider Setup

Using Ollama (Local Models)

  1. Install Ollama: https://ollama.ai
  2. Pull a model: ollama pull llama3
  3. No API keys needed - works out of the box!

Using Cloud AI Providers

Create API_keys.json in your working directory:

{
  "openai": "sk-your-key-here",
  "anthropic": "sk-ant-your-key-here",
  "gemini": "your-gemini-key",
  "xai": "your-xai-key"
}
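
For illustration, creating and reading such a file in Python looks like the sketch below (a demonstration of the expected file shape, not Mairex's internal loader; the values are placeholders, never commit real keys):

```python
import json, pathlib

# Write a demo API_keys.json (placeholder values only, never commit real keys).
demo_keys = {"openai": "sk-your-key-here", "anthropic": "sk-ant-your-key-here"}
pathlib.Path("API_keys.json").write_text(json.dumps(demo_keys, indent=2))

# Read it back and select a provider's key, the way an orchestrator might.
keys = json.loads(pathlib.Path("API_keys.json").read_text())
api_key = keys.get("openai")
assert api_key is not None, "no key configured for provider 'openai'"
```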

Set the provider in your JSOM file:

["~| A&S <&¤S- 'openai' |~"]
["~| A&M <&¤S- 'gpt-4o' |~"]

📄 License

MIT License

Development Status

This is an early alpha release. The project is not currently accepting outside contributions. Bug reports and feedback are welcome via GitHub issues.


An experimental tool for AI orchestration
