
Mairex

An experimental orchestration language for AI models and system commands

Mairex is an orchestration language that allows you to coordinate AI models, shell commands, and data flows using a JSON-based syntax with specialized operators. It's designed for developers who want to prototype AI workflows and automation scripts.

⚠️ Alpha Status: Mairex is in early development (v0.9.2). Expect bugs, missing features, and potential breaking changes in future versions.

Current Limitations

  • Sequential execution only (parallel execution syntax exists but runs sequentially)
  • Basic error handling
  • Limited debugging capabilities
  • Many planned features not yet implemented
  • Documentation may be ahead of implementation in some areas

What It Does

Mairex lets you write scripts that combine shell commands and AI model calls in a declarative way. For example, you can fetch web pages, process them with AI models, and save the outputs to files - all coordinated through a single .jsom file.

✨ Features

  • 🤖 Native AI Integration - Call Ollama, OpenAI, Anthropic, Gemini, and XAI models directly
  • 🔄 Parallel Execution - Syntax for running multiple shells and AI models concurrently (in v0.9.2 execution is still sequential - see Current Limitations)
  • 🔗 Chainable Operations - Flow data between commands, files, and AI models
  • 📦 Variable Scoping - Custom and AI-specific variable management
  • 🎯 JSON-Based Syntax - Familiar structure with powerful extensions
  • 🛠️ Shell Integration - Execute any terminal command with persistent shell sessions
  • 🧩 Script-Level Arguments - Pass runtime parameters to scripts from the command line

🚀 Quick Start

Installation

pip install mairex

Your First Mairex Script

Create a file hello.jsom:

{
  "greeting": {
    "set_input": [
      "~| A&I <&¤S- 'World' |~"
    ],
    "set_prompt": [
      "~| A&P <&¤S- 'Say hello to the input' |~"
    ],
    "call_ai": [
      "~| A&O -$S> |>echo '<$>'<| |~"
    ]
  }
}

Run it:

mairex hello.jsom

What this does:

  1. Sets AI input to "World"
  2. Sets AI prompt to "Say hello to the input"
  3. Calls the AI model and echoes the response

📚 Core Concepts

JSOM Files

Mairex scripts use .jsom files (JSON + Mairex). They follow standard JSON syntax with one additional rule:

All leaf nodes must be arrays:

{
  "step": {
    "action": ["value"]
  }
}

NOT:

{
  "step": {
    "action": "value"
  }
}

Instructions

Instructions are declared between ~| |~ delimiters:

["~| |>echo 'Hello'<| |~"]

Shell commands go between |> <|:

["~| |>ls -la<| |~"]

Variables

Custom Variables (shared across shells, scoped to function):

["~| VAR&V <&¤S- 'my value' |~"]

AI Variables (shell-specific, persistent across tree levels):

["~| A&I <&¤S- 'AI input' |~"]

Data Flow

Left to right:

["~| |>echo 'output'<| -&#> FILE&V -€S> result.txt |~"]

Right to left:

["~| FILE&V <&€- result.txt <&#- |>cat file.txt<| |~"]

Parallel Execution

Separate shells (parallel):

{
  "parallel_tasks": [
    "~| |>echo 'Shell 1'<| |~",
    "~| |>echo 'Shell 2'<| |~",
    "~| |>echo 'Shell 3'<| |~"
  ]
}

Each array element is given its own independent shell session. Note that in v0.9.2 these shells still execute sequentially (see Current Limitations); the syntax is in place for future parallel execution.

Script-Level Arguments

Pass values from the command line into your script using <ł[N]T> placeholders:

mairex analyze.jsom report.txt llama3

analyze.jsom:
{
  "inputs": {
    "file": ["<ł[0]S>"],
    "model": ["<ł[1]S>"]
  },
  "analyze": {
    "setup": [
      "~| MODEL_NAME&V <S- inputs.model[0].&= |~",
      "~| A&M <S- MODEL_NAME&V |~"
    ],
    "load": [
      "~| FILE_NAME&V <S- inputs.file[0].&= |~",
      "~| FILE_NAME&V -$S> |>cat '<$>'<| -&#> A&I |~"
    ],
    "save": [
      "~| A&O -€S> summary.txt |~"
    ]
  }
}

Place placeholders as normal strings inside the JSON block, then access them inside instructions using standard JSON value access (.&=). Supports S (String), I (Integer), and L (List) types.
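For the other placeholder types, a sketch using an Integer argument (invoked as `mairex count.jsom 3`; this assumes `I` placeholders are accessed the same way as the `S` ones above):

```json
{
  "inputs": {
    "count": ["<ł[0]I>"]
  },
  "show": {
    "print": [
      "~| N&V <S- inputs.count[0].&= |~",
      "~| N&V -$S> |>echo 'count: <$>'<| |~"
    ]
  }
}
```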


🔧 Requirements

  • Python 3.8+
  • Dependencies (auto-installed):
    • ollama - Local AI model support
    • litellm - Multi-provider AI API support
    • lizard - Code analysis for function extraction
    • whats_that_code - Programming language detection

🌐 AI Provider Setup

Using Ollama (Local Models)

  1. Install Ollama: https://ollama.ai
  2. Pull a model: ollama pull llama3
  3. No API keys needed - works out of the box!
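To point a script at a local model, set the provider and model with the same `A&S`/`A&M` variables used for cloud providers; note that the 'ollama' provider string here is an assumption, not confirmed by the examples above:

```json
{
  "local_setup": {
    "configure": [
      "~| A&S <&¤S- 'ollama' |~",
      "~| A&M <&¤S- 'llama3' |~"
    ]
  }
}
```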

Using Cloud AI Providers

Create API_keys.json in your working directory:

{
  "openai": "sk-your-key-here",
  "anthropic": "sk-ant-your-key-here",
  "gemini": "your-gemini-key",
  "xai": "your-xai-key"
}

Set the provider in your JSOM file:

["~| A&S <&¤S- 'openai' |~"]
["~| A&M <&¤S- 'gpt-4o' |~"]

📄 License

MIT License

Development Status

This is an early alpha release. The project is not currently accepting outside contributions. Bug reports and feedback are welcome via GitHub issues.


An experimental tool for AI orchestration
