
An AI coding system that integrates into your Python application.


Maur 🐜

Inspired by Stripe's Minions. Maur (Norwegian for "ants") is an autonomous coding agent that integrates into your Python projects. It receives tasks from various sources — production error alerts, Slack, Linear issues, or direct API calls — then clones your repo, runs an AI coding agent (OpenCode), and opens a pull/merge request with the fix.

How it works

  1. A task arrives via webhook or direct API call
  2. The API stores the task and publishes it to a message queue
  3. A worker picks up the task, clones your repo into a temporary workspace
  4. OpenCode runs against the cloned repo using your configured LLM
  5. If changes are made, they are committed and pushed to a new branch (maur/<task-id>)
  6. A pull request (GitHub) or merge request (GitLab) is opened automatically
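
The steps above can be sketched as a single pipeline function. Everything here is illustrative: the callables are stand-ins, not real maur APIs; only the maur/&lt;task-id&gt; branch scheme comes from the list above.

```python
# Illustrative sketch of the task pipeline; clone, run_agent, push_branch
# and open_pr are hypothetical stand-ins injected by the caller.
def run_task(task, clone, run_agent, push_branch, open_pr):
    workspace = clone(task["repo_branch"])          # step 3: temporary workspace
    changed = run_agent(workspace, task["prompt"])  # step 4: OpenCode run
    if not changed:
        return {"status": "no_changes"}
    branch = f"maur/{task['id']}"                   # step 5: branch naming scheme
    push_branch(workspace, branch)
    url = open_pr(branch)                           # step 6: PR/MR creation
    return {"status": "pr_opened", "pr_url": url}
```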

Architecture

[Trigger source]           [maur_api]          [maur_code_subscriber]
  Linear webhook    --->   FastAPI app   --->   Worker (OpenCode)
  Exception alert          stores task          clones repo
  Manual POST /tasks       publishes msg        runs agent
                                                opens PR/MR

The two components are deployed separately via takk:

  • maur_api — lightweight FastAPI service that authenticates requests, persists tasks, and enqueues work
  • maur_code_subscriber — NATS subscriber that processes tasks one at a time using OpenCode

Prerequisites

  • Python ≥ 3.10
  • takk for infrastructure management
  • A NATS server (provisioned by takk)
  • A PostgreSQL or MySQL database (provisioned by takk)
  • An OpenAI-compatible LLM API (e.g. OpenRouter, a local Ollama instance, or any provider with an OpenAI-compatible endpoint; takk defaults to Ollama unless you override the env vars)
  • A GitHub or GitLab repository with a token that has push and PR/MR creation permissions

Installation

uv add maur

Basic usage

1. Add the infrastructure

Add both components to your project.py file:

from takk import Project
from maur.components import maur_api, maur_code_subscriber

project = Project(
    name="your-project",

    # The API that authenticates and enqueues tasks
    maur_api=maur_api(),

    # The worker that clones the repo, runs OpenCode, and opens a PR/MR
    maur_coder=maur_code_subscriber(git_provider="github"),
)

Both functions accept a database argument ("psql" or "mysql", default: "psql").
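
For example, to back both components with MySQL instead of the default PostgreSQL (same project.py shape as above, only the database argument changes):

```python
from takk import Project
from maur.components import maur_api, maur_code_subscriber

project = Project(
    name="your-project",
    maur_api=maur_api(database="mysql"),
    maur_coder=maur_code_subscriber(git_provider="github", database="mysql"),
)
```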

2. Configure secrets

Run takk dotenv to regenerate your .env file, then fill in the required values.

takk automatically provisions the database and NATS instance, so DB_URI and NATS_URI are not required when running through takk. For the LLM, takk defaults to a local Ollama instance when running locally and Scaleway when deployed — but you can override this with any OpenAI-compatible API.

  • MAUR_ADMIN_TOKEN — Secret token used to authenticate API requests
  • GITHUB_REPO_URL — HTTPS URL of the GitHub repo to clone and open PRs on
  • GITHUB_TOKEN — GitHub personal access token with repo scope

For GitLab, set GITLAB_REPO_URL and GITLAB_TOKEN instead. See Customisation for optional integrations (Grafana, Slack, Linear) and all other available env vars.

3. Start the system

takk up

Both the API and worker containers will be built and started.

API reference

All endpoints (except /health) require a Bearer token in the Authorization header. You can use MAUR_ADMIN_TOKEN directly, but the recommended approach is to create dedicated tokens via POST /tokens (using the admin token once to bootstrap) and use those for day-to-day API calls.
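
The bootstrap flow can be sketched as below. `post` stands in for any HTTP client call, and the "token" response field name is an assumption about the response shape, not documented here.

```python
# Sketch: mint a dedicated token using the admin token once (bootstrap).
# `post` is any callable performing an HTTP POST and returning the decoded
# JSON body; a real client would target the /tokens endpoint over HTTP.
def bootstrap_token(post, admin_token: str, name: str) -> str:
    body = post(
        "/tokens",
        headers={"Authorization": f"Bearer {admin_token}"},
        json={"name": name},
    )
    # Store this value: per the docs, it is only shown once.
    return body["token"]
```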

POST /tokens — Create token

Creates a new API token. Requires the admin token. Returns the token value — store it, as it won't be shown again.

curl -X POST http://localhost:8000/tokens \
  -H "Authorization: Bearer <admin-token>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-integration",
    "description": "Token for CI pipeline",
    "expires_at": "2027-01-01T00:00:00Z"
  }'

GET /tokens — List tokens

Returns all active tokens (token values are redacted).

DELETE /tokens/{token_id} — Revoke token

Revokes a token by ID.

POST /tasks — Manual task

Send any arbitrary prompt to the agent.

curl -X POST http://localhost:8000/tasks \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Refactor the payment module to use the new Stripe SDK",
    "source_id": "unique-identifier-for-dedup",
    "repo_branch": "main"
  }'

POST /webhooks/exception — Exception alert

Send a production error for the agent to fix. fingerprint is used for deduplication — tasks with the same fingerprint that are already pending or in progress are rejected.

curl -X POST http://localhost:8000/webhooks/exception \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "fingerprint": "KeyError-user-profile-views-42",
    "title": "KeyError: '\''email'\'' in user_profile view",
    "description": "Traceback (most recent call last):\n  ...",
    "repo_branch": "main",
    "extra": {"environment": "production", "user_id": 123}
  }'
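
Client-side, a stable fingerprint can be derived from the parts of an error that do not change between occurrences (exception type and code location rather than the message text). A minimal sketch; this exact scheme is not prescribed by Maur:

```python
import hashlib

def error_fingerprint(exc_type: str, module: str, line: int) -> str:
    """Hash the stable parts of an error so repeats dedupe to one task."""
    raw = f"{exc_type}:{module}:{line}"
    return hashlib.sha256(raw.encode()).hexdigest()[:16]
```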

GET /tasks — List tasks

Returns the 50 most recent tasks.

GET /tasks/{task_id} — Get task

Returns the status and result of a specific task.
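
A caller can poll this endpoint until the task settles. A sketch abstracted over a `fetch` callable (the terminal status names below are illustrative assumptions, not documented values):

```python
import time

def wait_for_task(fetch, task_id: str, poll_seconds: float = 5.0,
                  timeout: float = 600.0) -> dict:
    """Poll `fetch(task_id)` until the task reaches a terminal status.

    `fetch` is any callable returning the task as a dict with a "status"
    key, e.g. a thin wrapper around GET /tasks/{task_id}.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        task = fetch(task_id)
        if task.get("status") in ("completed", "failed"):  # assumed names
            return task
        time.sleep(poll_seconds)
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")
```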

GET /health

Returns "ok". Used for health checks.

Customisation

Optional integrations

Webhook routes are only registered when their corresponding env vars are present. Set the vars for the integrations you want; leave them unset to disable.

Grafana alerts — POST /grafana/webhook

Point a Grafana contact point at this URL.

  • MATCH_LABELS (optional) — JSON object to filter alerts by label (e.g. {"severity":"critical"})
  • REPO_BRANCH (optional) — Branch to create fixes on (default: main)
  • PROMPT_TEMPLATE (optional) — Custom prompt; placeholders: {body}, {alert}, {tracebacks}
  • LOKI_URL (optional) — Enables automatic log fetching for the alert
  • LOKI_TOKEN (optional) — Loki auth token (required when LOKI_URL is set)
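
The {body}-style placeholders suggest standard Python str.format substitution; the sketch below shows how such a template would expand, though the actual rendering logic inside Maur is an assumption here:

```python
# Illustrative PROMPT_TEMPLATE expansion using str.format-style placeholders.
template = "Fix the alert below.\nAlert: {alert}\nLogs:\n{tracebacks}"

prompt = template.format(
    alert="HighErrorRate on payments-api",            # fills {alert}
    tracebacks="KeyError: 'email' in user_profile",   # fills {tracebacks}
)
```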

Slack events — POST /slack/webhook

Point your Slack app's Event Subscriptions at this URL. Handles app_mention and message events.

  • SLACK_SIGNING_SECRET (required; enables the route) — Signing secret from your Slack app settings
  • CHANNEL_FILTER (optional) — Only handle messages from this channel ID
  • REPO_BRANCH (optional) — Branch to create fixes on (default: main)
  • PROMPT_TEMPLATE (optional) — Custom prompt; placeholders: {channel}, {text}

Linear issues — POST /linear/webhook

Point a Linear webhook at this URL. Triggers on issue create/update and posts a comment back with the task ID.

  • LINEAR_WEBHOOK_SECRET (required; enables the route) — Webhook signing secret from Linear
  • LINEAR_API_KEY (required) — Linear API key, used to post a comment back on the issue
  • LABEL_FILTER (optional) — Only handle issues with this label (e.g. maur)
  • REPO_BRANCH (optional) — Branch to create fixes on (default: main)
  • PROMPT_TEMPLATE (optional) — Custom prompt; placeholders: {title}, {description}

Custom webhooks

If you need to handle a webhook source not covered by the built-in integrations, extend the FastAPI app directly and point maur_api at your module.

Create a file in your project, e.g. my_project/app.py:

from fastapi import APIRouter, HTTPException
from maur.app import app
from maur.models import CodingTask
from maur.repos import CodingTaskRepoDep, DuplicateTaskError
from maur.components import worker_tasks, CodingTaskMessage

router = APIRouter()

@router.post("/webhooks/my-service")
async def my_service_webhook(payload: dict, repo: CodingTaskRepoDep):
    try:
        task = await repo.insert(CodingTask(
            source="my-service",
            source_id=payload["id"],  # unique identifier for deduplication
            repo_branch="main",
            prompt=f"Handle this event: {payload}",
        ))
    except DuplicateTaskError as e:
        raise HTTPException(
            status_code=409,
            detail={"message": "Task already in progress", "task_id": str(e.existing_task_id)},
        )

    # Let the worker know that it can start on the task.
    await worker_tasks.publish(CodingTaskMessage(task_id=task.id))
    return {"task_id": str(task.id)}

app.include_router(router)

Then pass the module path to maur_api:

from takk import Project
from maur.components import maur_api, maur_code_subscriber

project = Project(
    name="your-project",
    maur_api=maur_api(app_module="my_project.app"),
    maur_coder=maur_code_subscriber(git_provider="github"),
)

takk will serve your module instead of maur.app directly, so all existing routes and lifespan logic remain intact alongside your additions.

Override the infrastructure

  • DB_URI (default: provisioned by takk) — PostgreSQL (postgresql://...) or MySQL (mysql://...) connection URI
  • NATS_URI (default: provisioned by takk) — NATS connection URI (nats://...)

Changing the LLM model

Set MAUR_LLM_MODEL to any model available through your MAUR_LLM_API provider. The worker uses OpenCode with an OpenAI-compatible provider, so any model exposed via that protocol works.

MAUR_LLM_MODEL=devstral-2-123b-instruct-2512

Adjusting worker compute resources

The default worker is allocated 3 GB of memory. Override this via the compute argument:

from takk.models import Compute
from maur.components import maur_code_subscriber

maur_coder=maur_code_subscriber(
    compute=Compute(mb_memory_limit=1024 * 8)  # 8 GB
)

Passing additional secrets to the worker

If your target repository requires environment variables at build or runtime (e.g. private package indexes), use maur_code_subscriber_with_secrets and pass the full list of secrets explicitly:

from maur.components import maur_code_subscriber_with_secrets
from maur.settings import GithubSettings, MaurSettings, MaurLLMSettings, PostgresSettings
from takk.secrets import NatsConfig
from my_project.settings import MyPrivateRegistrySettings

maur_coder=maur_code_subscriber_with_secrets(
    secrets=[PostgresSettings, MaurSettings, MaurLLMSettings, GithubSettings, NatsConfig, MyPrivateRegistrySettings]
)

Running without takk

takk is the easiest way to run and deploy Maur, but you can run both components directly if you prefer to manage infrastructure yourself.

Docker Compose

The quickest way to run without takk is with Docker Compose. Create a docker-compose.yml:

services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: maur
      POSTGRES_PASSWORD: maur
      POSTGRES_DB: maur
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U maur"]
      interval: 5s
      retries: 5

  nats:
    image: nats:latest
    command: ["-js"]
    ports:
      - "4222:4222"

  api:
    build:
      context: .
      dockerfile: Dockerfile.api
    ports:
      - "8000:8000"
    environment:
      DB_URI: postgresql+asyncpg://maur:maur@db/maur
      NATS_URI: nats://nats:4222
      MAUR_ADMIN_TOKEN: your-secret-token
    depends_on:
      db:
        condition: service_healthy
      nats:
        condition: service_started

  worker:
    build:
      context: .
      dockerfile: Dockerfile.worker
    environment:
      DB_URI: postgresql+asyncpg://maur:maur@db/maur
      NATS_URI: nats://nats:4222
      MAUR_ADMIN_TOKEN: your-secret-token
      MAUR_LLM_API: https://your-llm-provider/v1
      MAUR_LLM_TOKEN: your-llm-token
      GITHUB_REPO_URL: https://github.com/your-org/your-repo
      GITHUB_TOKEN: ghp_...
    depends_on:
      db:
        condition: service_healthy
      nats:
        condition: service_started

volumes:
  postgres_data:

Then run:

docker compose up

See the API and Worker sections below for the corresponding Dockerfiles.

API

Install the package and start the FastAPI app with uvicorn:

pip install maur
uvicorn maur.app:app --host 0.0.0.0 --port 8000

You must supply all required environment variables manually — takk won't provision anything:

DB_URI=postgresql+asyncpg://user:pass@localhost/maur
NATS_URI=nats://localhost:4222
MAUR_ADMIN_TOKEN=your-secret-token

Worker

The worker calls process_coding_task for each NATS message. It requires a number of system-level dependencies (git, grep, curl, unzip, jq, bash) and OpenCode to be installed in the environment.

A minimal Dockerfile for the worker:

FROM python:3.12-slim

RUN apt-get update && apt-get install -y \
    curl ca-certificates bash git libstdc++6 libgcc-s1 unzip jq grep \
    && rm -rf /var/lib/apt/lists/*

# Install OpenCode
RUN curl -fsSL https://opencode.ai/install | bash

RUN pip install maur

You then need to wire up a NATS consumer that deserialises the message as CodingTaskMessage and calls process_coding_task:

import asyncio
import json
import nats
from maur.worker import CodingTaskMessage, process_coding_task

async def main():
    nc = await nats.connect("nats://localhost:4222")
    js = nc.jetstream()

    async def handler(msg):
        task_msg = CodingTaskMessage.model_validate(json.loads(msg.data))
        await process_coding_task(task_msg)
        # Ack only after processing, so a crash triggers redelivery.
        await msg.ack()

    await js.subscribe("maur.tasks", cb=handler, durable="maur-worker", manual_ack=True)
    await asyncio.Event().wait()

asyncio.run(main())

The worker also requires the database and Git provider environment variables to be set:

DB_URI=postgresql+asyncpg://user:pass@localhost/maur
NATS_URI=nats://localhost:4222
MAUR_ADMIN_TOKEN=your-secret-token
MAUR_LLM_API=https://...
MAUR_LLM_TOKEN=your-llm-token
GITHUB_REPO_URL=https://github.com/your-org/your-repo
GITHUB_TOKEN=ghp_...

Development

# Install dependencies
uv sync --all-groups

Generate a .env file and fill in the required variables:

uv run takk dotenv

Start the full stack locally:

uv run takk up

When you want to test a change, run the integration test suite:

uv run takk test

# Lint
uv run ruff check .

# Type check
uv run ty check



Download files

Download the file for your platform.

Source Distribution

maur-0.1.7.tar.gz (21.1 kB)


Built Distribution


maur-0.1.7-py3-none-any.whl (29.9 kB)


File details

Details for the file maur-0.1.7.tar.gz.

File metadata

  • Download URL: maur-0.1.7.tar.gz
  • Size: 21.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.2

File hashes

Hashes for maur-0.1.7.tar.gz
  • SHA256: 8c0e5d598b3d710111668871eb2b09c035cfc5681b81d1ff686f5d62a6db8410
  • MD5: 8ca94bd718dff1aba2de9fd95f71c124
  • BLAKE2b-256: 88ad1da9ab8f61d2dc350a6f681361734104651418d9106bace95b692d382763


File details

Details for the file maur-0.1.7-py3-none-any.whl.

File metadata

  • Download URL: maur-0.1.7-py3-none-any.whl
  • Size: 29.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.2

File hashes

Hashes for maur-0.1.7-py3-none-any.whl
  • SHA256: d326f231dab1fba3654d2949a1c3631c3eec7687d74f80e6961cbf30d17ed6f1
  • MD5: fead4c28281ffac89225d8dfef85d584
  • BLAKE2b-256: 681dab1028a3cf01f44c0d8bbde3aab3655e511f463e71b91f8b785f6eb8fdc6

