Maur 🐜

An AI coding system that integrates into your Python application.
Inspired by Stripe's Minions. Maur (Norwegian for "ants") is an autonomous coding agent that integrates into your Python projects. It receives tasks from various sources — production error alerts, Slack, Linear issues, or direct API calls — clones your repo, runs an AI coding agent (OpenCode), and opens a pull/merge request with the fix.
How it works
- A task arrives via webhook or direct API call
- The API stores the task and publishes it to a message queue
- A worker picks up the task, clones your repo into a temporary workspace
- OpenCode runs against the cloned repo using your configured LLM
- If changes are made, they are committed and pushed to a new branch (`maur/<task-id>`)
- A pull request (GitHub) or merge request (GitLab) is opened automatically
Architecture
```
[Trigger source]        [maur_api]          [maur_code_subscriber]
Linear webhook    --->  FastAPI app   --->  Worker (OpenCode)
Exception alert         stores task         clones repo
Manual POST /tasks      publishes msg       runs agent
                                            opens PR/MR
```
The two components are deployed separately via takk:
- `maur_api` — lightweight FastAPI service that authenticates requests, persists tasks, and enqueues work
- `maur_code_subscriber` — NATS subscriber that processes tasks one at a time using OpenCode
Prerequisites
- Python ≥ 3.10
- `takk` for infrastructure management
- A NATS server (provisioned by `takk`)
- A PostgreSQL or MySQL database (provisioned by `takk`)
- An OpenAI-compatible LLM API (e.g. OpenRouter, a local Ollama instance, or any provider with an OpenAI-compatible endpoint; `takk` defaults to Ollama unless you override the env vars)
- A GitHub or GitLab repository with a token that has push and PR/MR creation permissions
Installation
```shell
uv add maur
```
Basic usage
1. Add the infrastructure
Add both components to your `project.py` file:
```python
from takk import Project
from maur.components import maur_api, maur_code_subscriber

project = Project(
    name="your-project",
    # The API that authenticates and enqueues tasks
    maur_api=maur_api(),
    # The worker that clones the repo, runs OpenCode, and opens a PR/MR
    maur_coder=maur_code_subscriber(git_provider="github"),
)
```
Both functions accept a `database` argument (`"psql"` or `"mysql"`, default: `"psql"`).
2. Configure secrets
Run `takk dotenv` to regenerate your `.env` file, then fill in the required values.
takk automatically provisions the database and NATS instance, so DB_URI and NATS_URI are not required when running through takk. For the LLM, takk defaults to a local Ollama instance when running locally and Scaleway when deployed — but you can override this with any OpenAI-compatible API.
| Variable | Description |
|---|---|
| `MAUR_ADMIN_TOKEN` | Secret token used to authenticate API requests |
| `GITHUB_REPO_URL` | HTTPS URL of the GitHub repo to clone and open PRs on |
| `GITHUB_TOKEN` | GitHub personal access token with `repo` scope |
For GitLab, set `GITLAB_REPO_URL` and `GITLAB_TOKEN` instead. See Customisation for optional integrations (Grafana, Slack, Linear) and all other available env vars.
3. Start the system
```shell
takk up
```
Both the API and worker containers will be built and started.
API reference
All endpoints (except `/health`) require a Bearer token in the `Authorization` header matching `MAUR_ADMIN_TOKEN`.
POST /tasks — Manual task
Send any arbitrary prompt to the agent.
```shell
curl -X POST http://localhost:8000/tasks \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Refactor the payment module to use the new Stripe SDK",
    "source_id": "unique-identifier-for-dedup",
    "repo_branch": "main"
  }'
```
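The same request can be made from Python with only the standard library. This is a minimal sketch: the endpoint is assumed to return a JSON body (the exact response schema is not documented here), so adjust the return handling to the actual API.

```python
import json
import urllib.request


def create_task(base_url: str, token: str, prompt: str,
                source_id: str, repo_branch: str = "main") -> dict:
    """POST a manual task to the Maur API and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{base_url}/tasks",
        data=json.dumps({
            "prompt": prompt,
            "source_id": source_id,
            "repo_branch": repo_branch,
        }).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```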
POST /webhooks/exception — Exception alert
Send a production error for the agent to fix. `fingerprint` is used for deduplication — tasks with the same fingerprint that are already pending or in progress are rejected.
```shell
curl -X POST http://localhost:8000/webhooks/exception \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "fingerprint": "KeyError-user-profile-views-42",
    "title": "KeyError: '\''email'\'' in user_profile view",
    "description": "Traceback (most recent call last):\n ...",
    "repo_branch": "main",
    "extra": {"environment": "production", "user_id": 123}
  }'
```
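A good fingerprint is stable across repeat occurrences of the same error but distinct between different errors. One illustrative way to derive one (this helper is not part of Maur, just a sketch) is hashing the exception type together with the innermost traceback frame:

```python
import hashlib
import traceback


def exception_fingerprint(exc: BaseException) -> str:
    """Derive a stable dedup key from the exception type and its
    innermost traceback frame (file, function, line)."""
    frames = traceback.extract_tb(exc.__traceback__)
    last = frames[-1] if frames else None
    location = f"{last.filename}:{last.name}:{last.lineno}" if last else "no-traceback"
    raw = f"{type(exc).__name__}|{location}"
    digest = hashlib.sha256(raw.encode()).hexdigest()[:12]
    return f"{type(exc).__name__}-{digest}"
```

Because the line number is part of the key, the fingerprint changes when the code moves, which is usually acceptable: a moved line means a new deploy and a fresh fix attempt is reasonable.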
GET /tasks — List tasks
Returns the 50 most recent tasks.
GET /tasks/{task_id} — Get task
Returns the status and result of a specific task.
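A caller that submits a task usually wants to wait for the result. This polling helper is a sketch only: it assumes the task payload carries a `status` field with values like `"pending"` and `"in_progress"` (an assumption based on the dedup description above; the real field names and values may differ).

```python
import json
import time
import urllib.request


def wait_for_task(base_url: str, token: str, task_id: str,
                  timeout: float = 600.0, poll_interval: float = 5.0) -> dict:
    """Poll GET /tasks/{task_id} until the task leaves an active state,
    then return the final task payload."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        req = urllib.request.Request(
            f"{base_url}/tasks/{task_id}",
            headers={"Authorization": f"Bearer {token}"},
        )
        with urllib.request.urlopen(req) as resp:
            task = json.loads(resp.read())
        # Assumed status values; adjust to the actual API responses.
        if task.get("status") not in ("pending", "in_progress"):
            return task
        time.sleep(poll_interval)
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")
```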
GET /health
Returns "ok". Used for health checks.
Customisation
Optional integrations
Webhook routes are only registered when their corresponding env vars are present. Set the vars for the integrations you want; leave them unset to disable.
Grafana alerts — POST /grafana/webhook
Point a Grafana contact point at this URL.
| Variable | Required | Description |
|---|---|---|
| `MATCH_LABELS` | No | JSON object to filter alerts by label (e.g. `{"severity":"critical"}`) |
| `REPO_BRANCH` | No | Branch to create fixes on (default: `main`) |
| `PROMPT_TEMPLATE` | No | Custom prompt — placeholders: `{body}`, `{alert}`, `{tracebacks}` |
| `LOKI_URL` | No | Enables automatic log fetching for the alert |
| `LOKI_TOKEN` | No | Loki auth token (required when `LOKI_URL` is set) |
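A `PROMPT_TEMPLATE` presumably has its placeholders substituted into the final agent prompt. The sketch below assumes `str.format`-style substitution, which matches the `{body}`/`{alert}`/`{tracebacks}` syntax shown above; the actual rendering mechanism inside Maur is an assumption.

```python
# Hypothetical rendering of a PROMPT_TEMPLATE; values are made up.
template = (
    "A Grafana alert fired:\n{alert}\n\n"
    "Relevant logs:\n{tracebacks}\n\n"
    "Full payload:\n{body}"
)

prompt = template.format(
    alert="HighErrorRate on payments-api",
    tracebacks="KeyError: 'email' in user_profile view ...",
    body='{"status": "firing"}',
)
```

Keeping the template in an env var means the prompt can be tuned per deployment without a code change.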
Slack events — POST /slack/webhook
Point your Slack app's Event Subscriptions at this URL. Handles app_mention and message events.
| Variable | Required | Description |
|---|---|---|
| `SLACK_SIGNING_SECRET` | Yes (enables route) | Signing secret from your Slack app settings |
| `CHANNEL_FILTER` | No | Only handle messages from this channel ID |
| `REPO_BRANCH` | No | Branch to create fixes on (default: `main`) |
| `PROMPT_TEMPLATE` | No | Custom prompt — placeholders: `{channel}`, `{text}` |
Linear issues — POST /linear/webhook
Point a Linear webhook at this URL. Triggers on issue create/update and posts a comment back with the task ID.
| Variable | Required | Description |
|---|---|---|
| `LINEAR_WEBHOOK_SECRET` | Yes (enables route) | Webhook signing secret from Linear |
| `LINEAR_API_KEY` | Yes | Linear API key — used to post a comment back on the issue |
| `LABEL_FILTER` | No | Only handle issues with this label (e.g. `maur`) |
| `REPO_BRANCH` | No | Branch to create fixes on (default: `main`) |
| `PROMPT_TEMPLATE` | No | Custom prompt — placeholders: `{title}`, `{description}` |
Custom webhooks
If you need to handle a webhook source not covered by the built-in integrations, extend the FastAPI app directly and point maur_api at your module.
Create a file in your project, e.g. `my_project/app.py`:
```python
from fastapi import APIRouter, HTTPException

from maur.app import app
from maur.models import CodingTask
from maur.repos import CodingTaskRepoDep, DuplicateTaskError
from maur.components import worker_tasks, CodingTaskMessage

router = APIRouter()


@router.post("/webhooks/my-service")
async def my_service_webhook(payload: dict, repo: CodingTaskRepoDep):
    try:
        task = await repo.insert(CodingTask(
            source="my-service",
            source_id=payload["id"],  # unique identifier for deduplication
            repo_branch="main",
            prompt=f"Handle this event: {payload}",
        ))
        # Let the worker know that it can start on the task.
        await worker_tasks.publish(CodingTaskMessage(task_id=task.id))
    except DuplicateTaskError as e:
        raise HTTPException(
            status_code=409,
            detail={"message": "Task already in progress", "task_id": str(e.existing_task_id)},
        )


app.include_router(router)
```
Then pass the module path to `maur_api`:
```python
from takk import Project
from maur.components import maur_api, maur_code_subscriber

project = Project(
    name="your-project",
    maur_api=maur_api(app_module="my_project.app"),
    maur_coder=maur_code_subscriber(git_provider="github"),
)
```
`takk` will serve your module instead of `maur.app` directly, so all existing routes and lifespan logic remain intact alongside your additions.
Override the infrastructure
| Variable | Default | Description |
|---|---|---|
| `DB_URI` | Provisioned by `takk` | PostgreSQL (`postgresql://...`) or MySQL (`mysql://...`) connection URI |
| `NATS_URI` | Provisioned by `takk` | NATS connection URI (`nats://...`) |
Changing the LLM model
Set `MAUR_LLM_MODEL` to any model available through your `MAUR_LLM_API` provider. The worker uses OpenCode with an OpenAI-compatible provider, so any model exposed via that protocol works.
```shell
MAUR_LLM_MODEL=devstral-2-123b-instruct-2512
```
Adjusting worker compute resources
The default worker is allocated 3 GB of memory. Override this via the `compute` argument:
```python
from takk.models import Compute
from maur.components import maur_code_subscriber

maur_coder=maur_code_subscriber(
    compute=Compute(mb_memory_limit=1024 * 8)  # 8 GB
)
```
Passing additional secrets to the worker
If your target repository requires environment variables at build or runtime (e.g. private package indexes), use `maur_code_subscriber_with_secrets` and pass the full list of secrets explicitly:
```python
from maur.components import maur_code_subscriber_with_secrets
from maur.settings import GithubSettings, MaurSettings, MaurLLMSettings, PostgresSettings
from takk.secrets import NatsConfig
from my_project.settings import MyPrivateRegistrySettings

maur_coder=maur_code_subscriber_with_secrets(
    secrets=[
        PostgresSettings,
        MaurSettings,
        MaurLLMSettings,
        GithubSettings,
        NatsConfig,
        MyPrivateRegistrySettings,
    ]
)
```
Running without takk
takk is the easiest way to run and deploy Maur, but you can run both components directly if you prefer to manage infrastructure yourself.
Docker Compose
The quickest way to run without takk is with Docker Compose. Create a `docker-compose.yml`:
```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: maur
      POSTGRES_PASSWORD: maur
      POSTGRES_DB: maur
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U maur"]
      interval: 5s
      retries: 5

  nats:
    image: nats:latest
    command: ["-js"]
    ports:
      - "4222:4222"

  api:
    build:
      context: .
      dockerfile: Dockerfile.api
    ports:
      - "8000:8000"
    environment:
      DB_URI: postgresql+asyncpg://maur:maur@db/maur
      NATS_URI: nats://nats:4222
      MAUR_ADMIN_TOKEN: your-secret-token
    depends_on:
      db:
        condition: service_healthy
      nats:
        condition: service_started

  worker:
    build:
      context: .
      dockerfile: Dockerfile.worker
    environment:
      DB_URI: postgresql+asyncpg://maur:maur@db/maur
      NATS_URI: nats://nats:4222
      MAUR_ADMIN_TOKEN: your-secret-token
      MAUR_LLM_API: https://your-llm-provider/v1
      MAUR_LLM_TOKEN: your-llm-token
      GITHUB_REPO_URL: https://github.com/your-org/your-repo
      GITHUB_TOKEN: ghp_...
    depends_on:
      db:
        condition: service_healthy
      nats:
        condition: service_started

volumes:
  postgres_data:
```
Then run:
```shell
docker compose up
```
See the API and Worker sections below for the corresponding Dockerfiles.
API
Install the package and start the FastAPI app with `uvicorn`:
```shell
pip install maur
uvicorn maur.app:app --host 0.0.0.0 --port 8000
```
You must supply all required environment variables manually — takk won't provision anything:
```shell
DB_URI=postgresql+asyncpg://user:pass@localhost/maur
NATS_URI=nats://localhost:4222
MAUR_ADMIN_TOKEN=your-secret-token
```
Worker
The worker calls `process_coding_task` for each NATS message. It requires a number of system-level dependencies (`git`, `grep`, `curl`, `unzip`, `jq`, `bash`) and OpenCode to be installed in the environment.
A minimal Dockerfile for the worker:
```dockerfile
FROM python:3.12-slim

RUN apt-get update && apt-get install -y \
    curl ca-certificates bash git libstdc++6 libgcc-s1 unzip jq grep \
    && rm -rf /var/lib/apt/lists/*

# Install OpenCode
RUN curl -fsSL https://opencode.ai/install | bash

RUN pip install maur
```
You then need to wire up a NATS consumer that deserialises the message as `CodingTaskMessage` and calls `process_coding_task`:
```python
import asyncio
import json

import nats

from maur.worker import CodingTaskMessage, process_coding_task


async def main():
    nc = await nats.connect("nats://localhost:4222")
    js = nc.jetstream()

    async def handler(msg):
        task_msg = CodingTaskMessage.model_validate(json.loads(msg.data))
        await process_coding_task(task_msg)
        # Ack only after processing succeeds, so failed tasks are redelivered.
        await msg.ack()

    await js.subscribe("maur.tasks", cb=handler, durable="maur-worker")
    await asyncio.Event().wait()


asyncio.run(main())
```
The worker also requires the database and Git provider environment variables to be set:
```shell
DB_URI=postgresql+asyncpg://user:pass@localhost/maur
NATS_URI=nats://localhost:4222
MAUR_ADMIN_TOKEN=your-secret-token
MAUR_LLM_API=https://...
MAUR_LLM_TOKEN=your-llm-token
GITHUB_REPO_URL=https://github.com/your-org/your-repo
GITHUB_TOKEN=ghp_...
```
Development
```shell
# Install dependencies
uv sync --all-groups
```

Generate a `.env` file and fill in the required variables:

```shell
uv run takk dotenv
```

Start the full stack locally:

```shell
uv run takk up
```

When you want to test a change, run the integration test suite:

```shell
uv run takk test
```

```shell
# Lint
uv run ruff check .

# Type check
uv run ty check
```