Quartermaster — modular AI agent orchestration framework. Install this to get all packages.
# Quartermaster SDK

Modular AI agent orchestration framework by MindMade.
Quartermaster lets you build AI agent workflows as directed graphs — define nodes (LLM calls, decisions, user input, tools), connect them with edges, and execute them with a pluggable engine.
## Quick Install

```shell
# Core framework (graph + providers + tools + nodes + engine)
pip install quartermaster-sdk

# With OpenAI support (quoted so shells like zsh don't expand the brackets)
pip install "quartermaster-sdk[openai]"

# With everything (all providers, all tools, MCP client, code runner)
pip install "quartermaster-sdk[all]"
```
## Quick Start (local Ollama, zero config)

```shell
ollama pull gemma4:26b  # or any model you've pulled
```

```python
from quartermaster_sdk import Graph, FlowRunner, register_local

provider_registry = register_local(
    "ollama",
    base_url="http://localhost:11434",  # or set $OLLAMA_HOST
    default_model="gemma4:26b",
)

graph = Graph("chat").start().user().agent().end().build()
runner = FlowRunner(graph=graph, provider_registry=provider_registry)

result = runner.run("Hello, what time is it?")
print(result.final_output)
```
## Quick Start (cloud provider)

```python
from quartermaster_sdk import Graph

agent = (
    Graph("My Agent")
    .start()
    .user("What can I help you with?")
    .instruction("Respond", model="gpt-4o", system_instruction="You are a helpful assistant.")
    .end()
    .build()
)
```
## Sync chat shim (no graph needed)

For one-shot LLM calls from sync code (Celery workers, Django views, CLI scripts), no `asgiref.async_to_sync` wrapper is required:
```python
from quartermaster_providers.providers.local import OllamaProvider

provider = OllamaProvider(default_model="gemma4:26b")
result = provider.chat(
    messages=[{"role": "user", "content": "Hello!"}],
    max_output_tokens=128,
    thinking_level="off",
)
print(result.content)  # promoted from `reasoning` if `content` is empty
print(result.usage)    # {prompt_tokens, completion_tokens, total_tokens}
```
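The `usage` mapping is plain data, so it composes with ordinary Python. As one illustration, a small helper (hypothetical, not part of the SDK) can accumulate the token counts returned across several `chat()` calls:

```python
def add_usage(total: dict, usage: dict) -> dict:
    """Accumulate the token-count fields returned in result.usage."""
    for key in ("prompt_tokens", "completion_tokens", "total_tokens"):
        total[key] = total.get(key, 0) + usage.get(key, 0)
    return total

running = {}
add_usage(running, {"prompt_tokens": 12, "completion_tokens": 30, "total_tokens": 42})
add_usage(running, {"prompt_tokens": 8, "completion_tokens": 10, "total_tokens": 18})
# running == {"prompt_tokens": 20, "completion_tokens": 40, "total_tokens": 60}
```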
## Packages

| Package | Description |
|---|---|
| `quartermaster-graph` | Graph schema, builder API, validation |
| `quartermaster-providers` | LLM provider abstraction (OpenAI, Anthropic, Google, Groq, local) |
| `quartermaster-tools` | Tool definition, registry, built-in tools |
| `quartermaster-nodes` | Node execution protocols and implementations |
| `quartermaster-engine` | Flow execution, traversal, memory, streaming |
| `quartermaster-mcp-client` | MCP protocol client (standalone) |
| `quartermaster-code-runner` | Docker-sandboxed code execution (standalone) |
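The meta-package pulls in everything. Assuming the subpackages above are also published individually on PyPI (this page does not confirm it), you could install only the pieces you need:

```shell
# Assumption: each subpackage listed above is published separately.
# Install just the graph schema and the execution engine:
pip install quartermaster-graph quartermaster-engine

# The MCP client and code runner are described as standalone:
pip install quartermaster-mcp-client
```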
## Documentation

See the `docs/` directory in the repository.
## License

Apache 2.0
## Download files
### File details: `quartermaster_sdk-0.2.0.tar.gz`

**File metadata**

- Download URL: quartermaster_sdk-0.2.0.tar.gz
- Upload date:
- Size: 10.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.15

**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `2f072b268045c4bfe80f68e98c9d52cedc8118daa970f2c21d141b43456f5509` |
| MD5 | `38c1bd3504e59ec84acf76f9fdf2e182` |
| BLAKE2b-256 | `dd86b177dd343e8548322907bef6d66ddb633994639f90e1e2960fe2e238db22` |
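To check a downloaded file against the digests published here, you can hash it locally. This is a generic sketch using Python's standard `hashlib` (not part of the SDK):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest for the sdist:
# sha256_of("quartermaster_sdk-0.2.0.tar.gz")
#   should equal 2f072b268045c4bfe80f68e98c9d52cedc8118daa970f2c21d141b43456f5509
```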
### File details: `quartermaster_sdk-0.2.0-py3-none-any.whl`

**File metadata**

- Download URL: quartermaster_sdk-0.2.0-py3-none-any.whl
- Upload date:
- Size: 8.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.15

**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `c05d6fd44b7b43f33e1a67e2a2dad985177a27e6d355828ec83a437b3e10d9aa` |
| MD5 | `4ecf54de5c48cf5bb176e84cd42268ad` |
| BLAKE2b-256 | `4c7cf150cd0f84c3b5ae61bad95fcc765e758ccb0d81b47106f9878dd9d9c38a` |