# 🤖 Model-Compose

model-compose is a declarative AI workflow orchestrator inspired by docker-compose. It lets you define and run AI model pipelines with simple YAML files: no custom code required. Effortlessly connect external AI services (OpenAI, Anthropic, Google, etc.), run local AI models, integrate vector stores, and more, all within powerful, composable workflows.

No custom code. Just YAML configuration.
## ✨ Features

### 🎨 No-Code AI Orchestration

Define complex AI workflows entirely in YAML: no Python, no JavaScript, no coding required. Connect multiple AI services, models, and APIs through simple declarative configuration.

### 🔗 Universal AI Service Integration

Connect to any AI provider out of the box: OpenAI, Anthropic Claude, Google Gemini, ElevenLabs, Stability AI, Replicate, or any custom HTTP API. Mix and match services in a single workflow.

### 🤖 Agent Components

Build autonomous AI agents that use workflows as tools. Agents can reason, plan, and execute multi-step tasks by dynamically invoking other workflows, all defined declaratively in YAML.

### ✋ Human-in-the-Loop

Add approval gates and user-input steps to any workflow with interrupt configuration. Workflows pause, prompt for human input via the CLI, Web UI, or API, and resume seamlessly: perfect for review, moderation, and supervised AI pipelines.

### 🖥️ Local Model Execution

Run models from HuggingFace and other sources locally with native support for transformers, PyTorch, and model-serving frameworks. Fine-tune models with LoRA/PEFT and train on custom datasets, all through YAML configuration.

### ⚡ Real-Time Streaming

Built-in SSE (Server-Sent Events) streaming for real-time AI responses. Stream from OpenAI, Claude, local models, or any streaming API with automatic chunking and connection management.
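Under the hood, SSE is a plain newline-delimited text protocol, which is what makes it easy to stream through a workflow. The sketch below is a generic, minimal Python parser for the SSE wire format, not model-compose's actual implementation:

```python
def parse_sse(lines):
    """Yield the data payload of each SSE event.

    Minimal sketch of the SSE wire format: `data:` lines accumulate
    until a blank line terminates the event. Real clients also handle
    `event:`, `id:`, `retry:` fields and reconnection.
    """
    data = []
    for line in lines:
        if line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "" and data:
            yield "\n".join(data)
            data = []

# A stream of two events, as a server would send them:
stream = ["data: Hel", "", "data: lo!", ""]
print(list(parse_sse(stream)))  # → ['Hel', 'lo!']
```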
### 🔄 Advanced Workflow Composition

Build multi-step pipelines with conditional logic, data transformation, and parallel execution. Pass data between jobs with powerful variable binding (`${input}`, `${response}`, `${env}`), including type conversion and defaults.
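Conceptually, variable binding is template substitution over the job context. The Python below is only an illustration of that idea, not model-compose's actual resolver; the dotted-path `${...}` syntax is the only part taken from this README, and the type-conversion/default syntax is not shown here:

```python
import re

def resolve(template, context):
    """Substitute ${dotted.path} placeholders from a nested context dict.

    Illustrative sketch only: the real engine also performs type
    conversion and applies defaults.
    """
    def lookup(match):
        value = context
        for key in match.group(1).split("."):
            value = value[key]  # walk one level per dotted segment
        return str(value)
    return re.sub(r"\$\{([^}]+)\}", lookup, template)

context = {"input": {"prompt": "Hello"}, "env": {"OPENAI_API_KEY": "sk-..."}}
print(resolve("content: ${input.prompt}", context))  # → content: Hello
```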
### 🚀 Production-Ready Deployment

Deploy as an HTTP REST API or an MCP (Model Context Protocol) server by changing one line. Includes concurrency control, health checks, and automatic API documentation.

### 🎯 Event-Driven Architecture

HTTP callback listeners for async workflows (image generation, video processing). HTTP trigger listeners for webhooks and external events. Build reactive AI systems that respond to real-world events.

### 🌐 Smart Tunneling & Gateways

Expose local services to the internet instantly with ngrok, Cloudflare, or SSH tunnels. Perfect for webhook integration and public API deployment without complex networking.

### 🐳 Container-Native Deployment

First-class Docker support with runtime configuration, volume mounting, and environment management. Deploy to any cloud provider or Kubernetes cluster with minimal configuration.

### 🎨 Instant Web UI

Add a visual interface with just two lines of configuration: a Gradio-powered chat UI or a custom static frontend. Test workflows, monitor executions, and debug pipelines visually.

### 🗄️ RAG & Vector Database Ready

Native integration with ChromaDB, Milvus, Pinecone, and Weaviate. Build retrieval-augmented generation (RAG) systems with embedding search, document indexing, and semantic retrieval.

### 🔧 Flexible Component System

Reusable components with multi-action support: define once, use everywhere. Mix HTTP clients, local models, vector stores, shell commands, and custom workflows in any combination.
## 📦 Installation

```bash
pip install model-compose
```

Or install from source:

```bash
git clone https://github.com/hanyeol/model-compose.git
cd model-compose
pip install -e .
```

Requires Python 3.9 or higher.
## 🚀 Quick Start

Create a `model-compose.yml`:

```yaml
controller:
  type: http-server
  port: 8080
  webui:
    port: 8081

workflows:
  - id: chat
    default: true
    jobs:
      - component: chatgpt

components:
  - id: chatgpt
    type: http-client
    base_url: https://api.openai.com/v1
    path: /chat/completions
    method: POST
    headers:
      Authorization: Bearer ${env.OPENAI_API_KEY}
    body:
      model: gpt-4o
      messages:
        - role: user
          content: ${input.prompt}
```

Create a `.env` file:

```bash
OPENAI_API_KEY=your-key
```

Run it:

```bash
model-compose up
```

Your API is now live at http://localhost:8080 and the Web UI at http://localhost:8081 🎉
## 🎯 Powerful Yet Simple

### 🖥️ Add a Web UI with 2 Lines

```yaml
controller:
  webui:
    port: 8081
```

### 🛰️ Switch to an MCP Server with 1 Line

```yaml
controller:
  type: mcp-server
```

### 🔄 Run Components in Separate Processes

```yaml
component:
  runtime: process
```

### 🐳 Deploy in Docker with 1 Line

```yaml
controller:
  runtime: docker
```
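These switches compose. For example, serving workflows over MCP from inside a container combines two of the one-liners above (a sketch reusing only keys already shown in this README):

```yaml
controller:
  type: mcp-server   # serve workflows over MCP instead of REST
  runtime: docker    # run the controller in a container
```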
💡 Explore examples for more workflows or read the User Guide.
## 🏗 Architecture

## 🤝 Contributing

We welcome all contributions! Whether it's fixing bugs, improving docs, or adding examples, every bit helps.

```bash
# Setup for development
git clone https://github.com/hanyeol/model-compose.git
cd model-compose
pip install -e ".[dev]"
```
## 📄 License

MIT License © 2025-2026 Hanyeol Cho.

## 📬 Contact

Have questions, ideas, or feedback? Open an issue or start a discussion on GitHub Discussions.
## Download files
### File details

Details for the file `model_compose-0.4.29.tar.gz`.

#### File metadata

- Download URL: model_compose-0.4.29.tar.gz
- Upload date:
- Size: 159.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.4

#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6298400217c7059b5192611a7d1859216fbdecf1b8d6c9276499fc9ac027be2b` |
| MD5 | `73c323dd438baa4ebe35d17b702849e3` |
| BLAKE2b-256 | `90ed06615f492fa4234526d9010237dfa57770ae49a0fbe98a20d1d080d87790` |
### File details

Details for the file `model_compose-0.4.29-py3-none-any.whl`.

#### File metadata

- Download URL: model_compose-0.4.29-py3-none-any.whl
- Upload date:
- Size: 298.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.4

#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6ba35dd60c23ad81fd2344efb875d1679d8771374d86dfd42c085a334d3936b2` |
| MD5 | `5f7a9c239b3830b6c7cf5dba53cf5518` |
| BLAKE2b-256 | `2248a124ccf88d32a6e8dcd288369040f86e76c8c5944da3fcd53aca30b97f93` |