# Binex

**Debuggable runtime for AI agent pipelines.** An open-source visual orchestrator for AI agent workflows: build, run, debug, and replay multi-agent pipelines — 100% locally.
## Demo
Full workflow: drag & drop nodes, configure models and prompts, run with human input, see results, debug, trace, and lineage — all in the browser.
## What is Binex?

Binex is an open-source, fully local runtime for AI agent workflows. No cloud. No telemetry. No vendor lock-in.

```shell
pip install binex
binex ui
```

That's it. Browser opens. You're building AI workflows.
## Why Binex?
- 100% local — your data never leaves your machine
- 100% open source — MIT licensed, audit every line
- Zero telemetry — no tracking, no analytics, no surprises
- Full debuggability — every input, output, prompt, and cost is visible
- Any model — OpenAI, Anthropic, Google, Ollama, OpenRouter, DeepSeek, and 40+ more via LiteLLM
## Installation

```shell
pip install binex
```

With extras (quote the brackets so your shell doesn't expand them):

```shell
pip install "binex[langchain]"   # LangChain Runnables
pip install "binex[crewai]"      # CrewAI Crews
pip install "binex[autogen]"     # AutoGen Teams
pip install "binex[telemetry]"   # OpenTelemetry tracing
pip install "binex[rich]"        # Rich colored CLI output
```
## Web UI

Launch the visual workflow editor:

```shell
binex ui
```
### Visual Drag & Drop Editor

Build workflows visually — drag nodes from the palette, connect them, and configure models and prompts inline.

- 6 node types: LLM Agent, Local Script, Human Input, Human Approve, Human Output, A2A Agent
- 20+ preset models, including 8 free OpenRouter models
- Built-in prompt library (Planner, Researcher, Analyzer, Writer, Reviewer, Summarizer)
- Switch between Visual and YAML modes — changes sync both ways
- Real-time cost estimation as you build
- Custom model input — use any LiteLLM-compatible model
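Real-time cost estimation is simple token arithmetic: multiply each node's input and output token counts by the model's per-token price, then sum across nodes. A minimal sketch; the model names, prices, and token counts below are illustrative placeholders, not Binex's actual pricing data:

```python
# Illustrative per-million-token prices (input, output) in USD.
# These are example numbers, not Binex's real pricing table.
PRICES = {
    "gpt-4o-mini": (0.15, 0.60),
    "claude-3-5-haiku": (0.80, 4.00),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate one LLM call's cost in USD from token counts."""
    in_price, out_price = PRICES[model]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000

# A three-node pipeline estimate: sum the per-node costs.
total = sum(
    estimate_cost(model, tokens_in, tokens_out)
    for model, tokens_in, tokens_out in [
        ("gpt-4o-mini", 1_200, 400),
        ("gpt-4o-mini", 3_000, 800),
        ("claude-3-5-haiku", 2_000, 600),
    ]
)
```

The same arithmetic works before a run (estimated token counts) and after it (actual counts from provider usage data).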
### 18 Pages — Full CLI Parity
| Category | Pages |
|---|---|
| Workflows | Browse, Visual Editor, Scaffold Wizard |
| Runs | Dashboard, RunLive (SSE), RunDetail |
| Analysis | Debug (input/output artifacts), Trace (Gantt timeline), Diagnose (root-cause), Lineage (artifact graph) |
| Comparison | Diff (side-by-side), Bisect (find divergence) |
| Costs | Cost Dashboard (charts), Budget Management |
| System | Doctor (health), Plugins, Gateway, Export |
### Replay
Debug any node → click Replay → swap the model or prompt → re-run just that node. No re-running the entire pipeline.
## Quickstart

### CLI

```shell
# Zero-config demo
binex hello

# Run a workflow
binex run examples/simple.yaml

# Inspect the run
binex debug latest
binex trace latest
```

### Web UI

```shell
binex ui
```
## Create a Workflow

```yaml
name: research-pipeline
nodes:
  input:
    agent: "human://input"
    outputs: [output]
  planner:
    agent: "llm://gemini/gemini-2.5-flash"
    system_prompt: "Break this topic into research questions"
    depends_on: [input]
    outputs: [output]
  researcher:
    agent: "llm://openrouter/google/gemma-3-27b-it:free"
    system_prompt: "Investigate and report findings"
    depends_on: [planner]
    outputs: [output]
  output:
    agent: "human://output"
    depends_on: [researcher]
    outputs: [output]
```
## Features

### Agent Adapters

| Prefix | Description |
|---|---|
| `local://` | In-process Python callable |
| `llm://` | LLM via LiteLLM (40+ providers) |
| `a2a://` | Remote agent via the A2A protocol |
| `human://input` | Free-text input from the user |
| `human://approve` | Approval gate with conditional branching |
| `human://output` | Display results to the user |
| `langchain://` | LangChain Runnable (plugin) |
| `crewai://` | CrewAI Crew (plugin) |
| `autogen://` | AutoGen Team (plugin) |
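A `local://` node is backed by an in-process Python callable. This README doesn't spell out the exact signature Binex expects, so the function below only illustrates the general shape of a deterministic local step: plain Python that maps upstream text to this node's output, with no LLM call involved.

```python
def word_count(text: str) -> str:
    """A deterministic local step: report input size instead of calling an LLM.

    A callable like this could sit between LLM nodes, e.g. to gate or
    annotate their outputs. (Hypothetical example, not a Binex API.)
    """
    words = len(text.split())
    return f"{words} words, {len(text)} characters"

print(word_count("debuggable AI workflows"))  # → "3 words, 23 characters"
```

Because a local step runs in-process, it shows up in debug, trace, and lineage views like any other node, but costs nothing.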
### CLI Commands

| Command | Description |
|---|---|
| `binex run` | Execute a workflow |
| `binex ui` | Launch the Web UI |
| `binex debug` | Post-mortem inspection |
| `binex trace` | Execution timeline |
| `binex replay` | Re-run with agent swaps |
| `binex diff` | Compare two runs |
| `binex cost show` | Cost breakdown per node |
| `binex explore` | Interactive TUI dashboard |
| `binex scaffold` | Generate a workflow from a DSL |
| `binex export` | Export to CSV/JSON |
| `binex doctor` | System health check |
| `binex hello` | Zero-config demo |
### LLM Providers
OpenAI · Anthropic · Google Gemini · Ollama · OpenRouter · Groq · Mistral · DeepSeek · Together AI
## Examples

| Example | What it demonstrates |
|---|---|
| `simple.yaml` | Minimal two-node pipeline |
| `diamond.yaml` | Diamond dependency pattern |
| `fan-out-fan-in.yaml` | Parallel execution with aggregation |
| `human-in-the-loop.yaml` | Approval gates and conditional branching |
| `multi-provider-demo.yaml` | Multiple LLM providers in one workflow |
| `ollama-research.yaml` | Full research pipeline with Ollama + OpenRouter |
| `langchain-summarizer.yaml` | LangChain Runnable in a pipeline |
| `crewai-research-crew.yaml` | CrewAI Crew as a workflow node |
| `autogen-coding-team.yaml` | AutoGen Team for code generation |
## Architecture

```
src/binex/
├── adapters/       # Agent backends (local, LLM, A2A, human, frameworks)
├── cli/            # Click CLI commands
├── graph/          # DAG construction + topological scheduling
├── models/         # Pydantic v2 domain models
├── plugins/        # Plugin registry for custom adapters
├── prompts/        # 121 built-in prompt templates
├── runtime/        # Orchestrator, dispatcher, replay engine
├── stores/         # SQLite execution store + filesystem artifacts
├── trace/          # Debug, lineage, timeline, diffing
├── ui/             # FastAPI backend + React frontend
│   ├── api/        # 20 REST endpoints
│   └── static/     # Pre-built React app
└── workflow_spec/  # YAML loader + validator
```
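The `graph/` module pairs DAG construction with topological scheduling: a node becomes runnable once all of its `depends_on` edges are satisfied. A generic sketch of that scheduling order using Kahn's algorithm, shown on the research-pipeline example's dependencies (this is illustrative, not Binex's scheduler code):

```python
from collections import deque

def topo_order(depends_on: dict[str, list[str]]) -> list[str]:
    """Kahn's algorithm: one valid execution order for a workflow DAG."""
    indegree = {node: len(deps) for node, deps in depends_on.items()}
    dependents: dict[str, list[str]] = {node: [] for node in depends_on}
    for node, deps in depends_on.items():
        for dep in deps:
            dependents[dep].append(node)

    ready = deque(sorted(n for n, k in indegree.items() if k == 0))
    order: list[str] = []
    while ready:
        node = ready.popleft()
        order.append(node)
        # A node whose last dependency just finished becomes runnable.
        for child in dependents[node]:
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)
    if len(order) != len(depends_on):
        raise ValueError("cycle detected in workflow graph")
    return order

# The research-pipeline example from above:
print(topo_order({
    "input": [], "planner": ["input"],
    "researcher": ["planner"], "output": ["researcher"],
}))  # → ['input', 'planner', 'researcher', 'output']
```

In a real scheduler, every node in the ready queue can be dispatched concurrently, which is what makes fan-out/fan-in patterns run in parallel.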
## Documentation
Full docs at alexli18.github.io/binex
## Contributing
Contributions are welcome! If you find this useful:
- Star the repo — it takes 1 second and helps more than you know
- Open issues — tell me what's broken or what you need
- Submit PRs — let's build this together
I'm a solo developer building this in the open. Every star, issue, and PR makes a real difference.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
## License
Distributed under the MIT License. See LICENSE for more information.
No cloud. No telemetry. No surprises. Just debuggable AI workflows.