A small CLI that lets Claude Code reach non-Anthropic models — Ollama, Perplexity, OpenAI, Gemini — through a sub-agent that knows how to call them.
# 🐉 Dragoman
Give Claude Code access to every other AI model you pay for.
You use Claude Code. You also pay for Perplexity, OpenAI, Gemini, or run Ollama locally. Right now Claude can't reach any of them. Dragoman fixes that — it plugs into Claude Code's sub-agent system and lets it route questions to the right model, automatically.
## What does that look like?
You're in Claude Code and ask: "What happened in the news today about the OpenAI acquisition?"
Instead of "I don't have web access," Claude recognizes this is a search question, hands it to Dragoman, and Dragoman routes it to Perplexity — which was built for exactly that. The answer comes back into your Claude session with sources.
More things you can do:
- 🔍 Search the web — "Find me sources on X," "what's the latest on Y" → routes to Perplexity
- 🧠 Ask another model — "What would GPT-5 say?" or "Try Gemini on this" → routes to that provider
- 🏠 Stay local — "Run this through Llama on my machine" → routes to Ollama, nothing leaves your network
- 🔀 Fan out — Send the same question to four models and have Claude synthesize the best answer
- 🎯 Upgrade selectively — Coding in Sonnet, but need Opus + GPT-5 for a deep research question? Just ask
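Conceptually, the routing above boils down to matching a prompt against per-provider cues. A minimal sketch — the provider names and keywords here are illustrative, not Dragoman's actual heuristics:

```python
# Hypothetical prompt-to-provider routing. The keyword lists are
# illustrative only; Dragoman's real dispatch logic is not shown here.
ROUTES = {
    "perplexity": ("news", "latest", "sources", "search"),
    "ollama":     ("local", "llama", "on my machine"),
    "gemini":     ("gemini",),
}

def route(prompt: str, default: str = "openai") -> str:
    """Pick a provider by scanning the prompt for routing keywords."""
    lowered = prompt.lower()
    for provider, keywords in ROUTES.items():
        if any(k in lowered for k in keywords):
            return provider
    return default

print(route("What's the latest on the OpenAI acquisition?"))  # perplexity
```

In practice the sub-agent persona does this classification in natural language, which is why "Try Gemini on this" works without any keyword list at all.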
Example — multi-model editing pipeline in one prompt:
"Take my draft. Send it to Opus for a deep prose edit. Then send the result to o3 for structural consistency. Then send that to Perplexity for citations and prior art. Apply fixes after each step."
Claude chains three sub-agents in sequence — each calls the right model via Dragoman, applies the feedback, and passes the improved draft to the next. One prompt, three models, fully automated.
## Install
```sh
# Homebrew (macOS)
brew install asakin/tap/dragoman

# uv (fastest)
uv tool install dragoman-ai

# pipx
pipx install dragoman-ai
```
Then run the setup wizard:
```sh
dragoman init
```
It walks you through connecting your providers — API keys, Ollama hosts, whatever you use — and installs the sub-agent persona into Claude Code. Open a fresh Claude Code session and try: "What's the latest news on [topic]?"
## Your keys never leave your machine
Dragoman resolves API keys at call time from wherever you already keep them, uses each key for one request, and throws it away. The key never enters Claude's context window.
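One way to picture this: the config stores only a reference, and the reference is dereferenced once per request. A sketch of that resolution step — `op read` and `security find-generic-password` are the real CLI invocations for those backends, but the function itself is illustrative, not Dragoman's implementation:

```python
import os
import subprocess

def resolve_key(ref: str) -> str:
    """Resolve a secret reference at call time; the plaintext is never stored."""
    if ref.startswith("env:"):
        return os.environ[ref[len("env:"):]]
    if ref.startswith("op://"):
        # 1Password CLI: `op read op://vault/item/field`
        return subprocess.run(
            ["op", "read", ref], capture_output=True, text=True, check=True
        ).stdout.strip()
    if ref.startswith("keychain://"):
        # macOS Keychain: map service/account onto `security find-generic-password`
        service, account = ref[len("keychain://"):].split("/", 1)
        return subprocess.run(
            ["security", "find-generic-password", "-s", service, "-a", account, "-w"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
    return ref  # bare value: use as-is (e.g. no key needed for local Ollama)
```

The resolved string goes straight into the outbound request header and is then discarded, so it never appears in any prompt or transcript.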
```toml
# ~/.config/dragoman/config.toml

[providers.perplexity]
type = "openai_compat"
host = "https://api.perplexity.ai"
api_key = "op://Personal/Perplexity/credential"  # 1Password CLI

[providers.openai]
type = "openai_compat"
host = "https://api.openai.com/v1"
api_key = "keychain://openai/apikey"  # macOS Keychain

[providers.ollama]
type = "openai_compat"
host = "http://localhost:11434/v1"  # no key needed
```
Supported secret backends: 1Password CLI (`op://`), macOS Keychain (`keychain://`), and environment variables (`env:`).
## This isn't a hack — it's how sub-agents are supposed to work
Anthropic built a genuinely good sub-agent architecture into Claude Code. Sub-agents get their own context, their own persona, and their own mission — completely separated from the parent. Dragoman is just one example of what that architecture makes possible with very little code.
You don't need a plugin system that doesn't exist yet. You don't need to reverse-engineer anything. And when Anthropic improves their sub-agent runtime — better fan-out, richer permissions, longer context — Dragoman gets better for free.
## What it writes

| What | Where | Created by | Removed by |
|---|---|---|---|
| Provider config | `~/.config/dragoman/config.toml` | `dragoman init` | `dragoman uninstall --purge-config` |
| Sub-agent files | `~/.claude/agents/dragoman/` | `dragoman init` | `dragoman uninstall` |
| Persona block | `~/.claude/CLAUDE.md` (inline, between `<!-- BEGIN DRAGOMAN -->` markers) | `dragoman init` | `dragoman uninstall` |
Clean removal: `dragoman uninstall` reverses everything. Add `--purge-config` to also delete your provider config.
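The persona block uses HTML comment markers, so uninstall can strip it without touching the rest of `CLAUDE.md`. A sketch of that step, assuming a matching `<!-- END DRAGOMAN -->` closing marker (only the BEGIN marker is named above):

```python
import re

# Matches the whole marker-delimited block, including the markers themselves.
# The END marker is an assumption; the README only names the BEGIN marker.
BLOCK = re.compile(
    r"\n?<!-- BEGIN DRAGOMAN -->.*?<!-- END DRAGOMAN -->\n?",
    re.DOTALL,
)

def remove_persona_block(claude_md: str) -> str:
    """Strip the inline Dragoman block, leaving the rest of CLAUDE.md untouched."""
    return BLOCK.sub("\n", claude_md)

before = "# My rules\n<!-- BEGIN DRAGOMAN -->\npersona...\n<!-- END DRAGOMAN -->\nOther notes\n"
print(remove_persona_block(before))
```

Marker-delimited blocks like this are a common convention for tools that edit user-owned files: the tool can update or remove its own section idempotently while everything outside the markers belongs to the user.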
## Platform support

| Platform | Status |
|---|---|
| macOS | ✅ Fully supported |
| Linux | ✅ Works — `keychain://` is macOS-only, but `op://` and `env:` work everywhere |
| Windows | ❓ Untested — PRs welcome |
## Telemetry
None. Dragoman phones home to no one; the only outbound calls it makes are to the provider endpoints you configured.
## Contributing
See `CONTRIBUTING.md`. Small PRs welcome; commits need DCO sign-off (`git commit -s`). The Contributor Covenant applies.
## License
Apache 2.0 — see `LICENSE`.
Dragoman was the translator-fixer at Ottoman, Levantine, and European courts. The English word and the Hebrew meturgeman share an Akkadian root and have nothing to do with reptiles. The 🐉 emoji is a typo I refuse to fix. There is also no evidence that dragons were actually reptiles.