dragoman
A small CLI that lets Claude Code reach non-Anthropic models — Ollama (local), Perplexity (search-augmented), OpenAI, Gemini, anything OpenAI-compatible — through one verb the existing subagent runtime can call.
I have a GPU running Ollama. I pay for OpenAI, Gemini, and Perplexity because each is the right answer for a different shape of question. Claude Code is the conductor. Dragoman is the verb that lets the conductor talk to the rest of the orchestra.
v0.6.1 alpha. Apache 2.0.
What it does
Two commands. No agent loop. No tool execution. No shell. Just one HTTPS call per ask.
dragoman ask --model perplexity:sonar-pro --prompt "..." # one HTTPS call, prints text
dragoman models # what's configured, one per line
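"One HTTPS call per ask" maps onto a single OpenAI-compatible chat-completions POST. Here is a minimal stdlib-only sketch of that shape — the endpoint path and payload follow the OpenAI-compatible convention, but the function names and key handling are illustrative, not Dragoman's internals:

```python
import json
import urllib.request

def build_chat_request(host, model, prompt, api_key=None):
    """Build the single OpenAI-compatible chat-completions call an `ask` maps to."""
    url = host.rstrip("/") + "/chat/completions"
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

def ask(host, model, prompt, api_key=None):
    """One HTTPS call: send the prompt, return the first choice's text."""
    req = build_chat_request(host, model, prompt, api_key)
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

No retries, no streaming, no loop: the request goes out, text comes back, done.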
The persona injected by dragoman init teaches Claude Code when to spawn a real Task() subagent that uses dragoman ask for the cognitive step. All filesystem and shell work happens through Claude Code's normal tools — the harness's audit, fan-out, and permissions stay intact. Dragoman holds keys; the harness holds the runtime.
Install
Install with Homebrew, or with a Python tool manager like uv or pipx:
# Using Homebrew (macOS/Linux)
brew install asakin/tap/dragoman
# Using uv (Fastest)
uv tool install dragoman-ai
# Using pipx (Standard)
pipx install dragoman-ai
Then, initialize your config and provider keys:
dragoman init
Open a fresh Claude Code session. Try: "What's the best model for [your task]?" If the dragon shows up, it works.
Keys live where you already keep them
API keys can be literal strings, environment variables, or references resolved at call time:
[providers.perplexity_1]
type = "openai_compat"
host = "https://api.perplexity.ai"
api_key = "op://Personal/Perplexity/credential" # 1Password CLI
[providers.groq_1]
type = "openai_compat"
host = "https://api.groq.com/v1"
api_key = "keychain://groq/apikey" # macOS Keychain
Dragoman fetches by reference, uses the key for one HTTPS call, discards it. The key never enters Claude's context.
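The `op://` and `keychain://` schemes above suggest a simple dispatch on the reference prefix. A sketch of what call-time resolution could look like — the `op read` and `security find-generic-password` commands are the real 1Password and macOS Keychain CLIs, but the function itself and the `env://` scheme name are assumptions, not Dragoman's code:

```python
import os
import subprocess

def resolve_api_key(ref):
    """Resolve an api_key value at call time: secret reference, env var, or literal."""
    if ref.startswith("op://"):
        # 1Password CLI: `op read op://vault/item/field`
        return subprocess.run(["op", "read", ref], capture_output=True,
                              text=True, check=True).stdout.strip()
    if ref.startswith("keychain://"):
        # macOS Keychain: keychain://service/account
        service, account = ref.removeprefix("keychain://").split("/", 1)
        return subprocess.run(
            ["security", "find-generic-password", "-s", service, "-a", account, "-w"],
            capture_output=True, text=True, check=True).stdout.strip()
    if ref.startswith("env://"):
        # hypothetical scheme name for the env-var case mentioned above
        return os.environ[ref.removeprefix("env://")]
    return ref  # literal key string
```

The value lives only in the local variable holding the resolved key for the duration of one call, which is what keeps it out of the model's context.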
Multiple providers, multiple accounts
Dragoman replaces hardcoded singleton endpoints with a dynamic provider registry. You can connect as many distinct accounts, gateways, or local instances as you want simultaneously.
For example, if you run Ollama on a laptop and on a separate workstation reachable via Tailscale:
[providers.laptop_1]
type = "openai_compat"
host = "http://localhost:11434/v1"
[providers.workstation_1]
type = "openai_compat"
host = "http://workstation.tailnet.ts.net:11434/v1"
Then you simply tell Claude exactly which pipe to use: --model workstation_1:qwen2.5:72b. No magic network probing; just explicit, unopinionated routing.
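A spec like `workstation_1:qwen2.5:72b` implies the routing split happens on the first colon only, since Ollama tags themselves contain colons. An illustrative sketch of that parse (the names come from the example above; this is not Dragoman's parser):

```python
def split_model_spec(spec):
    """Split 'provider:model' on the FIRST colon only, because model names
    like Ollama's 'qwen2.5:72b' legitimately contain colons themselves."""
    provider, sep, model = spec.partition(":")
    if not sep or not model:
        raise ValueError(f"expected provider:model, got {spec!r}")
    return provider, model
```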
What it writes to your system
| Artifact | Path | Created by | Removed by |
|---|---|---|---|
| Provider config | ~/.config/dragoman/config.toml | dragoman init | dragoman uninstall --purge-config |
| Persona block | ~/.claude/CLAUDE.md (or project) | dragoman init | dragoman uninstall |
| Python package | active env | pip install dragoman-ai | pip uninstall dragoman-ai |
The persona block is bracketed by <!-- dragoman persona ... --> markers; removal is exact and idempotent.
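Bracketing the block with HTML-comment markers makes removal a pure text operation. A sketch of marker-delimited, idempotent removal — the marker strings here are hypothetical stand-ins for the abbreviated `<!-- dragoman persona ... -->` pattern above:

```python
import re

# Hypothetical marker text; the real markers follow the
# "<!-- dragoman persona ... -->" pattern, abbreviated in the README.
BEGIN = "<!-- dragoman persona begin -->"
END = "<!-- dragoman persona end -->"

def remove_persona(text):
    """Delete everything between the markers, inclusive. On a file with no
    persona block this is a no-op, so removal is idempotent."""
    pattern = re.escape(BEGIN) + r".*?" + re.escape(END) + r"\n?"
    return re.sub(pattern, "", text, flags=re.DOTALL)
```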
Telemetry
None. Dragoman makes no outbound calls of its own — only to provider endpoints you configured.
Contributing
See CONTRIBUTING.md. Small PRs welcome; commits need DCO sign-off (git commit -s); Contributor Covenant applies.
License
Apache 2.0. See LICENSE.
Dragoman was the translator-fixer at Ottoman, Levantine, and European courts. The English word and the Hebrew meturgeman share an Akkadian root and have nothing to do with reptiles. The 🐉 emoji is a typo I refuse to fix. There is also no evidence that dragons were actually reptiles.
File details
Details for the file dragoman_ai-0.6.1.tar.gz (source distribution).
- Size: 39.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | 973fdd40eef53bd029be022590100f8cd80552f4db2b2d9fc8401a4ee6ef3eae |
| MD5 | 3f5285ecf1df9627df59497cc45e3741 |
| BLAKE2b-256 | 2e8830a7a8e78c55bbf00a9c12a246d25091447b2b6dbdb710e6cf2dd113f2ec |
File details
Details for the file dragoman_ai-0.6.1-py3-none-any.whl (built distribution).
- Size: 43.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | ffadd89d2460520ff1ed7f752fd01137c7f65f2cfd686edcf89fffb9e4920936 |
| MD5 | 76799acae05a9e3873ad41c04f0db98e |
| BLAKE2b-256 | 8834dc15030c945c15b17fe9cddbfa0e4ba63049ee0f113fe960f435ea42a519 |