# figurify-mcp
Ship-ready MCP server for Gemini + Imagen diagram, infographic, plot, and evaluation generation. One codebase runs on Claude Code / Desktop, ChatGPT Apps SDK, VSCode GitHub Copilot, Cursor, and any MCP-compliant client.
- 5 task tools + 5 meta tools, 2 prompt templates, 3 resources
- Smart model routing (April 2026 Gemini catalog: Gemini 3 Flash/Pro, Nano Banana 2/Pro, Imagen 4 Fast/Ultra)
- Token-budget aware via `search_tools` / `describe_tool` / `recommend_model` / `estimate_cost`
- Stdio + Streamable HTTP transports from one binary
- 44 passing pytest cases, ruff-clean, MIT licensed, Docker-packaged, PyPI-ready
Inspired by llmsresearch/paperbanana; design patterns from jlowin/fastmcp, modelcontextprotocol/servers, github/github-mcp-server, Composio search_tools, and the Speakeasy progressive-disclosure pattern.
## Tool catalog
### Generation (5)
| Tool | Cost recipe | When Claude/GPT/Copilot picks it |
|---|---|---|
| `create_simple_diagram` | 1× Imagen (~$0.04) | "draw X", "sketch Y", "quick picture of Z" |
| `create_infographic` | 1× VLM + 1× Imagen (~$0.12) | "infographic", "poster", "visual summary of X" |
| `create_methodology_diagram` | 1× VLM + N× (Imagen + critique) (~$0.25 at iter=2) | paper methodology text, NeurIPS-style figures |
| `create_statistical_plot` | 1× VLM + matplotlib subprocess (~$0.002) | tabular data → chart |
| `evaluate_diagram` | 1× VLM judge (~$0.015) | "compare / score / judge these two diagrams" |
### Meta / token-savers (5)
| Tool | Purpose |
|---|---|
| `search_tools(query, tags?, limit=5)` | Ranks slim tool cards — client avoids loading every schema upfront |
| `describe_tool(name)` | Full JSON schema for one tool (progressive disclosure) |
| `recommend_model(task, task_kind="auto")` | Maps free-form intent → routed Gemini model + ~USD/call + rationale |
| `estimate_cost(tool_name, iterations=2)` | Breakdown of VLM + image calls before you pay |
| `list_available_models()` | Full Gemini + Imagen catalog with cost / latency tiers |
### Prompts (slash-command candidates)
- `draft_methodology_figure(methodology_text, caption)`
- `plan_visual_output(user_request)`
### Resources (static, readable without tool calls)
- `figurify://styles/methodology_guidelines`
- `figurify://styles/infographic_guidelines`
- `figurify://catalog/models`
All generation tools return [ImageContent, TextContent] MCP content blocks so images render inline in Claude, ChatGPT, Copilot, Cursor — no base64 blobs leaking into prompts.
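For orientation, here is a minimal sketch of that return contract. The tool body, parameter name, and PNG bytes are illustrative — this is not the actual server.py, just the shape of what every generation tool returns:

```python
import base64

from fastmcp import FastMCP
from mcp.types import ImageContent, TextContent

mcp = FastMCP("figurify")

@mcp.tool()
def create_simple_diagram(prompt: str) -> list[ImageContent | TextContent]:
    # The Imagen call is elided — pretend these are the rendered PNG bytes.
    png_bytes = b"\x89PNG\r\n\x1a\n"
    return [
        ImageContent(type="image", data=base64.b64encode(png_bytes).decode(), mimeType="image/png"),
        TextContent(type="text", text=f"Diagram generated for: {prompt!r}"),
    ]
```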
## Smart model routing
```python
# figurify_mcp/models.py
pick_image_model("simple")        # → gemini-3.1-flash-image-preview (Nano Banana 2)
pick_image_model("simple", "4K")  # → gemini-3-pro-image-preview (auto-upgrade)
pick_image_model("detailed")      # → gemini-3-pro-image-preview (Nano Banana Pro)
pick_image_model("fast_draft")    # → imagen-4.0-fast-generate-001
pick_image_model("photoreal")     # → imagen-4.0-ultra-generate-001

pick_vlm_model("reasoning")       # → gemini-3.1-pro-preview
pick_vlm_model("cheap_text")      # → gemini-3.1-flash-lite-preview
pick_vlm_model("critic")          # → gemini-3-flash
```
Pin a model globally via `FIGURIFY_VLM_MODEL` / `FIGURIFY_IMAGE_MODEL` — env overrides always win.
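Conceptually the routing is just a lookup plus the env override. A rough sketch of the idea, built only from the mappings shown above (not the actual `models.py` internals — the real registry also carries cost and latency tiers):

```python
import os

# Mapping implied by the examples above; illustrative only.
_IMAGE_ROUTES = {
    ("simple", None): "gemini-3.1-flash-image-preview",
    ("simple", "4K"): "gemini-3-pro-image-preview",
    ("detailed", None): "gemini-3-pro-image-preview",
    ("fast_draft", None): "imagen-4.0-fast-generate-001",
    ("photoreal", None): "imagen-4.0-ultra-generate-001",
}

def pick_image_model(kind: str, resolution: str | None = None) -> str:
    pinned = os.environ.get("FIGURIFY_IMAGE_MODEL")
    if pinned:
        return pinned  # env override always wins
    # Fall back to the base route for the kind if the exact pair isn't listed.
    return _IMAGE_ROUTES.get((kind, resolution)) or _IMAGE_ROUTES[(kind, None)]
```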
## Installation
```bash
pip install figurify-mcp
# or
uv pip install figurify-mcp
```
Get a free API key at aistudio.google.com/apikey.
## Host configs
### Claude Desktop
Edit `~/Library/Application Support/Claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "figurify": {
      "command": "uvx",
      "args": ["figurify-mcp"],
      "env": { "GOOGLE_API_KEY": "your-key-here" }
    }
  }
}
```
Restart Claude Desktop — the 10 tools appear automatically.
### Claude Code / Cursor
```bash
claude mcp add figurify -s user \
  -e GOOGLE_API_KEY=$GOOGLE_API_KEY \
  -- uvx figurify-mcp
```
### VSCode GitHub Copilot — `.vscode/mcp.json`
```json
{
  "servers": {
    "figurify": {
      "command": "uvx",
      "args": ["figurify-mcp"],
      "env": { "GOOGLE_API_KEY": "your-key-here" }
    }
  }
}
```
### ChatGPT Apps SDK / remote HTTP
```bash
docker run --rm -p 8765:8765 \
  -e GOOGLE_API_KEY=$GOOGLE_API_KEY \
  ghcr.io/genaimanoj/figurify-mcp
# then register https://your-host/mcp in the Apps SDK console
```
Why `uvx`? It downloads, caches, and runs the package in one command — no venv, no manual install. The first run takes ~5 s; every subsequent run is instant from cache.
## Transports
```bash
figurify-mcp --transport stdio                             # Claude Desktop / Code / Cursor (default)
figurify-mcp --transport http --host 0.0.0.0 --port 8765   # ChatGPT / VSCode remote / web
figurify-mcp --transport sse --port 8765                   # legacy (deprecated per MCP 2025-11-25)
```
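A quick way to sanity-check a running HTTP instance from Python — a sketch using the FastMCP client, assuming the server from the Docker command above is listening on `localhost:8765`:

```python
import asyncio

from fastmcp import Client

async def main() -> None:
    async with Client("http://localhost:8765/mcp") as client:
        tools = await client.list_tools()
        print(sorted(t.name for t in tools))  # expect the 10 figurify tools

asyncio.run(main())
```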
## Dev setup
```bash
git clone https://github.com/genaimanoj/figurify-mcp
cd figurify-mcp
uv venv && source .venv/bin/activate
uv pip install -e '.[dev]'
cp .env.example .env   # set GOOGLE_API_KEY
pre-commit install
pytest -q              # 44 tests, ~2s
```
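The test suite is hermetic (no network). A representative check might look like this — a sketch only, which assumes the `image_content` helper in `figurify_mcp/utils.py` wraps raw PNG bytes into an `ImageContent` block; the real tests may differ:

```python
from figurify_mcp.utils import image_content

def test_image_content_wraps_png_bytes():
    block = image_content(b"\x89PNG\r\n\x1a\n" + b"\x00" * 16)
    assert block.type == "image"
    assert block.mimeType == "image/png"
    assert block.data  # base64 payload present
```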
## How token budget is saved
A host connecting to many MCP servers pays context tokens for every tool schema upfront. This server offers three levels of progressive disclosure:
- List-only mode — clients that only want one endpoint load `search_tools` and nothing else. When a user asks something, the client calls `search_tools("intent")` and receives slim cards `{name, score, description, tags, required_params}`. The full schema is fetched only for the chosen tool via `describe_tool(name)`.
- Cost-aware routing — before committing to a generation, clients call `estimate_cost(tool_name)` to see a per-model breakdown, or `recommend_model(intent)` to pick the cheapest viable model.
- Prompts & resources — common request patterns (methodology diagram, visual routing) are exposed as MCP Prompts, so hosts surface them as slash commands without the client needing to compose templates.
Measured impact: a fresh host cold-start that lists search_tools alone loads ~0.5 KB of schema instead of ~6 KB for the full catalog — ~90% upfront reduction, with no loss of functionality.
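Wiring the levels together from a client looks roughly like this — a sketch with the FastMCP client over HTTP; the `.data` indexing assumes `search_tools` returns the slim-card list described above, so adjust to your client's result accessor if needed:

```python
import asyncio

from fastmcp import Client

async def route(intent: str):
    async with Client("http://localhost:8765/mcp") as client:
        # 1. Slim cards only — no full schemas loaded into context yet.
        cards = await client.call_tool("search_tools", {"query": intent, "limit": 3})
        best = cards.data[0]["name"]
        # 2. Progressive disclosure: fetch the full JSON schema for the winner only.
        schema = await client.call_tool("describe_tool", {"name": best})
        # 3. Know the bill before generating anything.
        cost = await client.call_tool("estimate_cost", {"tool_name": best})
        return best, schema.data, cost.data

print(asyncio.run(route("visual summary of our Q3 funnel")))
```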
## Design decisions
- FastMCP 2.x over the raw SDK — 30% less boilerplate, built-in transports, prompts + resources primitives.
- MCP content blocks (`ImageContent` + `TextContent`) — render inline in every compliant host. Base64-in-a-dict is a common anti-pattern that fails silently in ChatGPT / Claude Desktop.
- `fastmcp.exceptions.ToolError` — surfaces as `isError: true` per the MCP spec, so hosts show the user a real error instead of interpreting the exception string as output.
- `pydantic-settings` + `.env` — config is ordinary env vars; `Settings.ensure_ready()` fails fast at startup in stdio mode.
- Typed params via `Annotated[..., Field(description=...)]` + `Literal[...]` — hosts render dropdowns and range validators automatically (see the sketch after this list).
- Tags on every tool (`image`, `academic`, `fast`, `meta`, `routing`, …) — lets `search_tools` and host UIs filter.
- Docstring convention — every tool starts with **Use this when…** / **Do NOT use…** so LLM hosts route intent correctly.
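Put together, a tool definition following these conventions looks roughly like this — a sketch, not the actual server.py, and the parameter set is illustrative:

```python
from typing import Annotated, Literal

from pydantic import Field
from fastmcp import FastMCP
from fastmcp.exceptions import ToolError

mcp = FastMCP("figurify")

@mcp.tool(tags={"image", "fast"})
def create_simple_diagram(
    prompt: Annotated[str, Field(description="What to draw, in plain language")],
    aspect_ratio: Literal["1:1", "16:9", "4:3"] = "16:9",
) -> str:
    """**Use this when** the user asks to draw or sketch something quickly.
    **Do NOT use** for multi-panel methodology figures."""
    if not prompt.strip():
        raise ToolError("prompt must not be empty")  # → isError: true on the wire
    ...  # generation body elided
```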
## Layout
```
FigurifyMCP/
├── pyproject.toml               # PyPI metadata, ruff + mypy + pytest configs
├── Dockerfile                   # non-root, HTTP transport on :8765/mcp
├── LICENSE                      # MIT
├── CHANGELOG.md, CONTRIBUTING.md, SECURITY.md
├── .github/workflows/ci.yml     # ruff + pytest on py3.10/3.11/3.12 + trusted PyPI publish
├── figurify_mcp/
│   ├── server.py                # FastMCP app, CLI, 10 tools + 2 prompts + 3 resources
│   ├── meta.py                  # search_tools / describe_tool / recommend_model / estimate_cost
│   ├── models.py                # Gemini registry + pick_vlm_model / pick_image_model
│   ├── config.py                # pydantic-settings
│   ├── prompts.py               # embedded VLM prompt templates
│   ├── utils.py                 # image_content, text_content, save_png, subprocess runner
│   ├── clients/
│   │   ├── imagen.py            # google-genai image gen wrapper
│   │   └── gemini_vlm.py        # google-genai text + vision wrapper
│   └── tools/
│       ├── simple.py
│       ├── infographic.py
│       ├── methodology.py
│       ├── plot.py
│       └── evaluate.py
└── tests/                       # 44 hermetic tests — no network
    ├── test_registry.py
    ├── test_meta.py
    ├── test_server_contract.py
    └── test_utils.py
```
## CI / release
- `.github/workflows/ci.yml` — ruff lint + format check + pytest on 3.10 / 3.11 / 3.12.
- Pushing a `v*` tag triggers the `publish` job, which builds wheels and uploads to PyPI via trusted publishing.
- The Dockerfile builds a non-root image; the transport is adjustable via `--transport` at container runtime.
## License
MIT. See LICENSE.