
Vending Machine MCP Server

13 specialist AI agents for your coding assistant. Use them from Claude Code, Codex, or any MCP-compatible client.

Scope

This document describes the **vending-machine-mcp** PyPI package: a standalone stdio MCP server. It ships complete and does not depend on any other repository layout.

Source of truth for tools and agents in this package:

  • vending_machine_mcp/mcp_server.py

Tools (Current Signatures)

| Tool | Signature | What it does |
| --- | --- | --- |
| `list_agents` | `list_agents()` | Lists available agent IDs, models, and descriptions |
| `hire_agent` | `hire_agent(agent_id: str, task: str)` | Starts an async agent job and returns a `job_id` |
| `check_job` | `check_job(job_id: str)` | Polls async job status/results |
| `search_scripts` | `search_scripts(query: str)` | Finds scripts by natural-language query |
| `run` | `run(vending_code: str, user_input: str)` | Runs a marketplace script async; returns a `job_id` |
| `info` | `info(vending_code: str)` | Shows script details, validation info, usage, and similar scripts |
| `run_code` | `run_code(code: str, language: str = "python", packages: str = "")` | Runs Python/JS code in a sandbox |
| `audit_code` | `audit_code(code: str)` | Async workflow: Bug Hunter -> Test Goblin -> Code Gremlin |
| `vector_store_add` | `vector_store_add(collection, text, doc_id="", metadata_json="{}", provider="", model="", dimensions="")` | Embeds text (BYOK) and stores it in local SQLite; optional provider/model/dimensions override |
| `vector_store_search` | `vector_store_search(collection, query, limit=5, provider="", model="", dimensions="")` | Semantic search (auto-detects provider/model from the collection) |
| `vector_store_list` | `vector_store_list(collection="", limit=50)` | Lists collections, or the documents inside one collection |
| `query_knowledge` | `query_knowledge(collection, question, limit=5, provider="", model="", dimensions="")` | RAG: retrieves chunks, then an LLM answers; auto-detects embedding config |

Async Behavior

  • hire_agent, run, and audit_code return a job_id.
  • Use check_job(job_id) to retrieve final output.
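A client-side polling loop for these async tools could look like the sketch below. The `check_job` callable stands in for whatever your MCP client exposes for the `check_job` tool, and the `status`/`result` keys are illustrative assumptions, not the server's documented response shape:

```python
import time

def wait_for_job(check_job, job_id, interval=2.0, timeout=120.0):
    """Poll check_job(job_id) until the job reaches a terminal state.

    `check_job` is whatever callable your MCP client exposes for the
    check_job tool; the 'status' values here are illustrative.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        job = check_job(job_id)
        if job.get("status") in ("completed", "failed"):
            return job
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} did not finish within {timeout}s")
```
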

Agents (Current Registry)

| Agent ID | Name | What it does |
| --- | --- | --- |
| `bug-hunter` | Bug Hunter | Code review, debugging, and security audits |
| `cloud-sensei` | Cloud Sensei | AWS architecture review, troubleshooting, cost optimization |
| `code-gremlin` | Code Gremlin | Code writing, debugging, and refactoring |
| `data-sprite` | Data Sprite | DB schemas, migrations, seed data, ETL scripts |
| `desk-pilot` | Desk Pilot | Admin support: email drafts, SOPs, meeting/task prep |
| `devops-dwarf` | DevOps Dwarf | Docker, Terraform, CI/CD, Kubernetes, deployment assets |
| `embeddings-agent` | Embeddings Agent | Embeds text, persists vectors to the local store, and analyzes similarity. After the job finishes, query with `query_knowledge`. Needs one of: OpenAI, Voyage AI, or Gemini keys (see Requirements) |
| `inbox-zero` | Inbox Zero | Email triage, categorization, prioritization, draft replies |
| `mcp-maker` | MCP Maker | MCP server scaffolding and tool definitions |
| `number-crunch` | Number Crunch | Data analysis, trends, KPI reporting |
| `pdf-forge` | PDF Forge | Structured PDF/report generation |
| `test-goblin` | Test Goblin | Unit/integration/e2e test generation |
| `vibe-writer` | Vibe Writer | Content drafting: posts, newsletters, messaging |

Quick Start

```
pip install vending-machine-mcp
```

Claude Code / Codex / Any MCP Client

```json
{
  "mcpServers": {
    "vending-machine": {
      "command": "vending-machine-mcp"
    }
  }
}
```
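Most tools need API keys (see Requirements). Many MCP clients let you pass them alongside the command via an `env` block; whether and how your client supports this varies, and the key values below are placeholders:

```json
{
  "mcpServers": {
    "vending-machine": {
      "command": "vending-machine-mcp",
      "env": {
        "OPENROUTER_API_KEY": "sk-or-...",
        "DAYTONA_API_KEY": "your-sandbox-key"
      }
    }
  }
}
```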

Usage

```
> hire_agent("bug-hunter", "Review this auth middleware for vulnerabilities: <paste code>")
> check_job("abc123")
> run("S-7K2M", "input payload")
> audit_code("<paste code>")
> vector_store_add("my-kb", "Full note text...", metadata_json='{"source":"notes.md"}')
> vector_store_search("my-kb", "How do we deploy?")
> query_knowledge("my-kb", "Summarize the deployment process")
```

Embed and query flow (embeddings-agent + query_knowledge)

```
> hire_agent("embeddings-agent", "Embed my API docs: <paste text>")
> check_job("job-id")
  -> "Stored 24 chunks in collection `embeddings-api-docs-txt`.
      Query with: query_knowledge(collection='embeddings-api-docs-txt', question='...')"
> query_knowledge("embeddings-api-docs-txt", "How does authentication work?")
  -> "Based on the documents: Authentication uses JWT tokens issued by... [1] [3]"
```
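The "Stored 24 chunks" message implies the agent splits input text before embedding. A simple fixed-size chunker with overlap is sketched below; the agent's actual chunking strategy is internal and may differ:

```python
def chunk_text(text: str, size: int = 800, overlap: int = 100) -> list[str]:
    """Split text into overlapping fixed-size chunks (illustrative only;
    the embeddings-agent's real chunking strategy may differ)."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    chunks = []
    step = size - overlap  # advance by size minus overlap each step
    for start in range(0, len(text), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
        if start + size >= len(text):
            break
    return chunks
```
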

What works out of the box

After pip install and pointing your MCP client at vending-machine-mcp, you do not need a separate database or vector service. Several tools are useful with no API keys at all:

  • **list_agents** — lists agents (starting a run still needs OpenRouter; see Requirements).
  • **search_scripts** — natural-language search over bundled marketplace scripts using a local SQLite index and deterministic local embeddings when no embedding provider key is set. Results are real and ranked; quality is better if you set **VOYAGE_API_KEY** (or OpenAI / Gemini — same priority as below).
  • **info** — script details from the in-memory store; similar scripts / chain hints use the same SQLite index when available.
  • **vector_store_add / vector_store_search / vector_store_list** — personal collections in another local SQLite file; same local hash fallback without keys.
  • **query_knowledge** — RAG over any collection: retrieves context chunks, then asks an LLM to answer. Requires at least one LLM key (**OPENROUTER_API_KEY**, **OPENAI_API_KEY**, or **GEMINI_API_KEY**) for the answer step; the embedding search itself works with the local hash when no embedding key is set.

Anything that calls models (hire_agent, check_job results, audit_code) needs **OPENROUTER_API_KEY**. Anything that runs code (run, run_code, sandbox paths inside agents) needs a sandbox key (**DAYTONA_API_KEY** or **E2B_API_KEY**). The **embeddings-agent** expects at least one embedding provider key (see Requirements). No extra setup is needed beyond env vars and MCP config.
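The "deterministic local hash" fallback mentioned above means embeddings can be produced without any API key. The package's actual implementation is not shown here; one common way such a key-free embedding can work is to hash tokens into a fixed-size vector:

```python
import hashlib
import math

def local_hash_embedding(text: str, dimensions: int = 256) -> list[float]:
    """Deterministic, key-free pseudo-embedding (a sketch of the idea,
    not the package's actual code): hash each word token to a bucket,
    add a signed count, then L2-normalize the vector."""
    vec = [0.0] * dimensions
    for token in text.lower().split():
        digest = hashlib.sha256(token.encode("utf-8")).digest()
        index = int.from_bytes(digest[:4], "big") % dimensions
        sign = 1.0 if digest[4] % 2 == 0 else -1.0
        vec[index] += sign
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]
```

Because the hash is deterministic, the same text always maps to the same vector, so indexing and querying stay consistent without external services.
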

Marketplace script search (search_scripts)

  • Implemented in vending_machine_mcp/graph.py: a SQLite script index with semantic search over embedded script text. No external graph or vector database service is required.
  • Default DB: ~/.local/share/vending-machine-mcp/script_index.sqlite3 (or $XDG_DATA_HOME/vending-machine-mcp/…). Override with **SCRIPT_INDEX_DB_PATH**.
  • Embeddings for indexing and queries use Voyage voyage-code-3 (document/query input types) when **VOYAGE_API_KEY** is set; otherwise the fallback order is OpenAI → Gemini → deterministic local hash, the same idea as the vector store. Only rows whose stored vector length matches the query embedding are ranked, so avoid mixing providers without re-seeding or clearing the DB.
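The vector-length filter described above can be sketched as plain cosine ranking that simply skips rows whose stored dimensions differ from the query's. This is an illustration of the behavior, not the package's code:

```python
import math

def rank_by_cosine(query_vec, rows, limit=5):
    """Rank (doc_id, vector) rows by cosine similarity to query_vec,
    skipping rows whose stored vector length differs from the query's
    (rows embedded by a different provider are simply not ranked)."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

    scored = [
        (doc_id, cosine(query_vec, vec))
        for doc_id, vec in rows
        if len(vec) == len(query_vec)  # dimension match required
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:limit]
```
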

Local vector store

  • Default DB path: ~/.local/share/vending-machine-mcp/vector_store.sqlite3 (or $XDG_DATA_HOME/vending-machine-mcp/…). Override with **VECTOR_STORE_DB**.
  • All vector-store tools use the same embedding key priority: Voyage → OpenAI → Gemini → deterministic local hash. Override with **EMBEDDING_PROVIDER** / **EMBEDDING_MODEL** / **EMBEDDING_DIMENSIONS** env vars, or pass provider/model/dimensions directly to each tool call.
  • The provider and model used are stored per-document. When searching or querying, the tool auto-detects which provider/model the collection uses and embeds the query the same way.
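The per-document auto-detect described above can be sketched with SQLite: store the provider and model alongside each document, and read them back from the collection before embedding a query the same way. Table and column names here are illustrative assumptions, not the package's schema:

```python
import sqlite3

# Illustrative schema: provider/model/dimensions stored per document.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE documents (
        collection TEXT, doc_id TEXT, text TEXT,
        provider TEXT, model TEXT, dimensions INTEGER
    )"""
)
conn.execute(
    "INSERT INTO documents VALUES (?, ?, ?, ?, ?, ?)",
    ("my-kb", "d1", "Full note text...", "openai", "text-embedding-3-small", 1536),
)

def detect_embedding_config(conn, collection):
    """Return (provider, model, dimensions) used by a collection, or None."""
    return conn.execute(
        "SELECT provider, model, dimensions FROM documents "
        "WHERE collection = ? LIMIT 1",
        (collection,),
    ).fetchone()
```
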

OPENAI_API_KEY: what it does (and what it does not)

OPENAI_API_KEY is for OpenAI’s Embeddings API (text-embedding-3-small) wherever the package needs vectors. It is not a substitute for **OPENROUTER_API_KEY**, which is what runs the LLM inside agents (including **embeddings-agent**).

With **OPENAI_API_KEY** set (and no **VOYAGE_API_KEY**, which takes priority):

  • **search_scripts** — better semantic search over the marketplace script index (same key used when seeding/indexing scripts).
  • **vector_store_add / vector_store_search** — embeddings for your personal collections (plus **vector_store_list** to inspect them).

There is no MCP tool that runs raw SQL. Your chat app talks to the two local SQLite files only through tools: use **search_scripts** for bundled scripts, and **vector_store_search** / **query_knowledge** / **vector_store_list** for data you added with **vector_store_add** or the **embeddings-agent**.

**embeddings-agent**: call **hire_agent("embeddings-agent", …)**, then **check_job**. The agent persists all chunks and vectors into the vector store under a named collection. Use **query_knowledge(collection, question)** afterward to ask questions about that data. The agent graph needs **OPENROUTER_API_KEY**; the embedding steps need at least one embedding key (**OPENAI_API_KEY**, Voyage, or Gemini); **query_knowledge** needs at least one LLM key (OpenRouter, OpenAI, or Gemini) for the answer step.

Requirements

  • Python 3.11+
  • **OPENROUTER_API_KEY** — required to run agents (hire_agent, audit_code, and any tool that executes an LLM workflow). It is not needed merely to list agents or to use search and the vector store without agents.
  • Sandbox — required for **run**, **run_code**, and code execution inside agents: set **DAYTONA_API_KEY** or **E2B_API_KEY**. The server picks a provider when both exist; set **SANDBOX_PROVIDER=daytona** or **SANDBOX_PROVIDER=e2b** to force one.
  • **embeddings-agent** — same as other agents: **OPENROUTER_API_KEY** is required to run it via **hire_agent**. For embedding operations inside that agent, set at least one of **OPENAI_API_KEY**, **VOYAGE_API_KEY**, or **GEMINI_API_KEY** (**GOOGLE_API_KEY** is an alias for Gemini).
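The sandbox selection rules above can be sketched as a small env-driven check. An explicit **SANDBOX_PROVIDER** wins; otherwise whichever key is present is used. The tie-break when both keys exist is not documented, so the Daytona-first order here is an assumption:

```python
import os

def pick_sandbox_provider(env=None):
    """Sketch of the sandbox selection described above (illustrative;
    the Daytona-first tie-break when both keys exist is an assumption)."""
    env = os.environ if env is None else env
    forced = env.get("SANDBOX_PROVIDER", "").lower()
    if forced in ("daytona", "e2b"):
        return forced  # explicit override wins
    if env.get("DAYTONA_API_KEY"):
        return "daytona"
    if env.get("E2B_API_KEY"):
        return "e2b"
    return None  # no sandbox configured: run/run_code unavailable
```
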

The same embedding keys are optional for better semantic search and vector-store quality without using the **embeddings-agent**; everything still works without them. Priority is Voyage → OpenAI → Gemini → local hash (see the sections above).

Environment variables can be set in the shell that launches the MCP server or in a **.env** file in the server’s working directory (the package loads it on startup).
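The effect of the `.env` loading described above can be sketched with a minimal loader. The package likely uses a library such as python-dotenv; this illustrates the behavior, not its code:

```python
import os

def load_dotenv_file(path=".env"):
    """Minimal .env loader sketch (the package likely uses python-dotenv
    or similar). Existing environment variables are not overridden."""
    if not os.path.exists(path):
        return
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, malformed lines
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip().strip('"'))
```
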

Download files

Source distribution: **vending_machine_mcp-0.2.1.tar.gz** (108.2 kB)

Built distribution: **vending_machine_mcp-0.2.1-py3-none-any.whl** (122.4 kB)

File details

Details for the file vending_machine_mcp-0.2.1.tar.gz:

  • Size: 108.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.1

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 4ae9d0b9f02c8c41864fe1d57f46501a263ea608d28132df65e5c4f88a752006 |
| MD5 | 5bd92280e9e5f3c6542148d1c4d2146e |
| BLAKE2b-256 | 2a3b634ee31ab1957cfd5cf06c31d27f50b816d505e7703f011bdb75577020d7 |

File details

Details for the file vending_machine_mcp-0.2.1-py3-none-any.whl.

File metadata

File hashes

Hashes for vending_machine_mcp-0.2.1-py3-none-any.whl
Algorithm Hash digest
SHA256 1b86adeab38d7937f60268edb82cba2f41766b638eba5a22ffbfdb3659b1439e
MD5 e2483b49fe7ccf996117b05c9a3cfee8
BLAKE2b-256 ba1fa916af2e221943ead0c7abce7974f498a5624d9c2e394258188843c90906

See more details on using hashes here.
