pyro-mcp
Curated Linux environments for ephemeral Firecracker-backed VM execution.
pyro-mcp runs commands inside ephemeral Firecracker microVMs using curated Linux environments such as debian:12.
It exposes the same runtime in three public forms:
- the pyro CLI
- the Python SDK, via from pyro_mcp import Pyro
- an MCP server, so LLM clients can call VM tools directly
Start Here
- Install: docs/install.md
- Host requirements: docs/host-requirements.md
- Integration targets: docs/integrations.md
- Public contract: docs/public-contract.md
- Troubleshooting: docs/troubleshooting.md
Public UX
Primary install/run path:
uvx --from pyro-mcp pyro mcp serve
Installed package path:
pyro mcp serve
The public user-facing interfaces are the pyro CLI and the Pyro SDK class.
Makefile targets are contributor conveniences for this repository and are not the primary product UX.
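For MCP clients that take a JSON server configuration, the uvx launch path above typically maps to an entry like the following. The exact top-level shape varies by client; this sketch mirrors the common mcpServers convention, and examples/mcp_client_config.md is the authoritative reference:

```json
{
  "mcpServers": {
    "pyro": {
      "command": "uvx",
      "args": ["--from", "pyro-mcp", "pyro", "mcp", "serve"]
    }
  }
}
```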
Official Environments
Current official environments in the shipped catalog:
- debian:12
- debian:12-base
- debian:12-build
The package ships the embedded Firecracker runtime and a package-controlled environment catalog.
Official environments are pulled as OCI artifacts from public Docker Hub repositories into a local
cache on first use or through pyro env pull.
End users do not need registry credentials to pull or run official environments.
CLI
List available environments:
pyro env list
Prefetch one environment:
pyro env pull debian:12
Run one command in an ephemeral VM:
pyro run debian:12 --vcpu-count 1 --mem-mib 1024 -- git --version
Run with outbound internet enabled:
pyro run debian:12 --vcpu-count 1 --mem-mib 1024 --network -- \
"git clone --depth 1 https://github.com/octocat/Hello-World.git hello-world && git -C hello-world rev-parse --is-inside-work-tree"
Show runtime and host diagnostics:
pyro doctor
Run the deterministic demo:
pyro demo
pyro demo --network
Run the Ollama demo:
ollama serve
ollama pull llama3.2:3b
pyro demo ollama
Python SDK
from pyro_mcp import Pyro
pyro = Pyro()
result = pyro.run_in_vm(
environment="debian:12",
command="git --version",
vcpu_count=1,
mem_mib=1024,
timeout_seconds=30,
network=False,
)
print(result["stdout"])
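For scripted use, a thin retry wrapper around run_in_vm can smooth over transient failures. This is a sketch, not part of the pyro_mcp API: the Pyro call shape is taken from the example above, and the stub class below only stands in for a real Pyro instance so the snippet is self-contained.

```python
import time

def run_with_retry(pyro, environment, command, attempts=3, delay_s=2.0, **kwargs):
    """Call pyro.run_in_vm, retrying on exceptions up to `attempts` times."""
    last_exc = None
    for i in range(attempts):
        try:
            return pyro.run_in_vm(environment=environment, command=command, **kwargs)
        except Exception as exc:  # real code would catch a narrower error type
            last_exc = exc
            if i < attempts - 1:
                time.sleep(delay_s)
    raise last_exc

# Stub standing in for pyro_mcp.Pyro so the sketch runs anywhere.
class _StubPyro:
    def __init__(self):
        self.calls = 0

    def run_in_vm(self, environment, command, **kwargs):
        self.calls += 1
        if self.calls < 2:  # first attempt fails, second succeeds
            raise RuntimeError("transient boot failure")
        return {"stdout": "git version 2.39.2\n"}

result = run_with_retry(_StubPyro(), "debian:12", "git --version", delay_s=0.0)
print(result["stdout"])
```

In real code you would pass a Pyro() instance instead of the stub.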
Lower-level lifecycle control remains available:
from pyro_mcp import Pyro
pyro = Pyro()
created = pyro.create_vm(
environment="debian:12",
vcpu_count=1,
mem_mib=1024,
ttl_seconds=600,
network=True,
)
vm_id = created["vm_id"]
pyro.start_vm(vm_id)
result = pyro.exec_vm(vm_id, command="git --version", timeout_seconds=30)
print(result["stdout"])
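When using the lifecycle API directly, it is easy to leak VMs if an exec raises before cleanup. A context manager can guarantee teardown. Note the hedges: stop_vm and delete_vm are assumed SDK counterparts of the vm_stop/vm_delete MCP tools listed below, and the stub class only stands in for a real Pyro instance so the sketch is runnable anywhere.

```python
from contextlib import contextmanager

@contextmanager
def ephemeral_vm(pyro, environment, **create_kwargs):
    """Create and start a VM, then stop and delete it even if the body raises.

    stop_vm/delete_vm are assumptions mirroring the vm_stop/vm_delete tools.
    """
    created = pyro.create_vm(environment=environment, **create_kwargs)
    vm_id = created["vm_id"]
    try:
        pyro.start_vm(vm_id)
        yield vm_id
    finally:
        pyro.stop_vm(vm_id)
        pyro.delete_vm(vm_id)

# Stub standing in for pyro_mcp.Pyro so the sketch is self-contained.
class _StubPyro:
    def __init__(self):
        self.log = []

    def create_vm(self, environment, **kwargs):
        self.log.append("create")
        return {"vm_id": "vm-123"}

    def start_vm(self, vm_id):
        self.log.append("start")

    def exec_vm(self, vm_id, command, timeout_seconds=30):
        self.log.append("exec")
        return {"stdout": "ok\n"}

    def stop_vm(self, vm_id):
        self.log.append("stop")

    def delete_vm(self, vm_id):
        self.log.append("delete")

pyro = _StubPyro()
with ephemeral_vm(pyro, "debian:12", vcpu_count=1, mem_mib=1024) as vm_id:
    pyro.exec_vm(vm_id, command="git --version")
print(pyro.log)  # create, start, exec, stop, delete in order
```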
Environment management is also available through the SDK:
from pyro_mcp import Pyro
pyro = Pyro()
print(pyro.list_environments())
print(pyro.inspect_environment("debian:12"))
MCP Tools
Primary agent-facing tool:
vm_run(environment, command, vcpu_count, mem_mib, timeout_seconds=30, ttl_seconds=600, network=false)
Advanced lifecycle tools:
- vm_list_environments()
- vm_create(environment, vcpu_count, mem_mib, ttl_seconds=600, network=false)
- vm_start(vm_id)
- vm_exec(vm_id, command, timeout_seconds=30)
- vm_stop(vm_id)
- vm_delete(vm_id)
- vm_status(vm_id)
- vm_network_info(vm_id)
- vm_reap_expired()
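As a concrete illustration, an MCP tool call to vm_run with the defaults spelled out might carry a payload like this (the argument names come from the signature above; the values are examples, and the exact envelope depends on the MCP client):

```json
{
  "name": "vm_run",
  "arguments": {
    "environment": "debian:12",
    "command": "git --version",
    "vcpu_count": 1,
    "mem_mib": 1024,
    "timeout_seconds": 30,
    "ttl_seconds": 600,
    "network": false
  }
}
```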
Integration Examples
- Python one-shot SDK example: examples/python_run.py
- Python lifecycle example: examples/python_lifecycle.py
- MCP client config example: examples/mcp_client_config.md
- Claude Desktop MCP config: examples/claude_desktop_mcp_config.json
- Cursor MCP config: examples/cursor_mcp_config.json
- OpenAI Responses API example: examples/openai_responses_vm_run.py
- LangChain wrapper example: examples/langchain_vm_run.py
- Agent-ready vm_run example: examples/agent_vm_run.py
Runtime
The package ships an embedded Linux x86_64 runtime payload with:
- Firecracker
- Jailer
- guest agent
- runtime manifest and diagnostics
No system Firecracker installation is required.
pyro installs curated environments into a local cache and reports their status through pyro env inspect and pyro doctor.
Contributor Workflow
For work inside this repository:
make help
make setup
make check
make dist-check
Contributor runtime source artifacts are still maintained under src/pyro_mcp/runtime_bundle/ and runtime_sources/.
Official environment publication is automated through
.github/workflows/publish-environments.yml.
For a local publish against Docker Hub:
export DOCKERHUB_USERNAME='your-dockerhub-username'
export DOCKERHUB_TOKEN='your-dockerhub-token'
make runtime-materialize
make runtime-publish-official-environments-oci
make runtime-publish-environment-oci auto-exports the OCI layout for the selected
environment if it is missing.
The publisher accepts either DOCKERHUB_USERNAME and DOCKERHUB_TOKEN or
OCI_REGISTRY_USERNAME and OCI_REGISTRY_PASSWORD.
Docker Hub uploads are chunked by default for large rootfs layers; if you need to tune a slow
link, use PYRO_OCI_UPLOAD_TIMEOUT_SECONDS, PYRO_OCI_UPLOAD_CHUNK_SIZE_BYTES, and
PYRO_OCI_REQUEST_TIMEOUT_SECONDS.
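For example, a slow uplink might be tuned with smaller chunks and longer timeouts before re-running the publish target. The values below are illustrative, not defaults:

```shell
# Illustrative tuning for a slow link; adjust to taste.
export PYRO_OCI_UPLOAD_TIMEOUT_SECONDS=600
export PYRO_OCI_UPLOAD_CHUNK_SIZE_BYTES=$((8 * 1024 * 1024))   # 8 MiB per chunk
export PYRO_OCI_REQUEST_TIMEOUT_SECONDS=120
# then re-run the publish target:
# make runtime-publish-official-environments-oci
```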
For a local PyPI publish:
export TWINE_PASSWORD='pypi-...'
make pypi-publish
make pypi-publish defaults TWINE_USERNAME to __token__.
Set PYPI_REPOSITORY_URL=https://test.pypi.org/legacy/ to publish to TestPyPI instead.
File details
Details for the file pyro_mcp-1.0.0.tar.gz (source distribution).
File metadata
- Download URL: pyro_mcp-1.0.0.tar.gz
- Size: 2.1 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2acf5896999b873c5305cffdccc43f8209e8ba3064c2f814b29b0e7665e83844 |
| MD5 | bc8de96d42720eac1744de3d30be85db |
| BLAKE2b-256 | 2bc565171d74022fd1362d3f89924cf54ad3bd9aaf75ede6255c4d04ad262ed4 |
File details
Details for the file pyro_mcp-1.0.0-py3-none-any.whl (built distribution).
File metadata
- Download URL: pyro_mcp-1.0.0-py3-none-any.whl
- Size: 2.1 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2c6117392ddcfc0085cf14f34a1a7cd43fce03d0075fe385eb2e3a0bfb537042 |
| MD5 | de9bef45ca1f70d71159961ea7b0deb9 |
| BLAKE2b-256 | 2482813daec8f40c4cb540389a79ddbf02e6461877a9e55feee73177319deed8 |