coala-cli
======================
Convert any command-line tool into an LLM agent

Overview
Coala is a Python package that provides a standards-based framework for turning command-line tools into reproducible, agent-accessible toolsets that support natural-language interaction.
How the Framework Works
Coala integrates the Common Workflow Language (CWL) with the Model Context Protocol (MCP) to standardize tool execution. This approach allows Large Language Model (LLM) agents to discover and run tools through structured interfaces, while strictly enforcing the containerized environments and deterministic results necessary for reproducible science.
Core Components
- Client Layer: Any MCP-compliant client application (e.g., Claude Desktop, Cursor, or custom interfaces) that utilizes LLMs (such as Gemini, GPT-5, or Claude) to enable natural language interaction.
- Bridge Layer: A local, generic MCP server that acts as a schema translator. Unlike standard MCP servers that require custom Python wrappers for each tool, the bridge layer automatically parses CWL definitions and exposes the CWL-described command-line tools as executable MCP utilities.
- Execution Layer: A standard CWL runner that executes the underlying binaries within containerized environments (Docker). This ensures that analyses are reproducible and isolated from the host system's dependencies.
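For illustration, a CWL description of the kind the bridge layer parses might look like the following. This is a sketch, not a file shipped with coala: the tool (a line counter built on `wc`), the container image, and all field values are hypothetical.

```yaml
# Hypothetical CWL description of a line-counting tool.
cwlVersion: v1.2
class: CommandLineTool
baseCommand: [wc, -l]
requirements:
  DockerRequirement:
    dockerPull: ubuntu:22.04   # the execution layer runs this in Docker
inputs:
  input_file:
    type: File
    inputBinding:
      position: 1              # appended as the first positional argument
outputs:
  line_count:
    type: stdout               # capture standard output as the result
stdout: count.txt
```

From a definition like this, the bridge layer can derive both the MCP tool schema (name, input types) and the exact containerized command to run, with no per-tool Python wrapper.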
Quick Start
- Initialize: Create a local MCP server instance using `mcp_api()`.
- Register: Load your domain-specific tools described in CWL via `add_tool()` (supports local files or repositories).
- Serve: Start the MCP server using `mcp.serve()`.
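The three steps above can be sketched as follows. This is illustrative only: the import path, whether `add_tool()` is a method of the server object or a module-level function, and the argument it takes are all assumptions inferred from the function names above, not the documented API.

```python
# Sketch only: import path and call signatures are assumptions.
from coala import mcp_api  # hypothetical import path

# 1. Initialize: create a local MCP server instance.
mcp = mcp_api()

# 2. Register: load a CWL-described tool (local file or repository).
mcp.add_tool("tools/line_count.cwl")  # hypothetical tool file

# 3. Serve: start exposing the tools to MCP clients.
mcp.serve()
```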
The Workflow
- Interact: The user sends a natural language query to the MCP Client (e.g., Claude Desktop).
- Discover & Select: The Client retrieves the tool list from the MCP server. The LLM selects the appropriate tool and sends a structured request for the analysis.
- Execute: Coala translates this selection into a CWL job and executes it within a container (Docker), ensuring reproducibility.
- Respond: The execution logs and results are returned to the LLM, which interprets them and presents the final answer to the user.
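The Execute step can be made concrete with a small sketch of how a structured tool call might be turned into a CWL job and a runner invocation. Coala's actual internals are not published here; the function name and structure are assumptions, but the mapping shown (LLM arguments become a CWL job file, which `cwltool` executes against the tool definition) is the standard CWL pattern the document describes.

```python
import json
import tempfile


def build_cwl_invocation(tool_cwl: str, arguments: dict,
                         outdir: str = "outputs") -> list[str]:
    """Serialize the agent's structured tool-call arguments into a CWL
    job file and return the cwltool command line that would run it.

    Illustrative sketch: the name and layout are assumptions, not
    coala's real implementation.
    """
    # The LLM's structured request maps directly onto CWL job inputs.
    with tempfile.NamedTemporaryFile("w", suffix=".json",
                                     delete=False) as f:
        json.dump(arguments, f)
        job_path = f.name
    # cwltool executes the tool inside its declared Docker container.
    return ["cwltool", "--outdir", outdir, tool_cwl, job_path]


# Example: the agent selected a (hypothetical) line-counting tool.
cmd = build_cwl_invocation(
    "tools/line_count.cwl",
    {"input_file": {"class": "File", "path": "reads.txt"}},
)
# In practice this list would be passed to subprocess.run(), and the
# captured logs and outputs returned to the LLM for interpretation.
```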
Get Started
Requirements
- Python 3.12 or later
- FastAPI
- Requests
- Pydantic
- Uvicorn
- cwltool
- mcp (Model Context Protocol SDK)
Installation
To install coala-cli from PyPI, run:

```shell
pip install coala-cli
```
Use Cases
Download files
File details
Details for the file coala_cli-0.2.0.tar.gz.
File metadata
- Download URL: coala_cli-0.2.0.tar.gz
- Upload date:
- Size: 13.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `e967e6f75ba678b02955d31624e107d65cda04aef724a6b2bd49d7558c8061c7` |
| MD5 | `a32357028414b4c9c246385501faaf62` |
| BLAKE2b-256 | `ade521c5a0b393c683c0dd1d640517e90905096453d8ce89bcb7b78aca48ac5f` |
File details
Details for the file coala_cli-0.2.0-py3-none-any.whl.
File metadata
- Download URL: coala_cli-0.2.0-py3-none-any.whl
- Upload date:
- Size: 13.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c5d1388537f7647fc742d24c24ea7f7847a96003dd905835cec4dcc0d1fddfc4` |
| MD5 | `79761af019bc75b8c2b91c0fc3880ff4` |
| BLAKE2b-256 | `12e82b7d4fa32af67b1eba1e64961907ef555e39579b3ca2387805d3b0bff551` |