# Text2MCP

A toolkit for generating MCP (Model Context Protocol) services from natural language descriptions.
## Key Features

- Generate complete MCP service code from natural language descriptions
- Support for OpenAI and any LLM provider with an OpenAI-compatible API
- One-click deployment and launch of generated MCP services
- Integrated dependency management with uv for efficient package installation
- Custom template creation and reuse for consistent code generation
- Full command-line interface and Python API for all functionality
- Complete lifecycle management for MCP services
## Installation

```bash
pip install text2mcp
```
## Quick Start

### Configuration

First, set up your environment variables:

```bash
# Set environment variables
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_MODEL="gpt-3.5-turbo"                 # Optional
export OPENAI_BASE_URL="https://api.example.com/v1" # Optional, for compatible APIs
```
Or pass the configuration directly in code:

```python
from text2mcp import CodeGenerator

# Pass configuration directly
generator = CodeGenerator(
    api_key="your_api_key_here",
    model="gpt-3.5-turbo",
    base_url="https://api.example.com/v1"  # Optional, for compatible APIs
)
```
Or configure via the command line:

```bash
text2mcp config --api-key "your_api_key" --model "gpt-4" --base-url "https://api.example.com/v1"
```
Tip: Text2MCP supports any LLM service with an OpenAI-compatible interface. See the Using Third-Party OpenAI Compatible APIs section for details.
### Generating MCP Services

#### Using the Python API

```python
from text2mcp import CodeGenerator

# Initialize the code generator
generator = CodeGenerator()

# Generate a service
code = generator.generate("Create a calculator service that supports addition, subtraction, multiplication, and division")

# Save to file
generator.save_to_file(code, "calculator_service.py")
```
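The exact output depends on the model and template, but the generated file typically follows the template structure described under Advanced Usage. A rough, hypothetical sketch of what `calculator_service.py` might contain (the tool names and the final `mcp.run()` call are assumptions, not guaranteed output):

```python
# Hypothetical example of generated output; actual code depends on the model
# and template used.
import argparse
import logging

from mcp.server import FastMCP  # same import style as the default template

mcp = FastMCP("calculator_service")
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


@mcp.tool()
async def add(a: float, b: float) -> float:
    """Return the sum of a and b."""
    return a + b


@mcp.tool()
async def divide(a: float, b: float) -> float:
    """Return a divided by b."""
    if b == 0:
        raise ValueError("Division by zero")
    return a / b

# (subtract and multiply tools omitted for brevity)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--host", default="127.0.0.1")
    parser.add_argument("--port", type=int, default=8000)
    args = parser.parse_args()
    # How the service is started depends on the generated code and the MCP SDK
    # version; mcp.run() uses the SDK's default transport.
    mcp.run()
```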
#### Using Custom Templates

```python
from text2mcp import CodeGenerator

# Initialize the code generator
generator = CodeGenerator()

# Generate a service using a custom template
code = generator.generate(
    "Create a database query service that supports CRUD operations",
    template_file="my_template.md"
)

# Save to a specified directory
generator.save_to_file(code, "db_service.py", directory="./services")
```
### Running MCP Services

```python
from text2mcp import ServiceRunner

# Initialize the service runner
runner = ServiceRunner()

# Start the service
runner.start_service("calculator_service.py")
```
### Installing Dependencies

```python
import asyncio

from text2mcp import PackageInstaller

async def install_deps():
    # Install a single package
    await PackageInstaller.install(package="requests")

    # Install multiple packages
    await PackageInstaller.install(packages=["numpy", "pandas"])

    # Install from a requirements file
    await PackageInstaller.install(requirements="requirements.txt")

# Run the installation
asyncio.run(install_deps())
```
## Command-Line Usage

### Generating Services

```bash
# Basic usage
text2mcp generate "Create a calculator service that supports addition, subtraction, multiplication, and division" --output calculator_service.py

# Specify an output directory
text2mcp generate "Create a weather query service" --output weather_service.py --directory ./services

# Use a custom template
text2mcp generate "Create a data processing service" --template my_template.md --output data_service.py

# Use a custom config file
text2mcp generate "Create a file conversion service" --config my_config.toml --output converter_service.py
```
### Running Services

```bash
# Run a service
text2mcp run calculator_service.py

# Specify host and port
text2mcp run calculator_service.py --host 0.0.0.0 --port 8080

# Use uv for enhanced performance
text2mcp run calculator_service.py --use-uv

# Run the service in the background
text2mcp run calculator_service.py --daemon
```
### Dependency Management

```bash
# Install a single package
text2mcp install requests

# Install multiple packages
text2mcp install numpy pandas matplotlib

# Install from a requirements file
text2mcp install --requirements requirements.txt

# Create a requirements file
text2mcp install --create-requirements --packages numpy,pandas,matplotlib
```
### Configuration Management

```bash
# Set the API key
text2mcp config --api-key "your_api_key"

# Set the model
text2mcp config --model "gpt-4"

# Set a custom API endpoint
text2mcp config --base-url "https://api.example.com/v1"

# Display the current configuration
text2mcp config --show

# Reset the configuration
text2mcp config --reset
```
### Other Commands

```bash
# Check the version
text2mcp --version

# View help
text2mcp --help
text2mcp generate --help
```
## Advanced Usage

### Custom Templates

You can create your own MCP service templates to ensure generated code follows your project's structure and requirements. Templates can be Python files or Markdown files.

#### Markdown Templates (Recommended)

Markdown templates are a more user-friendly format, letting you keep code, documentation, and configuration information in a single file:
````markdown
---
service_name: my_service
description: My custom service
author: Your Name
version: 1.0.0
---

# My MCP Service Template

## Import Section

```python
import argparse
import logging
from mcp.server import FastMCP
# Other imports...
```

## Service Initialization

```python
# Create MCP service
mcp = FastMCP("custom_service")

# Logging configuration
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
```

## MCP Tool Definition

```python
@mcp.tool()
async def example_tool(param: str):
    """
    Example tool function
    :param param: Parameter description
    :return: Return value description
    """
    # Implementation code...
    return result
```

## Main Function

```python
if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--host", default="127.0.0.1")
    parser.add_argument("--port", type=int, default=8000)
    args = parser.parse_args()
    # Start service...
```
````
When using a Markdown template, the system will automatically (a rough sketch of this step follows the list):

1. Extract the YAML front matter metadata (if present)
2. Recognize all Python code blocks and combine them in order
3. Preserve the heading structure as code comments
4. Prioritize "Import"-related sections
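Text2MCP's internal implementation is not shown here, but conceptually the flattening step can be pictured as follows. This is an illustrative sketch only; the function name and regular expressions are assumptions:

```python
import re

FENCE = "`" * 3  # a literal ``` built this way so the example renders cleanly

def flatten_markdown_template(markdown: str) -> str:
    """Illustrative only: collapse a Markdown template into one Python scaffold."""
    # 1. Strip optional YAML front matter delimited by '---' lines.
    body = re.sub(r"\A---\n.*?\n---\n", "", markdown, flags=re.DOTALL)

    # 2. Walk the document, turning headings into comments and collecting the
    #    contents of python-fenced code blocks in document order.
    pattern = rf"^(#{{1,6}}[^\n]+)$|{FENCE}python\n(.*?){FENCE}"
    parts = []
    for heading, code in re.findall(pattern, body, flags=re.DOTALL | re.MULTILINE):
        if heading:
            parts.append("# " + heading.lstrip("# "))
        else:
            parts.append(code.rstrip())

    # (Step 4 above, prioritizing "Import" sections, is omitted here for brevity.)
    return "\n\n".join(parts)
```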
#### Using Templates

Whether you use a Python or a Markdown template, the usage is the same:

```bash
# Using a Markdown template
text2mcp generate "Create a data processing service" --template my_template.md --output data_service.py
```

You can also omit the extension, and the system will automatically find the corresponding file:

```bash
text2mcp generate "Create a data processing service" --template my_template --output data_service.py
```
### Using Third-Party OpenAI Compatible APIs

Text2MCP supports any LLM service that implements the OpenAI API interface specification. Here's an example using a third-party API.

#### Configuration Validation

First, create a `config.toml` file:

```toml
[tool.llm]
api_key = "your-api-key-here"
base_url = "https://api.third-party-provider.com/v1"
model = "third-party-model-name"
```
Then write a simple script to verify that the configuration loads correctly:

```python
from text2mcp.utils.config import load_config

# Load from environment variables
config1 = load_config()
print("Config from environment:", config1)

# Load from the config file
config2 = load_config("config.toml")
print("Config from file:", config2)
```
#### Code Generation Example

Using a third-party API to generate MCP service code:

```python
import os

from text2mcp import CodeGenerator

# Create a code generator with the third-party provider's configuration
generator = CodeGenerator(
    api_key="your-api-key-here",
    base_url="https://api.third-party-provider.com/v1",
    model="third-party-model-name"
)

# Generate a simple calculator service
service_description = """
Create a simple calculator service that supports the four basic operations:
addition, subtraction, multiplication, and division.
Each operation should be a separate API endpoint.
"""

# Define the output directory
output_dir = "generated/calculator_service"
os.makedirs(output_dir, exist_ok=True)

# Generate the code
generated_code = generator.generate(service_description, template_file="example.md")

if generated_code:
    # Save to file
    file_path = generator.save_to_file(generated_code, "calculator_service.py", directory=output_dir)
    if file_path:
        print(f"Service code generated and saved to: {file_path}")
    else:
        print("Error saving code to file")
else:
    print("Code generation failed")
```
Note: When using third-party APIs, make sure the `model` parameter matches a model name supported by that provider.
### Integration with Custom Applications

You can integrate Text2MCP into your own applications to provide dynamic MCP service generation:

```python
import asyncio

from text2mcp import CodeGenerator, ServiceRunner

async def dynamic_service_creation(description, service_name):
    """Dynamically create and run an MCP service."""
    # Generate code
    generator = CodeGenerator()
    code = generator.generate(description)

    if code:
        # Save to file
        path = generator.save_to_file(code, f"{service_name}.py", "./services")

        # Start the service
        runner = ServiceRunner()
        result = await runner.start_service(path)
        return {"status": "success", "service_path": path, "result": result}
    else:
        return {"status": "error", "message": "Code generation failed"}

# Usage example
asyncio.run(dynamic_service_creation(
    "Create a file processing service that supports reading, writing, and modifying text files.",
    "file_processor"
))
```
## FAQ

### 1. Why isn't my API key working?

Make sure your API key is in the correct format and has sufficient permissions and quota. OpenAI API keys typically start with `sk-`.
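If in doubt, a quick standard-library check confirms the key is at least set and looks plausible (purely illustrative; it only inspects the environment variable):

```python
import os

key = os.environ.get("OPENAI_API_KEY", "")
if not key:
    print("OPENAI_API_KEY is not set")
elif not key.startswith("sk-"):
    print("Key is set but lacks the usual 'sk-' prefix; that can be normal for "
          "third-party providers, but double-check the value")
else:
    print("Key format looks plausible; check permissions and quota with the provider")
```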
### 2. How do I use a custom LLM provider?

As long as the provider offers an OpenAI-compatible interface, you can use it by setting the `base_url` parameter:

```bash
text2mcp config --base-url "https://your-provider.com/v1"
```

Typical configuration examples for third-party providers:

```bash
# Command-line configuration
text2mcp config --api-key "your-provider-key" --base-url "https://api.provider.com/v1" --model "provider-model-name"

# Or use environment variables
export OPENAI_API_KEY="your-provider-key"
export OPENAI_BASE_URL="https://api.provider.com/v1"
export OPENAI_MODEL="provider-model-name"
```

```python
# Or pass parameters directly in code
from text2mcp import CodeGenerator

generator = CodeGenerator(
    api_key="your-provider-key",
    base_url="https://api.provider.com/v1",
    model="provider-model-name"
)
```
### 3. What if the generated code quality is poor?

Try providing a more detailed description or using a more capable model such as GPT-4. You can also improve code quality by creating custom templates.
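For example, a description that names the individual tools and their inputs usually gives the model more to work with than a one-liner. The prompt below is only an illustration; `CodeGenerator`, `generate()`, and `template_file` are the same API shown earlier:

```python
from text2mcp import CodeGenerator

generator = CodeGenerator(model="gpt-4")

# A vague one-liner such as "Create a file service" leaves most decisions to
# the model. A more specific description constrains the generated tools:
detailed = """
Create a file processing service with three MCP tools:
- read_file(path): return the text content of a UTF-8 file
- write_file(path, content): overwrite the file and return the number of bytes written
- append_line(path, line): append a single line and return the new line count
Reject any path outside a configurable base directory with a clear error message.
"""

code = generator.generate(detailed, template_file="my_template.md")
```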
### 4. Why are dependency installations failing?

If installing dependencies with uv fails, make sure uv is installed:

```bash
pip install uv
```

Or add the `--no-uv` flag to fall back to standard pip:

```bash
text2mcp install requests --no-uv
```
## Contributing

Contributions are welcome! Please see the contribution guidelines for how to participate in the project's development.

## License

This project is licensed under the MIT License. See the LICENSE file for details.