RMGS MCP Server
An MCP (Model Context Protocol) server for Altair Graph Studio (AGS) that provides intelligent SPARQL query capabilities, ontology management, and graphmart construction tools — all accessible from AI coding assistants like GitHub Copilot, Claude Desktop, and other MCP-compatible clients.
Features
- SPARQL Query Execution — Run SPARQL queries against AGS graphmarts
- Knowledge Discovery — Explore ontologies, classes, and properties
- Ontology Management — Create, modify, and delete ontologies
- Graphmart Construction — Build and manage transformation layers and steps
- Agent Memory — Persistent and ephemeral memory for context retention
- Dual Modes — explore (read-only, 8 tools) and create (all 40+ tools)
- Multiple Transports — stdio, SSE, and streamable-http
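The explore/create split can be pictured as a filter over the tool registry. A minimal sketch (the tool names and `read_only` flags below are illustrative, not the package's actual registry):

```python
# Illustrative sketch of explore vs. create mode tool filtering.
# Tool names and flags are hypothetical, not the package's real registry.
TOOLS = {
    "execute_sparql_query": {"read_only": True},
    "discover_available_ontologies": {"read_only": True},
    "create_ontology": {"read_only": False},
    "delete_transformation_layer": {"read_only": False},
}

def tools_for_mode(mode: str) -> list[str]:
    """Return the tool names exposed in the given mode."""
    if mode == "explore":  # read-only subset
        return [name for name, tool in TOOLS.items() if tool["read_only"]]
    return list(TOOLS)  # "create" exposes everything
```

In this sketch, explore mode simply hides every tool that can mutate state, which is why it surfaces only a fraction of the full tool set.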
Installation
From PyPI
pip install rmgs-mcp-server
From Source
git clone git@code.siemens.com:boris.shalumov/rmgs-mcp-server.git
cd rmgs-mcp-server
pip install -e .
Development Install
pip install -e ".[dev]"
Configuration
The server requires the following environment variables:
| Variable | Description | Required |
|---|---|---|
| ANZO_SERVER | AGS server hostname | Yes |
| ANZO_PORT | AGS server port | Yes |
| ANZO_USERNAME | Anzo username | Yes |
| ANZO_PASSWORD | Anzo password | Yes |
| GRAPHMART_URI | Graphmart URI | Yes |
| API_KEY | OpenAI API key | Yes |
| ENABLE_AGENT_DEBUG | Enable agent debug logs (true/false) | No |
| ENABLE_LOGGING_DEBUG | Enable verbose logging (true/false) | No |
You can also pass a JSON/JSONC config file with --config:
rmgs-mcp-server --config config/connection.json
Example config file (config/connection.json):
{
"anzo_server": "your-ags-server",
"anzo_port": 443,
"anzo_username": "user",
"anzo_password": "password",
"graphmart_uri": "http://cambridgesemantics.com/Graphmart/your-id",
"api_key": "sk-...",
"mcp_transport": "stdio"
}
Usage
Run as CLI
# stdio transport (default)
rmgs-mcp-server
# With config file
rmgs-mcp-server --config config/connection.json
# SSE transport
rmgs-mcp-server --transport sse --port 8000
# Streamable HTTP
rmgs-mcp-server --transport streamable-http --host 0.0.0.0 --port 8000
# Explore mode (read-only tools only)
rmgs-mcp-server --mode explore
# Create mode (all tools)
rmgs-mcp-server --mode create
Run as Python module
python -m rmgs_mcp_server.ags_sparql_agent --config config/connection.json
VS Code Integration
Option 1: Install from PyPI (Recommended)
- Install the package in your Python environment:

  pip install rmgs-mcp-server

- Add the MCP server to your VS Code settings. Create or edit .vscode/mcp.json in your workspace:

  {
    "servers": {
      "rmgs-mcp-server": {
        "type": "stdio",
        "command": "rmgs-mcp-server",
        "args": ["--config", "/path/to/your/connection.json"]
      }
    }
  }
Or with inline environment variables:
  {
    "servers": {
      "rmgs-mcp-server": {
        "type": "stdio",
        "command": "rmgs-mcp-server",
        "args": [],
        "env": {
          "ANZO_SERVER": "your-ags-server",
          "ANZO_PORT": "443",
          "ANZO_USERNAME": "your-username",
          "ANZO_PASSWORD": "your-password",
          "GRAPHMART_URI": "http://cambridgesemantics.com/Graphmart/your-id",
          "API_KEY": "your-openai-api-key"
        }
      }
    }
  }
- Reload VS Code. The MCP server will appear in your Copilot Chat tool list.
Option 2: Run from Source
- Clone and install in editable mode:

  git clone git@code.siemens.com:boris.shalumov/rmgs-mcp-server.git
  cd rmgs-mcp-server
  pip install -e .
- Use the same .vscode/mcp.json configuration as Option 1.
Option 3: Direct Python Execution (No Install)
If you prefer not to install the package, point directly to the script:
{
"servers": {
"rmgs-mcp-server": {
"type": "stdio",
"command": "python",
"args": ["/absolute/path/to/rmgs-mcp-server/rmgs_mcp_server/ags_sparql_agent.py"],
"env": {
"PYTHONPATH": "/absolute/path/to/rmgs-mcp-server",
"ANZO_SERVER": "your-ags-server",
"ANZO_PORT": "443",
"ANZO_USERNAME": "your-username",
"ANZO_PASSWORD": "your-password",
"GRAPHMART_URI": "http://cambridgesemantics.com/Graphmart/your-id",
"API_KEY": "your-openai-api-key"
}
}
}
}
Using with VS Code Copilot Chat
Once configured, the MCP tools are available in GitHub Copilot Chat (Agent mode). You can:
- Ask questions about your knowledge graph data
- Discover ontologies and their structure
- Execute SPARQL queries
- Create and manage graphmart transformation layers
- Build and modify ontologies
Example prompts:
- "What ontologies are available in this graphmart?"
- "Show me the classes in the automotive ontology"
- "Run a SPARQL query to find all vehicles with more than 200 horsepower"
- "Create a new transformation layer for data enrichment"
Claude Desktop Integration
Add to ~/.config/claude-desktop/claude_desktop_config.json:
{
"mcpServers": {
"rmgs-mcp-server": {
"command": "rmgs-mcp-server",
"args": ["--config", "/path/to/connection.json"],
"env": {}
}
}
}
Tool Categories
System & Monitoring
| Tool | Description |
|---|---|
| test_system_connection | Test MCP server and AGS agent status |
| get_session_logs | Get session logs and interaction history |
SPARQL Query Execution
| Tool | Description |
|---|---|
| execute_sparql_query | Execute SPARQL directly against graphmart |
| query_ags_configuration | Query graphmart metadata (local volume) |
| update_ags_configuration | Update graphmart metadata with SPARQL |
Knowledge Discovery
| Tool | Description |
|---|---|
| discover_knowledge_overview | Get overview of available knowledge |
| discover_available_ontologies | List all available ontologies |
| discover_ontology_classes | List classes in a specific ontology |
| discover_class_data_properties | List data properties for a class |
| discover_class_object_properties | List object properties for a class |
Ontology Management
| Tool | Description |
|---|---|
| create_ontology | Create a new ontology |
| delete_ontology | Delete an ontology |
| register_ontology | Register ontologies |
| load_ontology_from_file | Load TTL files into named graphs |
| add_ontology_class | Add a class to an ontology |
| remove_ontology_class | Remove a class from an ontology |
| add_ontology_property | Add a property to an ontology |
| remove_ontology_property | Remove a property from an ontology |
| add_ontology_import | Add an import to an ontology |
| remove_ontology_import | Remove an import from an ontology |
| list_ontology_structure_classes | List classes in ontology structure |
| list_ontology_structure_properties | List properties in ontology structure |
| get_ontology_cache_status | Get ontology cache status |
| clear_ontology_cache | Clear ontology caches |
| refresh_ontology_cache | Force cache refresh |
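The three cache tools suggest a cache with both time-based expiry and manual invalidation. A toy sketch of how such a cache might behave (hypothetical, not the package's actual implementation):

```python
import time

class OntologyCache:
    """Toy TTL cache sketch; not the package's real implementation."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._entries: dict[str, tuple[float, object]] = {}

    def get(self, uri, fetch):
        """Return the cached value for uri, refetching if stale or missing."""
        now = time.monotonic()
        entry = self._entries.get(uri)
        if entry and now - entry[0] < self.ttl:
            return entry[1]
        value = fetch(uri)
        self._entries[uri] = (now, value)
        return value

    def clear(self):
        """Drop all cached entries (cf. clear_ontology_cache)."""
        self._entries.clear()

    def refresh(self, uri, fetch):
        """Force a refetch regardless of age (cf. refresh_ontology_cache)."""
        self._entries.pop(uri, None)
        return self.get(uri, fetch)
```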
Graphmart Construction
| Tool | Description |
|---|---|
| create_transformation_layer | Create transformation layers |
| update_transformation_layer | Update layer properties |
| delete_transformation_layer | Delete transformation layers |
| list_transformation_layers | List all transformation layers |
| add_transformation_step | Add transformation steps to layers |
| update_transformation_step | Update transformation step properties |
| delete_transformation_step | Delete transformation steps |
| list_transformation_steps | List steps within a layer |
| add_direct_load_step | Add direct data loading steps |
| update_direct_load_step | Update direct load step properties |
| refresh_graphmart | Lightweight refresh of changed layers |
| reload_graphmart | Complete reprocessing of all layers |
| get_layer_status | Comprehensive layer and step error info |
| get_step_status | Specific step debugging |
Agent Memory
| Tool | Description |
|---|---|
| initialize_agent_memory | Initialize memory for the agent |
| write_permanent_memory | Write to persistent memory |
| write_ephemeral_memory | Write to session-scoped memory |
| promote_ephemeral_memory | Promote ephemeral to permanent memory |
| clear_agent_memory | Clear agent memory |
| read_agent_memory | Read from agent memory |
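The permanent/ephemeral split and the promote operation can be sketched as two key-value stores, one of which survives sessions. This is a conceptual sketch only, not the package's implementation; the shadowing rule in read() is an assumption:

```python
class AgentMemory:
    """Conceptual sketch of permanent vs. ephemeral agent memory.
    Hypothetical: the real tools may store and resolve keys differently."""

    def __init__(self):
        self.permanent: dict[str, str] = {}   # survives across sessions
        self.ephemeral: dict[str, str] = {}   # scoped to one session

    def write(self, key: str, value: str, permanent: bool = False):
        (self.permanent if permanent else self.ephemeral)[key] = value

    def promote(self, key: str):
        """Move an ephemeral entry into permanent memory
        (cf. promote_ephemeral_memory)."""
        self.permanent[key] = self.ephemeral.pop(key)

    def read(self, key: str):
        # Assumption: session-local entries shadow permanent ones.
        return self.ephemeral.get(key, self.permanent.get(key))
```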
Publishing to PyPI
Build
pip install build twine
python -m build
This creates dist/rmgs_mcp_server-<version>.tar.gz and dist/rmgs_mcp_server-<version>-py3-none-any.whl.
Upload to PyPI
# Test PyPI (recommended first)
twine upload --repository testpypi dist/*
# Production PyPI
twine upload dist/*
Bump Version
Update the version in both:
- pyproject.toml → version = "x.y.z"
- rmgs_mcp_server/__init__.py → __version__ = "x.y.z"
Development
# Install in development mode
pip install -e ".[dev]"
# Run tests
pytest
# Build package
python -m build
Project Structure
rmgs-mcp-server/
├── rmgs_mcp_server/ # Main package
│ ├── __init__.py # Package init with version
│ ├── ags_sparql_agent.py # MCP server entry point
│ ├── models.py # Pydantic/dataclass models
│ ├── sparql_agent_core.py # Core SPARQL agent logic
│ ├── sparql_query_engine.py # SPARQL query engine
│ ├── ontology_cache.py # Ontology caching
│ ├── ontology_discovery.py # Ontology discovery
│ ├── interaction_logger.py # Logging utilities
│ ├── tools/ # MCP tool implementations
│ │ ├── base_tool.py # Base tool class
│ │ ├── system/ # System & monitoring tools
│ │ ├── query/ # SPARQL query tools
│ │ ├── discovery/ # Knowledge discovery tools
│ │ ├── ontology/ # Ontology management tools
│ │ ├── graphmart/ # Graphmart construction tools
│ │ └── memory/ # Agent memory tools
│ └── utils/ # Shared utilities
├── config/ # Configuration templates
├── prompts/ # Agent prompt templates
├── skills/ # Best practices guides
├── pyproject.toml # Package metadata & build config
├── LICENSE # MIT License
└── README.md # This file
License
MIT