
RMGS MCP Server

An MCP (Model Context Protocol) server for Altair Graph Studio (AGS) that provides intelligent SPARQL query capabilities, ontology management, and graphmart construction tools — all accessible from AI coding assistants like GitHub Copilot, Claude Desktop, and other MCP-compatible clients.

Features

  • SPARQL Query Execution — Run SPARQL queries against AGS graphmarts
  • Knowledge Discovery — Explore ontologies, classes, and properties
  • Ontology Management — Create, modify, and delete ontologies
  • Graphmart Construction — Build and manage transformation layers and steps
  • Agent Memory — Persistent and ephemeral memory for context retention
  • Dual Modes — explore (read-only, 12 tools) and create (all 48 tools)
  • Multiple Transports — stdio, SSE, and streamable-http

Installation

From PyPI

pip install siemens-graph-studio-mcp-server

Note: Installing this package will also install rdflib, which adds rdfpipe and rdfgraphisomorphism executables to your environment. These are standard rdflib CLI utilities and are not part of this MCP server — they can be safely ignored.

From Source

git clone git@code.siemens.com:boris.shalumov/rmgs-mcp-server.git
cd rmgs-mcp-server
pip install -e .

Development Install

pip install -e ".[dev]"

Configuration

The recommended approach is a JSON config file passed via --config. This supports multiple servers, ${ENV_VAR} references for secrets, default server/graphmart selection, and runtime switching — no restart required.

{
  "servers": {
    "production": {
      "host": "your-ags-server.example.com",
      "port": 443,
      "username": "sysadmin",
      "password": "${PROD_PASSWORD}",
      "graphmart_uri": "http://cambridgesemantics.com/Graphmart/abc123",
      "default": true
    },
    "demo": {
      "host": "demo.example.com",
      "port": 8443,
      "username": "sysadmin",
      "password": "${DEMO_PASSWORD}"
    }
  },
  "agent_config": {
    "max_iterations": 3,
    "query_timeout": 30,
    "cache_ontologies": true,
    "ontology_cache_ttl": 86400
  },
  "enable_agent_debug": "false",
  "enable_logging_debug": "false"
}

Secrets referenced as ${ENV_VAR} are resolved from the environment at startup. Pass them via your shell or the MCP client's env block — never commit passwords directly.

The server marked "default": true (and its graphmart_uri if set) is automatically selected at startup. Additional servers can be switched at runtime using the select_server and select_graphmart tools.

Legacy: Direct ANZO_* environment variables (no config file) are still supported for backward compatibility but deprecated. The config file approach is strongly preferred.
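
To illustrate, the ${ENV_VAR} substitution described above can be modeled in a few lines of Python. This is a simplified sketch of the behavior, not the server's actual resolution code, and the PROD_PASSWORD value here is a placeholder:

```python
import json
import os
import re

def resolve_env_refs(obj):
    """Recursively replace ${ENV_VAR} placeholders with values from os.environ.

    Unresolvable references are left as-is rather than raising.
    """
    if isinstance(obj, dict):
        return {key: resolve_env_refs(value) for key, value in obj.items()}
    if isinstance(obj, list):
        return [resolve_env_refs(item) for item in obj]
    if isinstance(obj, str):
        return re.sub(
            r"\$\{(\w+)\}",
            lambda m: os.environ.get(m.group(1), m.group(0)),
            obj,
        )
    return obj

os.environ["PROD_PASSWORD"] = "s3cret"  # normally set by your shell or MCP client
config = json.loads('{"password": "${PROD_PASSWORD}", "port": 443}')
resolved = resolve_env_refs(config)
# resolved["password"] is now "s3cret"; non-string values pass through untouched
```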

Usage

Run as CLI

# stdio transport (default)
siemens-graph-studio-mcp-server

# With config file
siemens-graph-studio-mcp-server --config config/connection.json

# SSE transport
siemens-graph-studio-mcp-server --transport sse --port 8000

# Streamable HTTP
siemens-graph-studio-mcp-server --transport streamable-http --host 0.0.0.0 --port 8000

# Explore mode (read-only tools only)
siemens-graph-studio-mcp-server --mode explore

# Create mode (all tools)
siemens-graph-studio-mcp-server --mode create

Run as Python module

python -m rmgs_mcp_server.ags_sparql_agent --config config/connection.json

VS Code / GitHub Copilot Integration

Create or edit .vscode/mcp.json in your workspace. Use the env block to inject secrets referenced by ${ENV_VAR} in your config file:

{
  "servers": {
    "ags-sparql-agent": {
      "type": "stdio",
      "command": "siemens-graph-studio-mcp-server",
      "args": ["--config", "/path/to/config/config.json"],
      "env": {
        "PROD_PASSWORD": "your-password-here",
        "DEMO_PASSWORD": "your-other-password-here"
      }
    }
  }
}

Or run directly from source without installing the package:

{
  "servers": {
    "ags-sparql-agent": {
      "type": "stdio",
      "command": "python",
      "args": [
        "/absolute/path/to/kg_agent_mcp/rmgs_mcp_server/ags_sparql_agent.py",
        "--config", "/absolute/path/to/kg_agent_mcp/config/config.json"
      ],
      "env": {
        "PROD_PASSWORD": "your-password-here"
      }
    }
  }
}

Reload VS Code after saving. The MCP server will appear in your Copilot Chat tool list.

Using with VS Code Copilot Chat

Once configured, the MCP tools are available in GitHub Copilot Chat (Agent mode). You can:

  • Ask questions about your knowledge graph data
  • Discover ontologies and their structure
  • Execute SPARQL queries
  • Create and manage graphmart transformation layers
  • Build and modify ontologies

Example prompts:

  • "What ontologies are available in this graphmart?"
  • "Show me the classes in the automotive ontology"
  • "Run a SPARQL query to find all vehicles with more than 200 horsepower"
  • "Create a new transformation layer for data enrichment"

Claude Code Integration

Add to .claude.json in your home directory or project root:

{
  "mcpServers": {
    "ags-sparql-agent": {
      "command": "siemens-graph-studio-mcp-server",
      "args": ["--config", "/path/to/config/config.json"],
      "env": {
        "PROD_PASSWORD": "your-password-here"
      }
    }
  }
}

Tool Categories

System & Monitoring

  • test_system_connection — Test MCP server and AGS agent status
  • get_session_logs — Get session logs and interaction history
  • list_servers — List all configured AGS servers
  • select_server — Switch to a different AGS server at runtime
  • list_graphmarts — List all graphmarts on the active server
  • select_graphmart — Switch to a different graphmart at runtime

SPARQL Query Execution

  • execute_sparql_query — Execute SPARQL directly against the graphmart
  • query_ags_configuration — Query graphmart metadata (local volume)
  • update_ags_configuration — Update graphmart metadata with SPARQL

Knowledge Discovery

  • discover_knowledge_overview — Get an overview of available knowledge
  • discover_available_ontologies — List all available ontologies
  • discover_ontology_classes — List classes in a specific ontology
  • discover_class_data_properties — List data properties for a class
  • discover_class_object_properties — List object properties for a class

Ontology Management

  • create_ontology — Create a new ontology
  • delete_ontology — Delete an ontology
  • register_ontology — Register ontologies
  • load_ontology_from_file — Load TTL files into named graphs
  • add_ontology_class — Add a class to an ontology
  • remove_ontology_class — Remove a class from an ontology
  • add_ontology_property — Add a property to an ontology
  • remove_ontology_property — Remove a property from an ontology
  • add_ontology_import — Add an import to an ontology
  • remove_ontology_import — Remove an import from an ontology
  • list_ontology_structure_classes — List classes in an ontology structure
  • list_ontology_structure_properties — List properties in an ontology structure
  • get_ontology_cache_status — Get ontology cache status
  • clear_ontology_cache — Clear ontology caches
  • refresh_ontology_cache — Force a cache refresh

Graphmart Construction

  • create_transformation_layer — Create transformation layers
  • update_transformation_layer — Update layer properties
  • delete_transformation_layer — Delete transformation layers
  • list_transformation_layers — List all transformation layers
  • add_transformation_step — Add transformation steps to layers
  • update_transformation_step — Update transformation step properties
  • delete_transformation_step — Delete transformation steps
  • list_transformation_steps — List steps within a layer
  • add_direct_load_step — Add direct data loading steps
  • update_direct_load_step — Update direct load step properties
  • refresh_graphmart — Lightweight refresh of changed layers
  • reload_graphmart — Complete reprocessing of all layers
  • get_layer_status — Comprehensive layer and step error info
  • get_step_status — Debug a specific step

Agent Memory

  • initialize_agent_memory — Initialize memory for the agent
  • write_permanent_memory — Write to persistent memory
  • write_ephemeral_memory — Write to session-scoped memory
  • promote_ephemeral_memory — Promote ephemeral memory to permanent
  • clear_agent_memory — Clear agent memory
  • read_agent_memory — Read from agent memory

Publishing to PyPI

Build

pip install build twine
python -m build

This creates dist/rmgs_mcp_server-<version>.tar.gz and dist/rmgs_mcp_server-<version>-py3-none-any.whl.

Upload to PyPI

# Test PyPI (recommended first)
twine upload --repository testpypi dist/*

# Production PyPI
twine upload dist/*

Bump Version

Update the version in both:

  • pyproject.tomlversion = "x.y.z"
  • rmgs_mcp_server/__init__.py__version__ = "x.y.z"
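
Because the version lives in two places, a quick consistency check before building can catch a missed bump. A minimal sketch (the helper and the example lines are illustrative, not part of this repo):

```python
import re

def extract_version(line: str) -> str:
    """Pull the quoted version out of a 'version = "x.y.z"' or
    '__version__ = "x.y.z"' style line."""
    match = re.search(r'version_{0,2}\s*=\s*"([^"]+)"', line)
    if match is None:
        raise ValueError(f"no version found in: {line!r}")
    return match.group(1)

# Example lines as they would appear in the two files:
pyproject_line = 'version = "0.5.3"'     # from pyproject.toml
init_line = '__version__ = "0.5.3"'      # from rmgs_mcp_server/__init__.py

assert extract_version(pyproject_line) == extract_version(init_line)
```

In practice you would read the real lines from both files; the assertion fails loudly if the two versions have drifted apart.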

Development

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest

# Build package
python -m build

Project Structure

rmgs-mcp-server/
├── rmgs_mcp_server/           # Main package
│   ├── __init__.py            # Package init with version
│   ├── ags_sparql_agent.py    # MCP server entry point
│   ├── models.py              # Pydantic/dataclass models
│   ├── server_registry.py     # Multi-server configuration manager
│   ├── sparql_agent_core.py   # Core SPARQL agent logic
│   ├── sparql_query_engine.py # SPARQL query engine
│   ├── ontology_cache.py      # Ontology caching
│   ├── ontology_discovery.py  # Ontology discovery
│   ├── interaction_logger.py  # Logging utilities
│   ├── tools/                 # MCP tool implementations
│   │   ├── base_tool.py       # Base tool class
│   │   ├── system/            # System, server & graphmart tools
│   │   ├── query/             # SPARQL query tools
│   │   ├── discovery/         # Knowledge discovery tools
│   │   ├── ontology/          # Ontology management tools
│   │   ├── graphmart/         # Graphmart construction tools
│   │   └── memory/            # Agent memory tools
│   └── utils/                 # Shared utilities
├── config/                    # Configuration templates & examples
├── prompts/                   # Agent prompt templates
├── skills/                    # Best practices guides
├── pyproject.toml             # Package metadata & build config
├── LICENSE                    # MIT License
└── README.md                  # This file

License

MIT
