Ceph MCP Server

A Model Context Protocol (MCP) server that enables AI assistants to interact with Ceph storage clusters through natural language. This server provides a bridge between AI tools and your Ceph infrastructure, making storage management more accessible and intuitive.

🚀 Features

  • Health Monitoring: Get comprehensive cluster health status and diagnostics
  • Host Management: Monitor and manage cluster hosts and their services
  • Detailed Analysis: Access detailed health checks for troubleshooting
  • Secure Communication: Authenticated access to Ceph Manager API
  • Structured Responses: AI-friendly output formatting for clear communication
  • Async Architecture: Non-blocking operations for better performance
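The non-blocking design can be sketched with plain `asyncio`; the endpoint paths and the stubbed fetch below are illustrative assumptions, not the server's actual client code:

```python
import asyncio


async def fetch_endpoint(path: str) -> dict:
    """Stand-in for an async HTTP call to the Ceph Manager API."""
    await asyncio.sleep(0.05)  # simulated network latency
    return {"path": path, "status": "ok"}


async def gather_cluster_state() -> list:
    # Health and host queries run concurrently rather than back-to-back.
    return await asyncio.gather(
        fetch_endpoint("/api/health/minimal"),
        fetch_endpoint("/api/host"),
    )


results = asyncio.run(gather_cluster_state())
```

Because both coroutines are awaited together, total latency is roughly one round trip instead of two.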

📋 Prerequisites

  • Python 3.11 or higher
  • UV package manager
  • Access to a Ceph cluster with Manager API enabled
  • Valid Ceph credentials with appropriate permissions

๐Ÿ› ๏ธ Installation

  1. Clone and setup the project:
# Create the project directory
mkdir ceph-mcp-server
cd ceph-mcp-server

# Initialize UV project
uv init --python 3.11

# Add dependencies
uv add mcp httpx pydantic python-dotenv structlog asyncio-mqtt
uv add --dev pytest pytest-asyncio black isort mypy ruff
  2. Set up your environment:
# Copy the example environment file
cp .env.example .env

# Edit .env with your Ceph cluster details
nano .env
  3. Configure your Ceph connection:
# .env file contents
CEPH_MANAGER_URL=https://192.16.0.31:8443
CEPH_USERNAME=admin
CEPH_PASSWORD=your_ceph_password
CEPH_SSL_VERIFY=false  # Set to true in production with proper certificates

๐Ÿƒโ€โ™‚๏ธ Quick Start

  1. Start the MCP server:
uv run python -m ceph_mcp.server
  2. Test the connection: The server will log its startup and any connection issues. Look for messages indicating a successful connection to your Ceph cluster.

🔧 Configuration

Environment Variables

| Variable | Description | Default | Required |
|----------|-------------|---------|----------|
| CEPH_MANAGER_URL | Ceph Manager API endpoint | https://192.16.0.31:8443 | Yes |
| CEPH_USERNAME | Ceph username for API access | admin | Yes |
| CEPH_PASSWORD | Ceph password for authentication | - | Yes |
| CEPH_SSL_VERIFY | Enable SSL certificate verification | true | No |
| CEPH_CERT_PATH | Path to custom SSL certificate | - | No |
| LOG_LEVEL | Logging level (DEBUG, INFO, WARNING, ERROR) | INFO | No |
| MAX_REQUESTS_PER_MINUTE | Rate limiting for API requests | 60 | No |
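The project depends on pydantic, so the real settings module presumably validates these variables there. As a rough stdlib-only sketch of the same idea (the class and field names are assumptions, not the project's actual API):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class CephSettings:
    """Hypothetical settings object built from the environment variables above."""

    manager_url: str
    username: str
    password: str
    ssl_verify: bool = True
    log_level: str = "INFO"

    @classmethod
    def from_env(cls) -> "CephSettings":
        # Required variables raise KeyError if missing; optional ones fall back
        # to the documented defaults.
        return cls(
            manager_url=os.environ["CEPH_MANAGER_URL"],
            username=os.environ["CEPH_USERNAME"],
            password=os.environ["CEPH_PASSWORD"],
            ssl_verify=os.environ.get("CEPH_SSL_VERIFY", "true").lower() == "true",
            log_level=os.environ.get("LOG_LEVEL", "INFO"),
        )
```

Failing fast on missing required variables keeps misconfiguration visible at startup rather than at the first API call.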

Security Considerations

  • Production Usage: Always enable SSL verification (CEPH_SSL_VERIFY=true) in production
  • Credentials: Store credentials securely and never commit them to version control
  • Network Access: Ensure the MCP server can reach your Ceph Manager API endpoint
  • Permissions: Use a dedicated Ceph user with minimal required permissions

🎯 Available Tools

The MCP server provides four main tools for AI assistants:

1. get_cluster_health

Get comprehensive cluster health status including overall health, warnings, and statistics.

Use cases:

  • "How is my Ceph cluster doing?"
  • "Are there any storage issues I should know about?"
  • "What's the current status of my cluster?"

2. get_host_status

Retrieve information about all hosts in the cluster including online/offline status and service distribution.

Use cases:

  • "Which hosts are online in my cluster?"
  • "What services are running on each host?"
  • "Are any hosts having problems?"

3. get_health_details

Get detailed health check information for troubleshooting specific issues.

Use cases:

  • "What specific warnings does my cluster have?"
  • "Give me detailed information about cluster errors"
  • "Help me troubleshoot this storage issue"

4. get_host_details

Get comprehensive information about a specific host.

Parameters:

  • hostname: The name of the host to examine

Use cases:

  • "Tell me about host ceph-node-01"
  • "What services are running on this specific host?"
  • "Get detailed specs for this host"
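Under the hood, an MCP server routes each tool call by name to a handler coroutine. A minimal, hypothetical dispatch sketch (the handler bodies are stubs; the real handlers query the Manager API):

```python
import asyncio
from typing import Awaitable, Callable


# Stub handlers standing in for the real Ceph-backed implementations.
async def get_cluster_health() -> str:
    return "HEALTH_OK"


async def get_host_details(hostname: str) -> str:
    return f"Details for {hostname}"


TOOLS: dict[str, Callable[..., Awaitable[str]]] = {
    "get_cluster_health": get_cluster_health,
    "get_host_details": get_host_details,
}


async def call_tool(name: str, **kwargs: str) -> str:
    """Look up a tool by name and await its handler with the given arguments."""
    if name not in TOOLS:
        raise ValueError(f"Unknown tool: {name}")
    return await TOOLS[name](**kwargs)


result = asyncio.run(call_tool("get_host_details", hostname="ceph-node-01"))
```

The `hostname` parameter mirrors the one documented for get_host_details; parameterless tools are simply called with no arguments.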

📊 Example Interactions

Health Check

AI Assistant: "How is my Ceph cluster doing?"

Response: ✅ Cluster is healthy. All 3 hosts are online. OSDs: 12/12 up.
🟢 Overall Status: HEALTH_OK
🖥️ Hosts: 3/3 online
💾 OSDs: 12/12 up

Troubleshooting

AI Assistant: "What warnings does my cluster have?"

Response: 🟡 Cluster has 2 warning(s) requiring attention.
🟡 Warnings requiring attention:
   - OSD_NEARFULL: 1 osd(s) are getting full
   - POOL_BACKFILLFULL: 1 pool(s) are backfill full
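Responses like these are rendered from raw health data; a simplified sketch of the kind of formatting involved (the function name and exact wording are assumptions):

```python
def format_health_summary(status: str, warnings: list) -> str:
    """Render a health payload as a human-readable summary like the ones above."""
    if status == "HEALTH_OK":
        return "Cluster is healthy."
    lines = [f"Cluster has {len(warnings)} warning(s) requiring attention."]
    lines.extend(f"   - {w}" for w in warnings)
    return "\n".join(lines)


summary = format_health_summary(
    "HEALTH_WARN",
    ["OSD_NEARFULL: 1 osd(s) are getting full"],
)
```

Keeping the formatting in one place means every tool returns consistently structured text for the AI assistant to relay.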

🧪 Development

Running Tests

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=ceph_mcp

# Run specific test types
uv run pytest -m "not integration"  # Skip integration tests

Code Quality

# Format code
uv run black src/ tests/
uv run isort src/ tests/

# Lint code
uv run ruff check src/ tests/
uv run mypy src/

# All checks
uv run ruff check src/ tests/ && uv run mypy src/ && uv run pytest

Project Structure

ceph-mcp-server/
├── src/ceph_mcp/
│   ├── __init__.py          # Package initialization
│   ├── server.py            # Main MCP server
│   ├── api/
│   │   └── ceph_client.py   # Ceph API client
│   ├── config/
│   │   └── settings.py      # Configuration management
│   ├── handlers/
│   │   └── health_handlers.py # Request handlers
│   ├── models/
│   │   └── ceph_models.py   # Data models
│   └── utils/               # Utility functions
├── tests/                   # Test suite
├── .env.example             # Environment template
├── pyproject.toml           # Project configuration
└── README.md                # This file

๐Ÿ› Troubleshooting

Common Issues

  1. Connection Refused

    • Check if Ceph Manager is running and accessible
    • Verify the URL and port in your configuration
    • Ensure network connectivity between MCP server and Ceph cluster
  2. Authentication Failed

    • Verify username and password are correct
    • Check that the user has appropriate permissions
    • Ensure the Ceph user account is active
  3. SSL Certificate Errors

    • For development: Set CEPH_SSL_VERIFY=false
    • For production: Use proper SSL certificates or specify CEPH_CERT_PATH
  4. Permission Denied

    • Ensure the Ceph user has read permissions for health and host information
    • Check Ceph user capabilities: ceph auth get client.your-username

Debugging

Enable debug logging to get more detailed information:

LOG_LEVEL=DEBUG uv run python -m ceph_mcp.server

๐Ÿค Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature-name
  3. Make your changes and add tests
  4. Run the test suite: uv run pytest
  5. Format code: uv run black src/ tests/
  6. Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

📞 Support

  • Create an issue for bug reports or feature requests
  • Check existing issues before creating new ones
  • Provide detailed information about your environment when reporting issues
