OpenSearch Solution Architect MCP server — guides you from requirements to a running search setup

This repo uses the strands framework to build an OpenSearch semantic search solution architect agent. The agent gathers your requirements and recommends suitable index types.

There are two ways to use the agent: as a standalone interactive CLI or via an MCP server that any MCP-compatible client can drive.

Standalone Agent

Start the interactive orchestrator in a terminal:

python opensearch_orchestrator/orchestrator.py

The orchestrator guides you through sample collection, requirements gathering, solution planning, and execution — all in one interactive session.

MCP Server (Cursor, Claude Desktop, etc.)

The MCP server exposes the same orchestrator workflow as a set of phase tools. A client LLM drives the conversation with the user and calls the tools in order.

Prerequisites

Install uv (one-time, no sudo needed):

curl -LsSf https://astral.sh/uv/install.sh | sh

Running manually

uv run opensearch_orchestrator/mcp_server.py

uv reads the inline script metadata in opensearch_orchestrator/mcp_server.py and auto-installs dependencies into a cached virtual environment.
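That inline metadata follows PEP 723. As a sketch of what such a header looks like (the actual dependency list and version pins in mcp_server.py may differ; the two packages shown are the ones named in the "Without uv" section below):

```python
# PEP 723 inline script metadata at the top of a script lets uv resolve
# dependencies without a pyproject checkout. Sketch only — the real
# block in this repo may list different packages or pins.
# /// script
# requires-python = ">=3.10"
# dependencies = [
#     "mcp",
#     "opensearch-py",
# ]
# ///
```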

Running from PyPI (uvx)

After publishing to PyPI, run the MCP server without cloning the repo:

uvx opensearch-orchestrator@latest

If you install via pip, you can also run:

opensearch-orchestrator

Important: this command starts a stdio MCP server (JSON-RPC), not an interactive CLI. It should be launched by an MCP client such as Cursor, Claude Desktop, or MCP Inspector. If you want an interactive terminal workflow, run:

python opensearch_orchestrator/orchestrator.py

MCP workflow tools

The server exposes high-level phase tools that mirror the standalone orchestrator workflow:

Tool                             Phase  Description
load_sample                      1      Load a sample document (built-in, file, URL, index, or paste)
set_preferences                  2      Set budget, performance, query pattern, deployment preferences
start_planning                   3      Start the planning agent; returns initial architecture proposal
refine_plan                      3      Send user feedback to refine the proposal
finalize_plan                    3      Finalize the plan when the user confirms
set_plan                         3      Store a client-authored finalized plan (validated manual fallback)
set_plan_from_planning_complete  3      Parse/store a <planning_complete> planner response
execute_plan                     4      Execute the plan (create index, models, pipelines, UI)
retry_execution                  4      Resume from a failed execution step
cleanup_verification             Post   Remove test documents on user request

Low-level domain tools (create_index, submit_sample_doc, etc.) are also exposed for advanced use.
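For orientation, the phase gating a client is expected to respect can be sketched as a small lookup. The tool names and phase numbers come from the table above; the gating helper itself is a hypothetical simplification for illustration, not the server's actual implementation:

```python
# Phase number for each high-level workflow tool (from the table above).
PHASES = {
    "load_sample": 1,
    "set_preferences": 2,
    "start_planning": 3,
    "refine_plan": 3,
    "finalize_plan": 3,
    "set_plan": 3,
    "set_plan_from_planning_complete": 3,
    "execute_plan": 4,
    "retry_execution": 4,
}

def tools_for_next_phase(last_completed_phase: int) -> list[str]:
    """Tools that become relevant once the given phase is complete."""
    return sorted(t for t, p in PHASES.items() if p == last_completed_phase + 1)

# Before any phase has run, only sample loading makes sense:
print(tools_for_next_phase(0))  # ['load_sample']
```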

Planner backend in MCP mode:

  • Default: OPENSEARCH_MCP_PLANNER_MODE=client (uses MCP client sampling / client LLM).
  • Server mode: OPENSEARCH_MCP_PLANNER_MODE=server (uses server-side Bedrock planner).
  • Client fallback: if the MCP client does not support sampling/createMessage, start_planning returns manual_planning_required=true plus manual_planner_system_prompt and manual_planner_initial_input; run planner turns with the client LLM and call set_plan_from_planning_complete(planner_response).
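A client-side sketch of that fallback logic, where run_client_llm stands in for the client's own LLM call and the success-path "proposal" key is an assumed name for illustration (only the manual_* field names above come from the server):

```python
def handle_start_planning(result: dict, run_client_llm) -> tuple[str, bool]:
    """Return (text, needs_set_plan): planner output plus whether the
    client must still call set_plan_from_planning_complete with it."""
    if result.get("manual_planning_required"):
        # Sampling unsupported: run the planner turn with the client's
        # own LLM, using the prompts the server handed back.
        response = run_client_llm(
            result["manual_planner_system_prompt"],
            result["manual_planner_initial_input"],
        )
        return response, True
    # Normal path: the server-side planner already produced a proposal.
    # "proposal" is an assumed key name, for illustration only.
    return result.get("proposal", ""), False
```

When the second element is True, the returned text is the planner response to pass to set_plan_from_planning_complete.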

Cursor integration

  1. Add the following to .cursor/mcp.json in your workspace (adjust cwd to the repo path):
{
  "mcpServers": {
    "opensearch-orchestrator": {
      "command": "uv",
      "args": ["run", "opensearch_orchestrator/mcp_server.py"],
      "cwd": "/path/to/agent-poc"
    }
  }
}
  2. Reload the Cursor window (Cmd+Shift+P → "Developer: Reload Window"), then enable the server in Cursor Settings → MCP.

  3. A Cursor rule at .cursor/rules/opensearch-workflow.mdc auto-activates when you ask about OpenSearch solution design and teaches the LLM the tool sequence.

If Cursor cannot find uv on its PATH, use the absolute path (e.g. ~/.local/bin/uv).

Claude Desktop integration

  1. Copy claude_desktop_config.example.json to your Claude Desktop config directory:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
  2. Edit the cwd path to point to this repo.

  3. Restart Claude Desktop. The opensearch_workflow prompt is available in the prompt picker and describes the full tool sequence.

Generic MCP clients

Any MCP-compatible client can connect via stdio and discover tools with tools/list. The opensearch_workflow prompt (available via prompts/list) describes the workflow. Tool docstrings also include prerequisite hints.
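Discovery over stdio is plain JSON-RPC 2.0. As a sketch, a tools/list request written to the server's stdin has this shape (the id is arbitrary, and a real client performs the MCP initialize handshake first):

```python
import json

# Minimal JSON-RPC 2.0 request an MCP client writes to the server's
# stdin to discover the tools above. A real session sends an
# "initialize" request before this one.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
line = json.dumps(request)
print(line)
```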

Without uv

If you prefer not to install uv, install dependencies manually and use Python directly:

pip install mcp opensearch-py

Then point your MCP client at python3 instead of uv:
{
  "mcpServers": {
    "opensearch-orchestrator": {
      "command": "python3",
      "args": ["opensearch_orchestrator/mcp_server.py"],
      "cwd": "/path/to/agent-poc"
    }
  }
}

Release checklist

Build and validate before publishing:

# 1) bump version manually (not automatic)
#    update both files to the same value, e.g. 0.10.1
#    - pyproject.toml: [project].version
#    - opensearch_orchestrator/__init__.py: __version__
#
# optional sanity check:
python -c "import tomllib; p=tomllib.load(open('pyproject.toml','rb')); import opensearch_orchestrator as pkg; print('pyproject=', p['project']['version'], 'package=', pkg.__version__)"

# 2) all tests must pass
uv run pytest -q

# 3) build and verify artifacts
uv build
python -m zipfile -l dist/*.whl
python -c "import opensearch_orchestrator.mcp_server as m; print(hasattr(m, 'main'))"
uvx --from dist/*.whl opensearch-orchestrator

# 4) upload to PyPI (needs a PyPI account + API token)
uv publish --token pypi-YOUR-TOKEN

For extra safety, publish to TestPyPI first and smoke-test the upload before the final PyPI publish.
