# OpenSearch Solution Architect MCP server — guides you from requirements to a running search setup
This repo uses the Strands framework to build an OpenSearch semantic-search solution-architect agent. The agent collects user requirements and recommends index types.
There are two ways to use the agent: as a standalone interactive CLI or via an MCP server that any MCP-compatible client can drive.
## Standalone Agent
Start the interactive orchestrator in a terminal:
```bash
python opensearch_orchestrator/orchestrator.py
```
The orchestrator guides you through sample collection, requirements gathering, solution planning, and execution — all in one interactive session.
## MCP Server (Cursor, Claude Desktop, etc.)
The MCP server exposes the same orchestrator workflow as a set of phase tools. A client LLM drives the conversation with the user and calls the tools in order.
### Prerequisites
Install uv (one-time, no sudo needed):
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
### Running manually
```bash
uv run opensearch_orchestrator/mcp_server.py
```
uv reads the inline script metadata in `opensearch_orchestrator/mcp_server.py` and auto-installs dependencies into a cached virtual environment.
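That inline metadata is the PEP 723 `# /// script` block that `uv run` understands. As a rough sketch only (the exact dependency list and Python bound in this repo may differ), the top of such a script looks like:

```python
# /// script
# requires-python = ">=3.10"
# dependencies = [
#     "mcp",
#     "opensearch-py",
# ]
# ///
```

uv hashes this block, resolves the dependencies once, and reuses the cached environment on subsequent runs.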
### Running from PyPI (uvx)
After publishing to PyPI, run the MCP server without cloning the repo:
```bash
uvx opensearch-orchestrator@latest
```
If you install via pip, you can also run:
```bash
opensearch-orchestrator
```
**Important:** this command starts a stdio MCP server (JSON-RPC), not an interactive CLI. It should be launched by an MCP client such as Cursor, Claude Desktop, or MCP Inspector. If you want an interactive terminal workflow, run:

```bash
python opensearch_orchestrator/orchestrator.py
```
## MCP workflow tools
The server exposes high-level phase tools that mirror the standalone orchestrator workflow:
| Tool | Phase | Description |
|---|---|---|
| `load_sample` | 1 | Load a sample document (built-in, file, URL, index, or paste); localhost-index mode supports explicit auth mode/credentials |
| `set_preferences` | 2 | Set budget, performance, query pattern, deployment preferences |
| `start_planning` | 3 | Start the planning agent; returns initial architecture proposal |
| `refine_plan` | 3 | Send user feedback to refine the proposal |
| `finalize_plan` | 3 | Finalize the plan when the user confirms |
| `talk_to_client_llm` | 3/4 | General MCP client-sampling bridge for client LLM turns |
| `set_plan_from_planning_complete` | 3 | Parse/store a `<planning_complete>` planner response |
| `execute_plan` | 4 | Return manual worker bootstrap payload (no server-side Bedrock execution in MCP) |
| `set_execution_from_execution_report` | 4 | Parse/store a normalized `<execution_report>` and update retry state |
| `retry_execution` | 4 | Return resume bootstrap payload from the last failed step |
| `cleanup` | Post | Remove test documents on user request |
The following execution/knowledge tools are exposed by default for manual client-driven execution: `create_index`, `create_and_attach_pipeline`, `create_bedrock_embedding_model`, `create_local_pretrained_model`, `apply_capability_driven_verification`, `launch_search_ui`, `set_search_ui_suggestions`, `read_knowledge_base`, `read_dense_vector_models`, `read_sparse_vector_models`, `search_opensearch_org`.
Advanced tools (`set_plan`, raw sample-submit variants, indexing helpers, etc.) are hidden by default and only exposed when `OPENSEARCH_MCP_ENABLE_ADVANCED_TOOLS=true`.
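One way to set that variable is in the server entry of your MCP client config. A sketch in the Cursor-style config shape (assuming your client supports an `env` block, which Cursor and Claude Desktop do):

```json
{
  "mcpServers": {
    "opensearch-orchestrator": {
      "command": "uv",
      "args": ["run", "opensearch_orchestrator/mcp_server.py"],
      "cwd": "/path/to/agent-poc",
      "env": { "OPENSEARCH_MCP_ENABLE_ADVANCED_TOOLS": "true" }
    }
  }
}
```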
Localhost index auth contract (Option 3 / `source_type="localhost_index"`):

- `localhost_auth_mode="default"`: force `admin` / `myStrongPassword123!`
- `localhost_auth_mode="none"`: force no authentication
- `localhost_auth_mode="custom"`: require `localhost_auth_username` + `localhost_auth_password`
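The contract can be mirrored client-side before calling `load_sample`. The helper below is purely illustrative (it is not part of this package); it shows which argument combinations are valid for each mode:

```python
def localhost_auth_args(mode="default", username=None, password=None):
    """Map a localhost_auth_mode choice to load_sample arguments (illustrative only)."""
    if mode == "default":
        # server falls back to the documented local credentials (admin / myStrongPassword123!)
        return {"localhost_auth_mode": "default"}
    if mode == "none":
        return {"localhost_auth_mode": "none"}
    if mode == "custom":
        # custom mode requires both credential fields
        if not (username and password):
            raise ValueError("custom mode requires localhost_auth_username and localhost_auth_password")
        return {
            "localhost_auth_mode": "custom",
            "localhost_auth_username": username,
            "localhost_auth_password": password,
        }
    raise ValueError(f"unknown localhost_auth_mode: {mode!r}")
```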
Planner backend in MCP mode:

- MCP planning uses client sampling / the client LLM only (no Bedrock fallback in MCP mode).
- Manual fallback: if the MCP client does not support `sampling/createMessage`, `start_planning` returns `manual_planning_required=true` plus `manual_planner_system_prompt` and `manual_planner_initial_input`; run planner turns with the client LLM and call `set_plan_from_planning_complete(planner_response)`.
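A client-side sketch of that fallback branch. The function names `call_tool` and `run_client_llm` are hypothetical stand-ins for your MCP client's tool-calling and LLM APIs, and only a single planner turn is shown; a real client would loop on refinement turns until the planner emits `<planning_complete>`:

```python
def run_planning(call_tool, run_client_llm):
    """Drive planning, falling back to manual client-LLM turns when sampling is unsupported."""
    result = call_tool("start_planning", {})
    if not result.get("manual_planning_required"):
        # server handled planning itself via sampling/createMessage
        return result

    # Manual fallback: run the planner prompt with the client LLM,
    # then hand the <planning_complete> response back to the server.
    planner_response = run_client_llm(
        system=result["manual_planner_system_prompt"],
        user=result["manual_planner_initial_input"],
    )
    return call_tool("set_plan_from_planning_complete", {"planner_response": planner_response})
```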
## Cursor integration

1. Add the following to `.cursor/mcp.json` in your workspace (adjust `cwd` to the repo path):

   ```json
   {
     "mcpServers": {
       "opensearch-orchestrator": {
         "command": "uv",
         "args": ["run", "opensearch_orchestrator/mcp_server.py"],
         "cwd": "/path/to/agent-poc"
       }
     }
   }
   ```

2. Reload the Cursor window (`Cmd+Shift+P` → "Developer: Reload Window"), then enable the server in Cursor Settings → MCP.
3. A Cursor rule at `.cursor/rules/opensearch-workflow.mdc` auto-activates when you ask about OpenSearch solution design and teaches the LLM the tool sequence.

If Cursor cannot find `uv` on its PATH, use the absolute path (e.g. `~/.local/bin/uv`).
## Claude Desktop integration

1. Copy `claude_desktop_config.example.json` to your Claude Desktop config directory:
   - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
2. Edit the `cwd` path to point to this repo.
3. Restart Claude Desktop. The `opensearch_workflow` prompt is available in the prompt picker and describes the full tool sequence.
## Generic MCP clients

Any MCP-compatible client can connect via stdio and discover tools with `tools/list`. The `opensearch_workflow` prompt (available via `prompts/list`) describes the workflow. Tool docstrings also include prerequisite hints.
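Over stdio, that discovery is plain newline-delimited JSON-RPC. A minimal opening exchange looks roughly like this (the protocol version and capability fields vary by client and MCP SDK revision, so treat the values as placeholders):

```json
{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2024-11-05", "capabilities": {}, "clientInfo": {"name": "example-client", "version": "0.0.1"}}}
{"jsonrpc": "2.0", "method": "notifications/initialized"}
{"jsonrpc": "2.0", "id": 2, "method": "tools/list"}
```

The response to `tools/list` enumerates the phase tools from the table above, each with its JSON schema.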
## Without uv
If you prefer not to install uv, install dependencies manually and use Python directly:
```bash
pip install mcp opensearch-py
```
Then point your MCP client config at `python3` (same shape as the Cursor example):

```json
{
  "mcpServers": {
    "opensearch-orchestrator": {
      "command": "python3",
      "args": ["opensearch_orchestrator/mcp_server.py"],
      "cwd": "/path/to/agent-poc"
    }
  }
}
```
## Release checklist
Build and validate before publishing:
```bash
# 1) bump version manually (not automatic)
#    update both files to the same value, e.g. 0.10.1
#    - pyproject.toml: [project].version
#    - opensearch_orchestrator/__init__.py: __version__
#
# optional sanity check:
python -c "import tomllib; p=tomllib.load(open('pyproject.toml','rb')); import opensearch_orchestrator as pkg; print('pyproject=', p['project']['version'], 'package=', pkg.__version__)"

# 2) all tests must pass
uv run pytest -q

# 3) build and verify artifacts
uv build
for whl in dist/*.whl; do python -m zipfile -l "$whl"; done
python -c "import opensearch_orchestrator.mcp_server as m; print(hasattr(m, 'main'))"

# pick the wheel for the current package version (avoids selecting older builds)
VERSION="$(python -c "import tomllib; print(tomllib.load(open('pyproject.toml','rb'))['project']['version'])")"
WHEEL_PATH="$(ls dist/opensearch_orchestrator-${VERSION}-*.whl)"
uvx --from "$WHEEL_PATH" opensearch-orchestrator

# 4) upload to PyPI (needs a PyPI account + API token)
uv publish --token pypi-YOUR-TOKEN
```
Optionally publish to TestPyPI first for a smoke test before the final upload to PyPI.