
MCP Server for Vertex AI Search

MCP server to search private data in Vertex AI Search.

Tools

  • search: Searches Vertex AI Search over your private data and returns result chunks. The tool returns a dictionary with a "response" key, whose value is a list of dictionaries, each containing the title of the source document and an extracted content chunk. Example:
{
  "response": [
    {
      "title": "Sample Document Title 1",
      "content": "Extracted text segment from the document."
    },
    {
      "title": "Sample Document Title 2",
      "content": "Another extracted text segment."
    }
  ]
}
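As a sketch, a client could flatten this response structure into printable strings. The `format_chunks` helper below is hypothetical, not part of the package:

```python
def format_chunks(search_response):
    """Flatten the search tool's response into "title: content" strings."""
    return [
        f"{item['title']}: {item['content']}"
        for item in search_response["response"]
    ]

sample = {
    "response": [
        {"title": "Sample Document Title 1",
         "content": "Extracted text segment from the document."},
        {"title": "Sample Document Title 2",
         "content": "Another extracted text segment."},
    ]
}
print(format_chunks(sample)[0])
# → Sample Document Title 1: Extracted text segment from the document.
```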

Prerequisites

  1. Install uv from Astral or the GitHub README.
  2. Install Python 3.13 with uv python install 3.13.
  3. Create a Vertex AI Search app (see the Official Document).

Configuration

Add the following to your server configuration:

{
  "mcpServers": {
    "vais-mcp": {
      "command": "uvx",
      "args": ["vais-mcp@latest"],
      "env": {
        "GOOGLE_CLOUD_PROJECT_ID": "<google_cloud_project_id>",
        "VAIS_ENGINE_ID": "<vais_engine_id>"
      }
    }
  }
}

To run with Docker instead, obtain a service account key beforehand and mount its path into the container, configured in your mcp.json:

{
  "mcpServers": {
    "vais-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "GOOGLE_CLOUD_PROJECT_ID",
        "-e",
        "VAIS_ENGINE_ID",
        "-e",
        "USE_MOUNTED_SA_KEY",
        "-v",
        "/your/local/path/to/sa-key.json:/app/secrets/sa-key.json:ro",
        "mrmtsntr/vais-mcp:latest"
      ],
      "env": {
        "GOOGLE_CLOUD_PROJECT_ID": "<google_cloud_project_id>",
        "VAIS_ENGINE_ID": "<vais_engine_id>",
        "USE_MOUNTED_SA_KEY": "true"
      }
    }
  }
}

Note: When using Docker as shown above, ensure the local path /your/local/path/to/sa-key.json correctly points to your service account key file.

Note: You can find the Vertex AI Search engine ID in the app URL:

https://console.cloud.google.com/gen-app-builder/locations/<location>/engines/<engine_id>/overview/system...
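If you want to pull the location and engine ID out of such a console URL programmatically, a small helper could look like this (the `parse_engine_url` function is illustrative, not part of the package):

```python
import re

def parse_engine_url(url):
    """Extract (location, engine_id) from a Vertex AI Search console URL."""
    match = re.search(r"/locations/([^/]+)/engines/([^/]+)", url)
    if match is None:
        raise ValueError(f"not a Vertex AI Search engine URL: {url}")
    return match.group(1), match.group(2)

location, engine_id = parse_engine_url(
    "https://console.cloud.google.com/gen-app-builder/"
    "locations/global/engines/my-engine_123/overview/system"
)
print(location, engine_id)  # → global my-engine_123
```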

Optional Parameters

You can configure the following optional parameters in the environment or server configuration:

  • VAIS_LOCATION: The location of the Vertex AI Search engine. (Default: "global")
  • PAGE_SIZE: The number of documents to retrieve as search results. (Default: 5)
  • MAX_EXTRACTIVE_SEGMENT_COUNT: The maximum number of extractive chunks to retrieve from each document. (Default: 2)
  • LOG_LEVEL: The logging level. (Default: "WARNING")
  • IMPERSONATE_SERVICE_ACCOUNT: The email address of a service account to impersonate for Google Cloud authentication. See the "Google Cloud Authentication" section for details.
  • USE_MOUNTED_SA_KEY: Set to true to authenticate with a service account key file mounted at /app/secrets/sa-key.json inside the container. (Default: false) If false, Application Default Credentials (ADC) are used instead (unless IMPERSONATE_SERVICE_ACCOUNT is set and the mounted key serves as its source credentials). If you set this to true, you must mount your local SA key file to /app/secrets/sa-key.json in the Docker container (e.g., with -v /path/to/your/local-sa-key.json:/app/secrets/sa-key.json in docker run).

Example:

  "env": {
    "GOOGLE_CLOUD_PROJECT_ID": "<google_cloud_project_id>",
    "VAIS_ENGINE_ID": "<vais_engine_id>",
    "VAIS_LOCATION": "us-central1",
    "PAGE_SIZE": "20",
    "MAX_EXTRACTIVE_SEGMENT_COUNT": "8",
    "LOG_LEVEL": "DEBUG",
    "IMPERSONATE_SERVICE_ACCOUNT": "target-sa@project.iam.gserviceaccount.com",
    "USE_MOUNTED_SA_KEY": "true"
  }
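A sketch of how this kind of env-based configuration can be resolved, with defaults as documented above (the `load_config` helper is illustrative, not the package's actual code):

```python
import os

# Defaults as documented above; keys mirror the environment variable names.
DEFAULTS = {
    "VAIS_LOCATION": "global",
    "PAGE_SIZE": "5",
    "MAX_EXTRACTIVE_SEGMENT_COUNT": "2",
    "LOG_LEVEL": "WARNING",
}

def load_config(env=os.environ):
    """Merge environment overrides onto defaults, coercing numeric fields."""
    cfg = {key: env.get(key, default) for key, default in DEFAULTS.items()}
    cfg["PAGE_SIZE"] = int(cfg["PAGE_SIZE"])
    cfg["MAX_EXTRACTIVE_SEGMENT_COUNT"] = int(cfg["MAX_EXTRACTIVE_SEGMENT_COUNT"])
    return cfg

print(load_config({"PAGE_SIZE": "20", "LOG_LEVEL": "DEBUG"}))
```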

Google Cloud Authentication

This MCP server authenticates to Google Cloud using the following methods, taking into account the IMPERSONATE_SERVICE_ACCOUNT and USE_MOUNTED_SA_KEY environment variables:

  • Service Account Impersonation:

    • If the IMPERSONATE_SERVICE_ACCOUNT environment variable is set to the email address of a target service account, the server will attempt to impersonate that service account.
      • If USE_MOUNTED_SA_KEY is true (and a service account key file is mounted to /app/secrets/sa-key.json in the container), the service account key file at /app/secrets/sa-key.json will be used as the source credentials for impersonation.
      • If USE_MOUNTED_SA_KEY is false, Application Default Credentials (ADC) will be used as the source credentials for impersonation.
  • Direct Authentication (No Impersonation):

    • If IMPERSONATE_SERVICE_ACCOUNT is not set:
      • If USE_MOUNTED_SA_KEY is true (and a service account key file is mounted to /app/secrets/sa-key.json), the server will directly use the service account key file at /app/secrets/sa-key.json for authentication.
      • If USE_MOUNTED_SA_KEY is false, the server will use ADC for authentication.
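The selection logic above can be summarized as a small decision function (a hypothetical `resolve_auth_method` helper, not the server's actual implementation):

```python
SA_KEY_PATH = "/app/secrets/sa-key.json"

def resolve_auth_method(env):
    """Describe which credentials the documented flow would select."""
    impersonate = env.get("IMPERSONATE_SERVICE_ACCOUNT")
    use_mounted_key = env.get("USE_MOUNTED_SA_KEY", "false").lower() == "true"
    source = f"key file at {SA_KEY_PATH}" if use_mounted_key else "ADC"
    if impersonate:
        # Impersonation: the key file or ADC only supplies source credentials.
        return f"impersonate {impersonate} using {source} as source credentials"
    return f"authenticate directly with {source}"

print(resolve_auth_method({"USE_MOUNTED_SA_KEY": "true"}))
# → authenticate directly with key file at /app/secrets/sa-key.json
```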

ADC automatically finds credentials from the environment, such as your local user credentials (set up via gcloud auth application-default login) or the service account attached to the compute resource. For more details, see the official documentation.

When using Docker via mcp.json: If you set USE_MOUNTED_SA_KEY to "true" in the env section of your mcp.json configuration, and correctly mount your local service account key file to /app/secrets/sa-key.json using the -v flag within the args section, the mounted service account key will be used for authentication as described in the flows above.

Note:

  • The account used for authentication must have the "Discovery Engine Viewer" role (roles/discoveryengine.viewer). This is required to access Vertex AI Search resources. For more information about roles, see AI Applications roles and permissions.

  • If you are running locally, you can set up ADC by running:

    gcloud auth application-default login
    
  • For production environments, it is recommended to use a service account with the minimum required permissions.

Development

Building

To prepare this package for distribution:

  1. Sync dependencies and update lockfile:
uv sync

Debugging

You can launch the MCP Inspector with the following command (the environment variables precede the command, shell-style):

GOOGLE_CLOUD_PROJECT_ID=<google_cloud_project_id> VAIS_ENGINE_ID=<vais_engine_id> npx @modelcontextprotocol/inspector uvx vais-mcp@latest

