# rubrik-security-cloud-python-graphql-client

Low-level Python bindings for the Rubrik Security Cloud (RSC) GraphQL API. Provides authenticated GraphQL execution via sgqlc, with OAuth2 token management and a generated typed schema, so you never have to write raw GraphQL strings.
## Installation

```shell
pip install rsc-client
```

To install directly from this repo:

```shell
pip install git+https://github.com/rubrikinc/rubrik-security-cloud-python-graphql-client.git
```
## Authentication

### Service account file (recommended)

Download a service account JSON file from the RSC UI (Access Control → Service Accounts) and pass it to the client:

```python
from rsc import RSCClient

client = RSCClient(service_account_file="~/Downloads/my-service-account.json")
```

Or set an environment variable and call `RSCClient()` with no arguments:

```shell
export RSC_SERVICE_ACCOUNT_FILE=~/Downloads/my-service-account.json
```
### Config file (`~/.rsc/config.json`)

If no environment variables are set, the client falls back to `~/.rsc/config.json`:

```json
{
  "url": "https://myaccount.my.rubrik.com",
  "client_id": "client|...",
  "client_secret": "..."
}
```

You can also point this file at a service account file:

```json
{
  "service_account_file": "/path/to/service-account.json"
}
```
### Environment variables

| Variable | Description |
|---|---|
| `RSC_SERVICE_ACCOUNT_FILE` | Path to a service account JSON file |
| `RSC_URL` | RSC base URL |
| `RSC_CLIENT_ID` | OAuth2 client ID |
| `RSC_CLIENT_SECRET` | OAuth2 client secret |

Precedence: `RSC_SERVICE_ACCOUNT_FILE` → `RSC_URL`/`RSC_CLIENT_ID`/`RSC_CLIENT_SECRET` → `~/.rsc/config.json`.
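The precedence order above can be illustrated with a small resolver. `resolve_credentials` is a hypothetical helper written for this sketch, not the client's actual internal API; it only demonstrates the documented fall-through from environment variables to the config file:

```python
import json
import os


def resolve_credentials(env, config_path="~/.rsc/config.json"):
    """Sketch of the documented precedence: service account file env var,
    then explicit URL/ID/secret env vars, then the config file."""
    # 1. A service account file wins outright.
    if env.get("RSC_SERVICE_ACCOUNT_FILE"):
        return {"service_account_file": env["RSC_SERVICE_ACCOUNT_FILE"]}
    # 2. Otherwise, all three explicit credential variables must be present.
    if all(env.get(k) for k in ("RSC_URL", "RSC_CLIENT_ID", "RSC_CLIENT_SECRET")):
        return {
            "url": env["RSC_URL"],
            "client_id": env["RSC_CLIENT_ID"],
            "client_secret": env["RSC_CLIENT_SECRET"],
        }
    # 3. Finally, fall back to the config file.
    path = os.path.expanduser(config_path)
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    raise RuntimeError("No RSC credentials found")
```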
## Usage
RSCClient.execute() accepts either a raw GraphQL string or an sgqlc Operation. The sgqlc approach is recommended — it gives you typed, auto-completed Python objects and catches field name errors before the request is sent.
### Query example — list SLA domains

**Raw GraphQL string:**

```python
result = client.execute("""
query {
  slaDomains {
    nodes {
      id
      name
    }
  }
}
""")

for node in result['data']['slaDomains']['nodes']:
    print(node['id'], node['name'])
```

**sgqlc Operation:**

```python
from sgqlc.operation import Operation
from rsc.schema import Query

op = Operation(Query)
nodes = op.sla_domains().nodes()
nodes.__fields__('id', 'name')
result = client.execute(op)

# Deserialize into typed objects
data = (op + result).sla_domains
for node in data.nodes:
    print(node.id, node.name)
```
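The raw-string call returns the standard GraphQL response envelope, and per the GraphQL spec any server-side errors arrive under an `errors` key alongside (or instead of) `data`. A defensive caller might check that before indexing into the payload; `unwrap` below is an illustrative helper, not part of the package, and the exception type is this sketch's choice:

```python
def unwrap(result):
    """Return the `data` payload of a GraphQL response dict, raising if
    the server reported errors (per the GraphQL spec's response format)."""
    if result.get("errors"):
        messages = "; ".join(
            e.get("message", "<no message>") for e in result["errors"]
        )
        raise RuntimeError(f"GraphQL errors: {messages}")
    return result["data"]
```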
### Mutation example — assign an SLA domain

**Raw GraphQL string:**

```python
result = client.execute("""
mutation {
  assignSla(input: {
    objectIds: ["<object-id>"],
    slaDomainAssignType: PROTECT,
    slaOptionalId: "<sla-id>"
  }) {
    success
  }
}
""")

print(result['data']['assignSla']['success'])
```

**sgqlc Operation:**

```python
from sgqlc.operation import Operation
from rsc.schema import Mutation, AssignSlaInput, SlaAssignTypeEnum

op = Operation(Mutation)
result_field = op.assign_sla(input=AssignSlaInput(
    object_ids=["<object-id>"],
    sla_domain_assign_type=SlaAssignTypeEnum.PROTECT,
    sla_optional_id="<sla-id>",
))
result_field.__fields__('success')
result = client.execute(op)

data = (op + result).assign_sla
print(data.success)
```
## Discovery index

The package ships two pre-generated JSON indexes built from the GraphQL SDL: `mcp_index.json` (all queries and mutations with their argument signatures) and `mcp_types.json` (all named types with their fields, enum values, or union members). These are parsed once at import time and cached in memory.
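The parse-once-and-cache pattern described above can be sketched with `functools.lru_cache`; `load_index` is an illustrative stand-in, not the package's actual loader:

```python
import json
from functools import lru_cache


@lru_cache(maxsize=None)
def load_index(path):
    """Parse a JSON index file exactly once per process; repeated calls
    with the same path return the same cached dict."""
    with open(path) as f:
        return json.load(f)
```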
### Why it exists
A common mistake when building an MCP server for a GraphQL API is to create one MCP tool per operation — a pattern that produces thousands of redundant tools and defeats the purpose of both technologies. GraphQL was designed so that a single endpoint can express any query or mutation; MCP tools should reflect that by exposing a small, generic surface: one tool to search operations, one to describe an operation, one to execute it. The LLM then does what it's good at — using those tools to discover and compose the right call at runtime.
The discovery index makes this practical. The RSC schema is large, and an LLM needs a fast way to answer "what operations exist and how do I call them?" without parsing the raw SDL on every request. The indexes are pre-built by CI whenever the schema changes and committed into the package, so discovery works instantly with no credentials, no network access, and no heavy runtime dependencies.
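The small, generic tool surface described above can be sketched as a dispatch table. `make_tools` and `call_tool` are hypothetical names invented for this sketch; in a real MCP server the handlers would wire through to the package's `search_operations`, `describe_operation`, and `client.execute`:

```python
def make_tools(search_fn, describe_fn, execute_fn):
    """Build the small, generic tool surface: search, describe, execute.
    The three callables are injected so the dispatch logic stays testable."""
    return {
        "search_operations": lambda args: search_fn(
            args["search"], args.get("operation_type", "all")
        ),
        "describe_operation": lambda args: describe_fn(
            args["name"], args["operation_type"]
        ),
        "execute_graphql": lambda args: execute_fn(args["query"]),
    }


def call_tool(tools, name, args):
    """Dispatch one tool call by name."""
    if name not in tools:
        raise KeyError(f"unknown tool: {name}")
    return tools[name](args)
```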
### Functions

```python
from rsc import (
    search_operations,   # full-text search across names + descriptions
    describe_operation,  # full argument signature for one operation
    describe_type,       # fields/values for any named type
    list_queries,        # all query names
    list_mutations,      # all mutation names
    list_types,          # all type names
)
```
### `search_operations(search, operation_type="all")`

Case-insensitive substring search across operation names and descriptions. Useful for finding the right operation when you know roughly what you're looking for.

```python
search_operations("snapshot", "query")
# [{"name": "...", "type": "query", "description": "...", "return_type": "..."}, ...]

search_operations("assign", "mutation")
```
### `describe_operation(name, operation_type)`

Returns the full argument signature for a single query or mutation. Operation names are camelCase, as they appear in GraphQL (e.g. `vSphereVmNewConnection`).

```python
op = describe_operation("slaDomains", "query")
# {
#   "name": "slaDomains",
#   "type": "query",
#   "description": "...",
#   "return_type": "SlaDomainConnection",
#   "args": {
#     "filter": {"type": "[Filter!]", "description": "..."},
#     ...
#   }
# }
```
### `describe_type(name)`

Returns the fields (with types and descriptions) for object/input/interface types, the possible values for enums, or the member types for unions.

```python
describe_type("CreateGlobalSlaInput")
# {"name": "CreateGlobalSlaInput", "kind": "input", "fields": {"name": {"type": "String!", ...}, ...}}

describe_type("SlaAssignTypeEnum")
# {"name": "SlaAssignTypeEnum", "kind": "enum", "values": ["PROTECT", "UNPROTECT", ...]}
```
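The discovery functions compose naturally: describe an operation, then describe its return type. The sketch below passes the two functions in explicitly so the flow is self-contained; with the package installed you would pass `describe_operation` and `describe_type` directly. `return_type_fields` is a name invented for this example:

```python
def return_type_fields(describe_op, describe_ty, name, operation_type):
    """Look up an operation's return type, then drill into that type's
    fields — the typical two-step discovery flow."""
    op = describe_op(name, operation_type)
    return describe_ty(op["return_type"])
```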
### Keeping the index in sync

The indexes are regenerated automatically by the CI workflow whenever a new schema file is added. To regenerate locally after adding a schema or modifying `mcp_indexer.py`:

```shell
PYTHONPATH=src python3 -m rsc.mcp_indexer
```

Then commit the updated `mcp_index.json` and `mcp_types.json`.
## Token caching

Tokens are cached in `~/.rsc/token_cache_<hash>.json` (0600 permissions) and reused until 60 seconds before expiry, so short-lived callers like cron jobs or Telegraf scripts won't re-authenticate on every run. Cache files are keyed by a hash of the RSC URL, so multiple accounts on the same machine stay isolated.
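The caching behavior can be sketched as three small steps: derive a per-URL file name, read a token only if it has more than 60 seconds left, and write with owner-only permissions. The function names and the SHA-256 hash scheme here are this sketch's assumptions; the package's actual hash and file layout are not documented beyond the `token_cache_<hash>.json` pattern:

```python
import hashlib
import json
import os
import time


def cache_path(url, cache_dir="~/.rsc"):
    """Derive a per-account cache file name from a hash of the RSC URL."""
    digest = hashlib.sha256(url.encode()).hexdigest()[:16]
    return os.path.join(os.path.expanduser(cache_dir), f"token_cache_{digest}.json")


def load_cached_token(path, now=None, margin=60):
    """Return a cached token only if it stays valid for at least `margin`
    seconds; otherwise return None so the caller re-authenticates."""
    now = time.time() if now is None else now
    try:
        with open(path) as f:
            entry = json.load(f)
    except (OSError, ValueError):
        return None
    if entry.get("expires_at", 0) - now < margin:
        return None
    return entry.get("token")


def store_token(path, token, expires_at):
    """Persist a token with 0600 (owner read/write only) permissions."""
    os.makedirs(os.path.dirname(path), exist_ok=True)
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, "w") as f:
        json.dump({"token": token, "expires_at": expires_at}, f)
```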