
Governance Engine by zeb labs

Project description

Z-GRC - Z Governance, Risk, Control Engine


Enterprise-grade governance engine for Large Language Model applications. Provides automatic interception, policy enforcement, quota management, and comprehensive observability across multiple LLM providers with zero code changes.

Installation

uv add z-grc

Or with auto-instrumentation:

uv add "z-grc[auto-instrument]"

Quick Start

import zgrc
import boto3
import json

# Initialize GRC
zgrc.init(api_key="your-zgrc-api-key")

# Use your LLM SDK normally - GRC handles everything
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="us.anthropic.claude-sonnet-4-5-20250929-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Hello!"}]
    })
)

# Z-GRC automatically:
# - Validates quota before requests
# - Tracks token usage
# - Enforces policies
# - Sends telemetry (traces, metrics, logs)
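The `invoke_model` response body is a stream of JSON bytes in the standard Bedrock Anthropic shape. A small helper (hypothetical name, independent of Z-GRC) sketches how to pull out the assistant text:

```python
import json

def extract_text(raw: bytes) -> str:
    """Extract the text blocks from a Bedrock Anthropic response payload."""
    payload = json.loads(raw)
    # Anthropic-on-Bedrock responses carry a list of content blocks
    return "".join(block["text"] for block in payload["content"] if "text" in block)

# With the client above:
# print(extract_text(response["body"].read()))
```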

Features

Zero-Code Integration

Drop-in solution requiring only zgrc.init(). Works with existing code without modifications.

Auto-Discovery

Automatically detects and intercepts installed LLM SDKs:

  • AWS Bedrock (boto3)
  • Anthropic (coming soon)
  • OpenAI (coming soon)
  • Azure OpenAI (coming soon)

Policy Enforcement

Real-time quota validation and token limit enforcement. Blocks requests when quota is exceeded.

from zgrc.utils import QuotaExceededException

try:
    response = client.invoke_model(...)
except QuotaExceededException as e:
    print(f"Quota exceeded: {e.used}/{e.limit} tokens")

Auto-Instrumentation

Optional automatic instrumentation for HTTP clients, web frameworks, databases, and more:

zgrc.init(
    api_key="your-zgrc-api-key",
    auto_instrument=True,
    app_name="my-app",
    environment="production"
)

Framework Agnostic

Works with vanilla SDKs and popular frameworks:

# PydanticAI
from pydantic_ai import Agent
agent = Agent("bedrock")
result = await agent.run("Your prompt")  # await requires an async context

# LangChain
from langchain_aws import ChatBedrock
llm = ChatBedrock(model_id="...")
response = llm.invoke("Your prompt")

# Strands Agents
from strands_agents import Agent
agent = Agent(provider="bedrock")
response = agent.execute("Your prompt")

Streaming Support

Fully supports streaming responses with automatic token tracking:

response = client.converse_stream(
    modelId="...",
    messages=[{"role": "user", "content": [{"text": "Tell me a story"}]}]
)

for event in response["stream"]:
    if "contentBlockDelta" in event:
        print(event["contentBlockDelta"]["delta"]["text"], end="")
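Converse streams also end with a metadata event carrying token usage (per the Bedrock Converse API; the accumulator below is a sketch, not Z-GRC's own tracking code):

```python
def collect_stream(events):
    """Accumulate streamed text deltas and the final token-usage metadata."""
    chunks, usage = [], None
    for event in events:
        if "contentBlockDelta" in event:
            chunks.append(event["contentBlockDelta"]["delta"]["text"])
        elif "metadata" in event:
            # The closing event carries the usage counters Z-GRC also records
            usage = event["metadata"].get("usage")
    return "".join(chunks), usage

# story, usage = collect_stream(response["stream"])
```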

Configuration

zgrc.init(
    api_key: str,                  # Your Z-GRC API key (required)
    auto_instrument: bool = False, # Enable auto-instrumentation
    app_name: str = None,          # Application name for telemetry
    environment: str = None,       # Environment (dev/staging/prod)
    log_level: int = logging.ERROR # Z-GRC internal log level
)
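In practice these values usually come from the environment rather than being hard-coded. A hedged sketch (the `ZGRC_*` variable names are illustrative, not an official Z-GRC contract):

```python
import logging
import os

def init_kwargs_from_env(environ=os.environ):
    """Build zgrc.init(...) keyword arguments from the process environment.

    The ZGRC_* names below are illustrative conventions, not part of Z-GRC.
    """
    return {
        "api_key": environ["ZGRC_API_KEY"],  # required, so a KeyError here is deliberate
        "auto_instrument": environ.get("ZGRC_AUTO_INSTRUMENT", "") == "1",
        "app_name": environ.get("ZGRC_APP_NAME"),
        "environment": environ.get("ZGRC_ENVIRONMENT"),
        "log_level": getattr(logging, environ.get("ZGRC_LOG_LEVEL", "ERROR")),
    }

# zgrc.init(**init_kwargs_from_env())
```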

Building Executables

Build standalone executables with PyInstaller:

macOS/Linux

./build.sh

Output: dist/z-grc-proxy-macos-arm64 or dist/z-grc-proxy-linux-x86_64

Windows

build.bat

Output: dist/z-grc-proxy-windows-x64.exe

Test Executable

# macOS/Linux
./dist/z-grc-proxy-macos-arm64 --api-key=zgrc_xxx

# Windows
dist\z-grc-proxy-windows-x64.exe --api-key=zgrc_xxx

Note: Certificates auto-generate in ~/.mitmproxy/ on first run. Users must set HTTPS_PROXY and NODE_EXTRA_CA_CERTS environment variables.
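For a Python process routed through the proxy, the variables the note mentions can be set before any client is created. The certificate path is mitmproxy's default; the proxy address and the `AWS_CA_BUNDLE` variable (so boto3 trusts the proxy's TLS certificate) are assumptions about your setup:

```python
import os
from pathlib import Path

# mitmproxy writes its CA certificate here on first run
ca_cert = Path.home() / ".mitmproxy" / "mitmproxy-ca-cert.pem"

os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8080"  # proxy address is an assumption
os.environ["NODE_EXTRA_CA_CERTS"] = str(ca_cert)     # for child Node.js processes
os.environ["AWS_CA_BUNDLE"] = str(ca_cert)           # so boto3/botocore trust the proxy
```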

Installing Executor

macOS / Linux

curl -fsSL https://raw.githubusercontent.com/zeb-ai/z-grc/main/install.sh | bash

Windows (PowerShell)

irm https://raw.githubusercontent.com/zeb-ai/z-grc/main/install.ps1 | iex

Project details


Download files

Download the file for your platform.

Source Distribution

z_grc-0.0.17.tar.gz (25.6 kB)


Built Distribution


z_grc-0.0.17-py3-none-any.whl (35.8 kB)


File details

Details for the file z_grc-0.0.17.tar.gz.

File metadata

  • File name: z_grc-0.0.17.tar.gz
  • Size: 25.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for z_grc-0.0.17.tar.gz:

  • SHA256: f6c8f854088ebc3951f8f8d4375fd1f6c0c3e6437afc03fe3cc5c9fbc4b88c88
  • MD5: 57b6c2ce62ea3ba1a1769ea6a20dbb79
  • BLAKE2b-256: b4c927b0311f537bcaea2e621588ddd01d5e97846d8c222f6549dbdce3c9c79e

Verify a downloaded file against these hashes before installing it.
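The SHA256 digest above can be checked locally with nothing but the standard library; a minimal sketch:

```python
import hashlib

def verify_sha256(data: bytes, expected_hex: str) -> bool:
    """Return True if the SHA256 digest of data matches the published hex digest."""
    return hashlib.sha256(data).hexdigest() == expected_hex.lower()

# from pathlib import Path
# verify_sha256(Path("z_grc-0.0.17.tar.gz").read_bytes(),
#               "f6c8f854088ebc3951f8f8d4375fd1f6c0c3e6437afc03fe3cc5c9fbc4b88c88")
```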

Provenance

The following attestation bundles were made for z_grc-0.0.17.tar.gz:

Publisher: publish.yml on zeb-ai/z-grc

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file z_grc-0.0.17-py3-none-any.whl.

File metadata

  • File name: z_grc-0.0.17-py3-none-any.whl
  • Size: 35.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for z_grc-0.0.17-py3-none-any.whl:

  • SHA256: 8419a86a6f0d1519ed19b54bcc0c52a21a7ad16eb38843c8c9cc664c9de86eea
  • MD5: 211c0c965446e138877fafcf03126967
  • BLAKE2b-256: 36d9891b88864fe23c3c13a86fd070f9a0269982b49a59cd27cb1bf5596288fd


Provenance

The following attestation bundles were made for z_grc-0.0.17-py3-none-any.whl:

Publisher: publish.yml on zeb-ai/z-grc

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
