GLACIS SDK for Python - AI Compliance Attestation

Glacis Python SDK

Tamper-proof audit logs for AI systems - without exposing sensitive data.

Note: Online attestation (Merkle tree proofs via the Glacis API) is not yet available. The SDK currently supports offline mode with local Ed25519 signing. Online mode will be enabled in a future release.

The Problem

You need to prove what your AI did for compliance, audits, or legal discovery. But sending prompts and responses to a logging service exposes sensitive data (PII, PHI, trade secrets).

The Solution

Glacis creates cryptographic proofs of AI operations. Your data stays local - only a SHA-256 hash is sent for witnessing.

Your Infrastructure              Glacis Log
┌─────────────────────┐         ┌─────────────────────┐
│ "Pt. Frodo Baggins  │         │ 7a3f8b2c...         │
│  has diabetes"      │  ──→    │ (64-char hash)      │
│                     │         │ + timestamp         │
│ (data stays here)   │         │ + Merkle proof      │
└─────────────────────┘         └─────────────────────┘

Later, you can prove the hash matches your local records without revealing the data itself.
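The hash on the right can be reproduced with nothing but the standard library. A minimal sketch, using sorted-keys compact JSON as a stand-in for the RFC 8785 canonicalization the SDK actually uses (the real canonicalization also normalizes numbers and strings):

```python
import hashlib
import json

def payload_hash(payload: dict) -> str:
    """Hash a payload the way a witness would see it: canonical JSON -> SHA-256.

    Sorted keys + compact separators approximates RFC 8785 for simple payloads.
    """
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

record = {"prompt": "Pt. Frodo Baggins has diabetes"}
digest = payload_hash(record)
print(digest)       # 64 hex chars; safe to share, computationally infeasible to invert
print(len(digest))  # 64
```

Because the canonicalization is deterministic, anyone holding the original record can recompute the digest and compare it to the witnessed value.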

Installation

pip install "glacis[openai]"      # For OpenAI
pip install "glacis[anthropic]"   # For Anthropic
pip install "glacis[gemini]"      # For Google Gemini
pip install "glacis[controls]"    # Add PII detection + jailbreak detection
pip install "glacis[all]"         # Everything

(Quoting the extras keeps the command portable across shells such as zsh, which otherwise expands the brackets.)

Quick Start

Option 1: Drop-in Wrapper (Recommended)

Replace your OpenAI/Anthropic/Gemini client with a wrapped version. Every API call is automatically attested.

import os
from glacis.integrations.openai import attested_openai, get_last_receipt

# Create wrapped client (offline mode - no Glacis account needed)
client = attested_openai(
    openai_api_key="sk-...",
    offline=True,
    signing_seed=os.urandom(32),
)

# Use exactly like the normal OpenAI client
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)

# Get the attestation receipt
receipt = get_last_receipt()
print(f"Attestation ID: {receipt.id}")

Works the same for Anthropic:

from glacis.integrations.anthropic import attested_anthropic, get_last_receipt

client = attested_anthropic(
    anthropic_api_key="sk-ant-...",
    offline=True,
    signing_seed=os.urandom(32),
)

And for Google Gemini:

from glacis.integrations.gemini import attested_gemini, get_last_receipt

client = attested_gemini(
    gemini_api_key="...",
    offline=True,
    signing_seed=os.urandom(32),
)

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Hello!"
)

receipt = get_last_receipt()

Option 2: Direct API

For custom attestations — providers other than OpenAI/Anthropic/Gemini, or when you want manual control over what gets attested:

import os
from glacis import Glacis

glacis = Glacis(mode="offline", signing_seed=os.urandom(32))

receipt = glacis.attest(
    service_id="my-ai-app",
    operation_type="inference",
    input={"prompt": "Summarize this..."},
    output={"response": "The document..."},
)

Adding Controls

Detect PII/PHI and prompt injection attempts in your AI calls. Enable controls via a YAML config file:

client = attested_openai(
    openai_api_key="sk-...",
    offline=True,
    signing_seed=os.urandom(32),
    config_path="glacis.yaml",  # Enable controls via config
)

Control results (detections, scores, latencies) are included in the attestation record.

Configuration File

For persistent settings, create glacis.yaml:

version: "1.3"

attestation:
  offline: true
  service_id: my-ai-service

controls:
  input:
    pii_phi:
      enabled: true
      mode: fast            # "fast" (regex) or "full" (Presidio NER)
      if_detected: flag     # "forward", "flag", or "block"

    jailbreak:
      enabled: true
      threshold: 0.5
      if_detected: block

sampling:
  l1_rate: 1.0   # Evidence collection rate (0.0-1.0)
  l2_rate: 0.0   # Deep inspection rate (must be <= l1_rate)

Then:

client = attested_openai(
    openai_api_key="sk-...",
    config_path="glacis.yaml",
)

Retrieving Evidence

Full payloads are stored locally for audits:

from glacis.integrations.openai import get_last_receipt, get_evidence

receipt = get_last_receipt()
evidence = get_evidence(receipt.id)

print(evidence["input"])                  # Original input
print(evidence["output"])                 # Original output
print(evidence["control_plane_results"])  # PII/jailbreak results

Evidence is stored locally using SQLite (default) or JSONL backends.
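A JSONL backend implies one JSON object per line, which makes the evidence store easy to scan with standard tools. A hedged sketch of looking up a record by receipt id — the file name and the `id`/`input` field names here are assumptions for illustration, not the SDK's actual schema:

```python
import json
from pathlib import Path

def find_evidence(path: Path, receipt_id: str):
    """Scan a JSONL file line by line for the record with the given id."""
    with path.open() as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == receipt_id:
                return record
    return None

# Demo: write two records, then look one up.
p = Path("evidence_demo.jsonl")
p.write_text(
    json.dumps({"id": "att_1", "input": {"prompt": "hi"}}) + "\n"
    + json.dumps({"id": "att_2", "input": {"prompt": "bye"}}) + "\n"
)
print(find_evidence(p, "att_2"))
```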

Online vs Offline Mode

Online mode is not yet available. Use offline mode for now.

Feature                   Offline                  Online (coming soon)
Requires Glacis account   No                       Yes
Signing                   Local Ed25519            Glacis witness
Third-party verifiable    No                       Yes (Merkle proofs)
Use case                  Development, production  Audits, regulatory

What Gets Sent to Glacis?

Data                        Sent?
Your prompts                No (hash only)
Model responses             No (hash only)
API keys                    No
service_id, operation_type  Yes
Timestamps                  Yes

CLI

Verify a receipt:

python -m glacis verify receipt.json

Security

  • Hashing: SHA-256 with RFC 8785 canonical JSON (cross-runtime compatible)
  • Signing: Ed25519 via PyNaCl (libsodium)
  • Online mode: Merkle tree inclusion proofs (RFC 6962)
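An RFC 6962 inclusion proof can be checked with nothing but SHA-256. The sketch below implements the generic RFC 6962 hashing (0x00 prefix for leaves, 0x01 for interior nodes) and verifies a leaf against a hand-built two-leaf tree; it illustrates the mechanism, not the Glacis wire format:

```python
import hashlib

def leaf_hash(data: bytes) -> bytes:
    # RFC 6962 domain separation: 0x00 prefix for leaf hashes
    return hashlib.sha256(b"\x00" + data).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    # 0x01 prefix for interior nodes
    return hashlib.sha256(b"\x01" + left + right).digest()

def verify_inclusion(leaf: bytes, path, root: bytes) -> bool:
    """Recompute the root from a leaf and an audit path of (side, sibling) pairs."""
    h = leaf_hash(leaf)
    for side, sibling in path:
        h = node_hash(sibling, h) if side == "left" else node_hash(h, sibling)
    return h == root

# Build a two-leaf tree by hand and verify leaf 0 against its root.
leaves = [b"receipt-0", b"receipt-1"]
root = node_hash(leaf_hash(leaves[0]), leaf_hash(leaves[1]))
print(verify_inclusion(leaves[0], [("right", leaf_hash(leaves[1]))], root))  # True
```

The audit path grows logarithmically with the number of leaves, so a witness can prove inclusion of one attestation without revealing any of the others.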

License

Apache 2.0
