
SecureAI

SecureAI is a Python library that adds RATLS (Remote Attestation TLS) support to popular HTTP clients, including the OpenAI SDK and httpx. It enables applications to cryptographically verify that AI inference and API services are running inside Trusted Execution Environments (TEEs) like Intel TDX before sending them sensitive data.

The library transparently extends existing clients: specify which hostnames require TEE attestation, and SecureAI performs the verification automatically during the TLS handshake.

Installation

SecureAI uses uv for dependency management and building.

You can install SecureAI from PyPI or build it from source.

# From PyPI
uv pip install secureai

# From source
git clone https://github.com/concrete-security/secureai.git
cd secureai
uv build # to build the wheel
uv pip install dist/secureai-*.whl

What is RATLS?

Remote Attestation TLS (RATLS) extends standard TLS with hardware-based attestation to verify that a server is running inside a Trusted Execution Environment (TEE) like Intel TDX. This ensures your data is processed in a secure, isolated environment.

RATLS provides cryptographic proof that the client is communicating with the correct server identity (as defined in the TLS certificate) and that the server is running inside a TEE.

Context

The TEE server maintains an event log that records all significant operations, including TLS certificate renewals. When the server generates a new certificate (using keys created inside the TEE that never leave it), it appends an event to this log containing the certificate hash.

The TEE hardware uses these event logs to compute Runtime Measurements (RTMRs) - cryptographic hashes that reflect the entire state and history of the TEE. These RTMRs are included in the attestation quote and can be verified by clients to ensure the TEE is running expected software with the expected certificate.
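The replay step can be sketched as follows. This is illustrative only: it assumes the SHA-384 extension rule used by TDX RTMRs (a 48-byte register, extended as `RTMR = SHA384(RTMR || event_digest)`); the exact digest format of a real event log may differ.

```python
import hashlib


def replay_rtmr(event_digests: list[bytes]) -> bytes:
    """Replay an event log into a single RTMR value.

    TDX RTMRs are 48-byte SHA-384 registers that start at zero and are
    extended with each event digest in order. A verifier replays the
    event log and checks the result against the RTMR in the quote.
    """
    rtmr = b"\x00" * 48  # registers start zeroed
    for digest in event_digests:
        rtmr = hashlib.sha384(rtmr + digest).digest()
    return rtmr


# A certificate-renewal event contributes the hash of the new TLS cert
# (placeholder bytes here, not a real certificate):
cert_der = b"...DER-encoded certificate bytes..."
cert_event_digest = hashlib.sha384(cert_der).digest()

expected_rtmr3 = replay_rtmr([cert_event_digest])
```

If the replayed value matches the RTMR3 reported in the quote, the client knows the event log (and therefore the certificate hash it contains) is the one the TEE hardware actually measured.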

How it works

  • Pre-RATLS Setup (happens before client connects): Server adds a certificate event to its event log whenever it renews its TLS certificate. This updates the RTMR3 register using the new certificate hash.
  • TLS Connection: Client establishes a standard TLS connection with the server and retrieves the TLS certificate.
  • Quote Request: Client sends random challenge data (64 bytes) and requests a cryptographic quote from the TEE.
  • Quote Response: Server generates and returns a quote signed by the TEE hardware, along with metadata:
    • Quote contains: random challenge data, runtime measurements (RTMRs)
    • Metadata contains: event log with TLS certificate hash
  • Verification: Client verifies:
    • Quote signature using the DCAP library
    • TLS certificate (current session) matches the one in the event log
    • Event log correctly produces the RTMRs by replaying all events
    • TEE measurements match expected values
    • TCB status is UpToDate
Client                                    Server (TEE)
  |----- Pre-RATLS ---------------------------|
  |                                           |
  |                                           |
  |                                     0. Append new event to the
  |                                        event log with cert hash
  |                                        when doing cert renewal
  |                                           |
  |                                           |
  |----- RATLS -------------------------------|
  |                                           |
  | 1. TLS Handshake                          |
  |<=========================================>|
  |   (Get TLS certificate)                   |
  |                                           |
  | 2. POST /tdx_quote                        |
  |    { report_data: <random_64_bytes> }     |
  |------------------------------------------>|
  |                                           |
  |                                     3. Generate Quote + Metadata
  |                                      - Quote includes report_data, RTMRs, ...
  |                                      - Metadata includes event_log containing cert hash
  |                                      - Sign with TEE hardware key
  |                                      - Other measurements
  | 4. Quote Response                         |
  |<------------------------------------------|
  |                                           |
  | 5. Client Verification                    |
  |  - Verify quote signature (DCAP)          |
  |  - Check report_data matches challenge    |
  |  - Check cert hash in event_log matches   |
  |  - Verify event_log by replaying RTMRs    |
  |  - Verify TCB status is UpToDate          |
  |  - Verify runtime measurements            |
  |                                           |
  | 6. Regular HTTPS requests                 |
  |    (if verification passed)               |
  |<=========================================>|
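The challenge in step 2 can be sketched as below. This is a hedged illustration, not SecureAI's internal code: the `report_data_hex` field name follows the Server Requirements section, and the hostname is a placeholder.

```python
import secrets


def build_quote_request() -> dict:
    """Build the JSON body for the quote request (step 2 above)."""
    # 64 random bytes act as a freshness challenge: the server must
    # embed them in the signed quote, preventing replay of old quotes.
    report_data = secrets.token_bytes(64)
    return {"report_data_hex": report_data.hex()}


payload = build_quote_request()

# The request itself would then look like (endpoint per the diagram):
# import httpx
# resp = httpx.post("https://your-tee-server.com/tdx_quote", json=payload)
# quote_hex = resp.json()["quote"]
# event_log = resp.json()["event_log"]
```

Because the challenge is random per connection, a quote that verifies against it can only have been produced by the TEE after the challenge was sent.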

Server Requirements

For a server to support RATLS verification with SecureAI, it must:

  1. Run inside a TEE: Currently only Intel TDX is supported
  2. Maintain an event log: Record all significant operations including TLS certificate renewals with certificate hashes
  3. Provide a quote endpoint: Expose an HTTP POST endpoint (default: /tdx_quote) that:
    • Accepts JSON with report_data_hex field (64 bytes hex-encoded)
    • Returns a JSON response containing:
      • quote: TDX quote (hex-encoded) signed by TEE hardware
      • event_log: JSON array of events used to compute RTMRs
  4. Generate TLS certificates inside the TEE: Private keys must never leave the TEE
  5. Update RTMRs on certificate renewal: Append certificate hash events to the log, updating RTMR3

See the server implementation reference for a complete example.
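The endpoint contract in requirement 3 can be sketched as a plain handler function. Everything here is a stand-in: `generate_tdx_quote` and `load_event_log` are hypothetical placeholders for the platform's real quote-generation and event-log calls, not part of any actual API.

```python
import hashlib


# Hypothetical stand-ins; a real server would call into the TDX
# driver for the quote and read its persisted event log.
def generate_tdx_quote(report_data: bytes) -> bytes:
    return b"\x04\x00" + report_data  # fake quote embedding the challenge


def load_event_log() -> list[dict]:
    cert_hash = hashlib.sha384(b"...cert DER...").hexdigest()
    return [{"event": "cert-renewal", "digest": cert_hash}]


def handle_tdx_quote(body: dict) -> dict:
    """Handle a POST to the quote endpoint, per the shapes above."""
    report_data = bytes.fromhex(body["report_data_hex"])
    if len(report_data) != 64:
        raise ValueError("report_data_hex must decode to exactly 64 bytes")
    return {
        "quote": generate_tdx_quote(report_data).hex(),
        "event_log": load_event_log(),
    }
```

The essential contract is that the returned quote embeds the caller's challenge and that the event log is the same one the TEE measured into RTMR3.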

Examples

Set the environment variable DEBUG_RATLS=true to enable debug logging.

DstackTDXVerifier

DstackTDXVerifier is used to verify that a server is running inside a TDX TEE managed by Dstack. It verifies the TDX quote and optionally checks that the TEE is running a specific docker-compose configuration.

from secureai import httpx
from secureai.verifiers import DstackTDXVerifier

# Option 1: Verify TEE with runtime verification disabled (NOT RECOMMENDED)
# Only verifies that the server is running in a TEE, but not what application it runs
verifier = DstackTDXVerifier(disable_runtime_verification=True)

# Option 2: Verify TEE is running a specific docker-compose (RECOMMENDED)
# This ensures the TEE is running exactly the application you expect
with open("docker-compose.yml", "r") as f:
    docker_compose_content = f.read()

verifier = DstackTDXVerifier(docker_compose_file=docker_compose_content)

# Option 3: Provide a full app_compose configuration
app_compose = {
    "docker_compose_file": docker_compose_content,
    "manifest_version": 2,
    # ... other configuration options
}
verifier = DstackTDXVerifier(app_compose=app_compose)

# Use with httpx client
with httpx.Client(
    ratls_verifier_per_hostname={
        "your-tee-server.com": verifier
    }
) as client:
    response = client.get("https://your-tee-server.com/api")

OpenAI Client with RATLS

from secureai import OpenAI
from secureai.verifiers import DstackTDXVerifier

with open("your-docker-compose.yml", "r") as f:
    docker_compose_content = f.read()

verifier = DstackTDXVerifier(docker_compose_file=docker_compose_content)

client = OpenAI(ratls_verifier_per_hostname={"vllm.concrete-security.com": verifier})

HTTP Client with RATLS

from secureai import httpx
from secureai.verifiers import DstackTDXVerifier

with open("your-docker-compose.yml", "r") as f:
    docker_compose_content = f.read()

verifier = DstackTDXVerifier(docker_compose_file=docker_compose_content)

with httpx.Client(ratls_verifier_per_hostname={"vllm.concrete-security.com": verifier}) as client:
    # No RATLS: hostname is not in ratls_verifier_per_hostname
    response = client.get("https://httpbin.org/get")
    print(f"Response status: {response.status_code}")
    
    # Uses RATLS
    response = client.get("https://vllm.concrete-security.com/health")
    print(f"Response status: {response.status_code}")
    
    # Does not trigger another verification: the connection is still open
    response = client.get("https://vllm.concrete-security.com/v1/models")
    print(f"Response status: {response.status_code}")

Development

SecureAI uses uv for dependency management and building. There is also a Makefile with basic recipes.

Running Tests

# Run all tests
uv run pytest

or

make test # or test-coverage

Code Quality

# Format code
uv run ruff format

# Lint code
uv run ruff check

# For import order specifically
uv run ruff check --select I

or

make qa-all # or qa-all-fix

Build

# Build a wheel from source
uv build

Hardware Support

Only Intel TDX is supported at the moment.
