
Client SDK for BlindAI Confidential Inference Server

Reason this release was yanked:

Broken grpcio (thanks google!)

Project description

BlindAI Client

BlindAI Client is a Python library for building client applications for BlindAI Server, Mithril Security's confidential inference server.

If you wish to know more about BlindAI, please have a look at the project's GitHub repository.

Installation

Using pip

$ pip install blindai
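
Note that pip ignores yanked releases during normal dependency resolution; since this release (0.5.1) was yanked, it can only be installed by pinning the exact version:

$ pip install blindai==0.5.1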

Usage

Uploading a model

from transformers import DistilBertTokenizer, DistilBertForSequenceClassification
import blindai
from blindai import ModelDatumType
import torch

# Create dummy input for export
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
sentence = "I love AI and privacy!"
inputs = tokenizer(sentence, padding="max_length", max_length=8, return_tensors="pt")[
    "input_ids"
]

# Export the model to ONNX (torch.onnx.export expects an nn.Module, not a tokenizer)
model = DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased")
torch.onnx.export(
    model,
    inputs,
    "./distilbert-base-uncased.onnx",
    export_params=True,
    opset_version=11,
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch_size"}, "output": {0: "batch_size"}},
)

# Launch client
with blindai.connect(
    addr="localhost", policy="policy.toml", certificate="host_server.pem"
) as client:
    response = client.upload_model(
        model="./distilbert-base-uncased.onnx",
        shape=inputs.shape,
        dtype=ModelDatumType.I64,
    )
    model_id = response.model_id
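
The model_id returned by upload_model is what the inference example below refers to. If the upload and the inference run as separate scripts, one simple way to carry it over is to write it to a file (a plain-Python convenience, not part of the BlindAI API):

# Persist the model_id so the inference script below can reuse it.
# This file-based handoff is just an illustration, not a BlindAI feature.
with open("model_id.txt", "w") as f:
    f.write(model_id)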

Uploading data

from transformers import DistilBertTokenizer
import blindai

# Prepare the inputs
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
sentence = "I love AI and privacy!"
inputs = tokenizer(sentence, padding="max_length", max_length=8)["input_ids"]

# Load the client
with blindai.connect(
    addr="localhost", policy="policy.toml", certificate="host_server.pem"
) as client:
    # Get prediction (model_id is the identifier returned by upload_model above)
    response = client.run_model(model_id, inputs)
print(response.output)
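
If the ONNX file was exported from a sequence-classification head as above, response.output carries the raw logits. Below is a minimal post-processing sketch, assuming the client returns them as a flat list of floats (check your client version for the exact response layout):

import torch

# Assumed layout: response.output is a flat list of per-class logits.
logits = torch.tensor(response.output)
predicted_class = int(torch.argmax(logits))
print(predicted_class)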

In order to connect to the BlindAI server, the client needs to acquire the following files from the server:

  • policy.toml: the enclave security policy that defines which enclave is trusted (not needed in simulation mode).

  • host_server.pem: the TLS certificate for the connection to the untrusted (app) part of the server.

Simulation mode bypasses the process of requesting and checking the attestation, and ignores the TLS certificate.

Before you run an example, make sure you have retrieved the policy.toml and host_server.pem files generated on the server side (not needed in simulation mode); the two connection modes are sketched below.
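
For reference, here is how the two connection modes compare. The simulation flag shown below is an assumption based on the description above; check the client documentation for the exact parameter name in your version:

import blindai

# Hardware mode: the enclave is checked against policy.toml and the
# connection goes through the attested TLS channel.
with blindai.connect(
    addr="localhost", policy="policy.toml", certificate="host_server.pem"
) as client:
    ...

# Simulation mode (parameter name assumed): attestation and the TLS
# certificate are skipped entirely -- for local development only.
with blindai.connect(addr="localhost", simulation=True) as client:
    ...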

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

License

This project is licensed under the Apache 2.0 License.

The project uses the "Intel SGX DCAP Quote Validation Library" for attestation verification. See the Intel SGX DCAP Quote Validation Library License.

Download files


Source Distributions

No source distribution files are available for this release.

Built Distributions

blindai-0.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)

Uploaded: CPython 3.10, manylinux: glibc 2.17+, x86-64

blindai-0.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)

Uploaded: CPython 3.9, manylinux: glibc 2.17+, x86-64

blindai-0.5.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)

Uploaded: CPython 3.8, manylinux: glibc 2.17+, x86-64

blindai-0.5.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)

Uploaded: CPython 3.7m, manylinux: glibc 2.17+, x86-64

blindai-0.5.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)

Uploaded: CPython 3.6m, manylinux: glibc 2.17+, x86-64
