
Client SDK for BlindAI Confidential Inference Server


BlindAI Client

BlindAI Client is a Python library for building client applications for BlindAI Server, Mithril Security's confidential inference server.

If you wish to know more about BlindAI, please have a look at the project's GitHub repository.


Using pip

$ pip install blindai


Uploading a model

from transformers import DistilBertTokenizer, DistilBertForSequenceClassification
import torch
from blindai.client import BlindAiClient, ModelDatumType

# Get pretrained model
model = DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased")

# Create dummy input for export
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
sentence = "I love AI and privacy!"
inputs = tokenizer(sentence, padding="max_length", max_length=8, return_tensors="pt")["input_ids"]

# Export the model
torch.onnx.export(
	model, inputs, "./distilbert-base-uncased.onnx",
	export_params=True, opset_version=11,
	input_names=['input'], output_names=['output'],
	dynamic_axes={'input': {0: 'batch_size'},
	              'output': {0: 'batch_size'}})

# Launch client
client = BlindAiClient()
client.upload_model(model="./distilbert-base-uncased.onnx", shape=inputs.shape, dtype=ModelDatumType.I64)
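The `dynamic_axes` argument in the export above marks dimension 0 ("batch_size") of both the input and the output as variable, so the exported model accepts any batch size while the sequence length stays fixed at the value used for the dummy input (8 here). A minimal sketch of what that signature implies (the helper below is illustrative only, not part of blindai):

```python
def matches_export_signature(shape, fixed_seq_len=8):
    # Illustrative check, not part of the blindai API: the batch dimension
    # is free, but the sequence dimension must equal the length used for
    # the dummy input at export time.
    return len(shape) == 2 and shape[1] == fixed_seq_len

print(matches_export_signature((1, 8)))   # -> True  (export-time shape)
print(matches_export_signature((32, 8)))  # -> True  (any batch size)
print(matches_export_signature((1, 16)))  # -> False (sequence length is fixed)
```

This is why `shape=inputs.shape` is passed to `upload_model`: the server needs to know the tensor shape the model was exported with.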

Uploading data

from transformers import DistilBertTokenizer
from blindai.client import BlindAiClient

tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")

sentence = "Hello, my dog is cute"
inputs = tokenizer(sentence, padding="max_length", max_length=8)["input_ids"]

client = BlindAiClient()

# Send the data and run the inference
response = client.run_model(inputs)
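The tokenizer call above pads (or truncates) every sentence to exactly 8 token ids, so the input always matches the shape the model was uploaded with. A pure-Python sketch of that behavior (the helper and the token-id values are illustrative, not produced by blindai):

```python
def pad_to_max_length(token_ids, max_length=8, pad_id=0):
    # Illustrative helper mimicking padding="max_length", max_length=8:
    # append pad tokens up to max_length, then truncate to max_length.
    return (token_ids + [pad_id] * max_length)[:max_length]

# Hypothetical token ids for a short sentence:
print(pad_to_max_length([101, 7592, 102]))
# -> [101, 7592, 102, 0, 0, 0, 0, 0]
```

If the data sent to `run_model` does not match the shape declared at upload time, the inference request will be rejected, so the same tokenizer settings must be used on both sides.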

In order to connect to the BlindAI server, the client needs to acquire the following files from the server:

  • policy.toml: the enclave security policy that defines which enclave is trusted (not needed in simulation mode).

  • host_server.pem: the TLS certificate for the connection to the untrusted (app) part of the server.

Simulation mode bypasses the process of requesting and checking the attestation, and ignores the TLS certificate.

Before you run an example, make sure to get policy.toml and host_server.pem (unless you are using simulation mode); both are generated on the server side.
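Since a missing policy or certificate file is a common setup mistake, a client application may want to check for both files before attempting a hardware-mode connection. A minimal sketch (the helper below is illustrative only, not part of the blindai API):

```python
from pathlib import Path

def missing_attestation_files(policy="policy.toml", certificate="host_server.pem"):
    # Illustrative helper, not part of blindai: return the list of
    # server-generated files that cannot be found on the client side.
    # In simulation mode neither file is required.
    return [f for f in (policy, certificate) if not Path(f).exists()]

missing = missing_attestation_files()
if missing:
    print("Missing files (copy them from the server):", missing)
```

In hardware mode the client uses policy.toml to verify the attestation and host_server.pem to establish the TLS connection, so both must be present before connecting.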


Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.


This project is licensed under Apache 2.0 License.

The project uses the "Intel SGX DCAP Quote Validation Library" for attestation verification; see the Intel SGX DCAP Quote Validation Library License.


Download files


Source Distributions

No source distribution files are available for this release.

Built Distributions

  • blindai-0.2.0.post1-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.9 MB): CPython 3.10, manylinux glibc 2.12+ x86-64

  • blindai-0.2.0.post1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.0 MB): CPython 3.9, manylinux glibc 2.12+ x86-64

  • blindai-0.2.0.post1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.9 MB): CPython 3.8, manylinux glibc 2.12+ x86-64

  • blindai-0.2.0.post1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.0 MB): CPython 3.7m, manylinux glibc 2.12+ x86-64

  • blindai-0.2.0.post1-cp36-cp36m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.0 MB): CPython 3.6m, manylinux glibc 2.12+ x86-64
