
BlindAI Client

Client SDK for blindai confidential inference server.

Installation

Using pip

$ pip install blindai

Usage

Uploading a model

from blindai.client import BlindAiClient, ModelDatumType

# Create the connection
client = BlindAiClient()
client.connect_server(
    "localhost",
    policy="policy.toml",
    certificate="host_server.pem",
    simulation=False
)

# Upload the model to the server
response = client.upload_model(
    model="./mobilenetv2-7.onnx",
    shape=(1, 3, 224, 224),
    datum=ModelDatumType.F32
)

Uploading data

from blindai.client import BlindAiClient
from PIL import Image
import numpy as np

# Create the connection
client = BlindAiClient()
client.connect_server(
    "localhost",
    policy="policy.toml",
    certificate="host_server.pem",
    simulation=False
)

# Load the image and convert it to float32, matching the F32 datum type
# declared when the model was uploaded
image = Image.open("grace_hopper.jpg").resize((224, 224))
a = np.asarray(image, dtype=np.float32)

# Send data for inference
result = client.run_model(a.flatten())

In order to connect to the BlindAI server, the client needs to acquire the following files from the server:

  • policy.toml: the enclave security policy that defines which enclave is trusted (not needed in simulation mode).

  • host_server.pem: the TLS certificate for the connection to the untrusted (app) part of the server.

Simulation mode bypasses the process of requesting and checking the attestation.

Usage examples can be found in the tutorial folder.

Before you run an example, make sure to get policy.toml (if you are not using simulation mode) and host_server.pem, both of which are generated on the server side.

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

License

This project is licensed under the Apache 2.0 License. The project uses the "Intel SGX DCAP Quote Validation Library" for attestation verification; see the Intel SGX DCAP Quote Validation Library license.

Download files
Source Distributions

No source distribution files are available for this release.

Built Distributions

blindai-0.1.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.9 MB)

Uploaded: CPython 3.10, manylinux: glibc 2.12+, x86-64

blindai-0.1.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.9 MB)

Uploaded: CPython 3.9, manylinux: glibc 2.12+, x86-64

blindai-0.1.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.9 MB)

Uploaded: CPython 3.8, manylinux: glibc 2.12+, x86-64

blindai-0.1.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.0 MB)

Uploaded: CPython 3.7m, manylinux: glibc 2.12+, x86-64

blindai-0.1.0-cp36-cp36m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.0 MB)

Uploaded: CPython 3.6m, manylinux: glibc 2.12+, x86-64
