
SDK for calling/providing an encrypted inference service

Project description

Encrypted inference SDK

This package allows calling a remotely deployed inference service that operates on encrypted input data. The data is encrypted using homomorphic encryption with the Microsoft SEAL library. This SDK handles the calls to Microsoft SEAL that create a secret key and encrypt the input data, returning an encrypted query as a byte array. The byte array must then be communicated to a server component, where a model is deployed. The server performs the prediction on the ciphertext, obtaining an encrypted result, which can be decrypted by this SDK with the secret key. Throughout this process, the server is guaranteed never to learn the client's query.
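The round trip looks roughly like the sketch below. It does not use this SDK's own classes; instead it uses TenSEAL, a separate open-source Python wrapper around Microsoft SEAL, as a stand-in so the example is self-contained, and the feature vector and model weights are made-up illustrative values. The pattern is the same one described above: encrypt locally, ship a byte array, compute on ciphertext server-side, and decrypt the returned bytes with the locally held secret key.

```python
# Illustrative sketch only -- TenSEAL (pip install tenseal) stands in for this
# SDK's API. The feature vector and model weights are made-up example values.
import tenseal as ts

# --- Client: create a SEAL context; the secret key never leaves this side ---
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()  # rotations, needed for dot products

features = [5.1, 3.5, 1.4, 0.2]                                  # example input
encrypted_query = ts.ckks_vector(context, features).serialize()  # bytes to send

# --- Server: sees only ciphertext bytes plus a *public* copy of the context ---
public_context = context.copy()
public_context.make_context_public()            # drops the secret key
query = ts.ckks_vector_from(public_context, encrypted_query)
weights = [0.4, -0.2, 0.1, 0.3]                 # example linear model
encrypted_result = query.dot(weights).serialize()  # prediction stays encrypted

# --- Client: decrypt the result with the locally held secret key ---
prediction = ts.ckks_vector_from(context, encrypted_result).decrypt()
print(prediction)
```

In the SDK's setting the two halves run on different machines: only the encrypted byte arrays cross the wire to the server component where the model is deployed, and decryption happens back on the client.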

Project details


Release history

This version

0.9

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

encrypted-inference-0.9.tar.gz (899.5 kB)

Built Distribution

encrypted_inference-0.9-py3-none-any.whl (905.4 kB)
