Supersayan
A high‑performance Python library that integrates Fully Homomorphic Encryption (FHE) into PyTorch workflows. Supersayan provides:
- A client/server path to offload selected layers to a remote FHE server (hybrid mode).
- Zero‑copy tensor bridges across PyTorch, NumPy, CuPy, and Julia.
The Julia backend is managed via juliacall and initialized automatically on first import, or explicitly with a helper CLI.
Installation
From PyPI:
pip install supersayan
# or with uv
uv add supersayan
Default behavior:
- After install, importing `supersayan` triggers a one-time Julia backend setup automatically. No extra step is needed for typical users.
Optional (CI/Docker or troubleshooting):
# Manually initialize the Julia backend if you want explicit control
supersayan-setup
Advanced control (not recommended for regular users):
- To prevent network access during import in CI images, set `SUPERSAYAN_SKIP_JULIA_SETUP=1` and run `supersayan-setup` in a controlled build step.
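The skip flag can be honored with a plain environment check before any setup work runs. A minimal sketch of that gating logic (hypothetical helper names, not the package's actual internals):

```python
import os

_setup_done = False

def ensure_julia_backend(setup_fn, env=None):
    """Run setup_fn once, unless SUPERSAYAN_SKIP_JULIA_SETUP=1 is set.

    Returns True if setup ran, False if it was skipped or already done.
    """
    global _setup_done
    env = os.environ if env is None else env
    if _setup_done or env.get("SUPERSAYAN_SKIP_JULIA_SETUP") == "1":
        return False
    setup_fn()  # e.g. download and precompile the Julia backend
    _setup_done = True
    return True
```

With this shape, a CI image can export the variable at import time and later invoke the setup function directly in a controlled build step.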
Notes:
- GPU support uses CuPy when available; code runs on CPU otherwise.
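This "CuPy when available, NumPy otherwise" behavior is a common pattern; a hedged sketch of how such a backend can be selected (illustrative only, not Supersayan's actual code):

```python
def get_array_module():
    """Prefer CuPy when it is installed and a CUDA device is visible;
    otherwise fall back to NumPy, which shares the same array API."""
    try:
        import cupy as xp
        xp.cuda.runtime.getDeviceCount()  # raises if no usable GPU
    except Exception:
        import numpy as xp
    return xp

xp = get_array_module()
a = xp.ones(3, dtype=xp.float32)  # identical call on either backend
```

Because CuPy mirrors the NumPy API, downstream code can stay backend-agnostic by only using the returned module.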
Hybrid Remote Inference
Run the TCP server:
python scripts/run_server.py --host 127.0.0.1 --port 8000 --models-dir /tmp/supersayan/models
Use the client to execute only selected layers remotely in FHE while keeping other ops local:
import torch
import torch.nn as nn

from supersayan.core.types import SupersayanTensor
from supersayan.remote import SupersayanClient

class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 4, 3, padding=1)
        self.fc = nn.Linear(4 * 28 * 28, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.view(x.size(0), -1))

cnn = SmallCNN().eval()

client = SupersayanClient(
    server_url="127.0.0.1:8000",
    torch_model=cnn,
    fhe_modules=[nn.Conv2d, nn.Linear],  # offload these layers
)

x = SupersayanTensor(torch.randn(1, 1, 28, 28))
y = client(x)
For a runnable example of the TCP server, see scripts/run_server.py.
Supported offloaded layers: nn.Linear, nn.Conv2d.
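Conceptually, the client walks the forward pass and routes layers whose type appears in `fhe_modules` through the remote FHE path, while everything else runs locally in plaintext. A torch-free stand-in sketch of that dispatch (all names hypothetical, and the remote call is stubbed out):

```python
class Linear:
    """Toy stand-in for nn.Linear: multiply each input by a scalar weight."""
    def __init__(self, w):
        self.w = w
    def __call__(self, xs):
        return [x * self.w for x in xs]

class ReLU:
    """Toy stand-in for a local, non-offloaded activation."""
    def __call__(self, xs):
        return [max(x, 0.0) for x in xs]

def remote_fhe_call(layer, xs):
    # Stand-in for: encrypt inputs, send them to the server, evaluate the
    # layer under FHE server-side, and decrypt the result on the client.
    return layer(xs)

def run_hybrid(layers, offload_types, xs):
    for layer in layers:
        if isinstance(layer, offload_types):
            xs = remote_fhe_call(layer, xs)  # offloaded layer types
        else:
            xs = layer(xs)                   # everything else stays local
    return xs
```

For example, `run_hybrid([Linear(2.0), ReLU()], (Linear,), [-1.0, 3.0])` sends only the `Linear` stage through the stubbed remote path and yields `[0.0, 6.0]`.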
Tensors and Interop
- `SupersayanTensor(data, device=...)` accepts `torch.Tensor`, `numpy.ndarray`, or `cupy.ndarray` and preserves the float32 dtype.
- Helpers: `SupersayanTensor.zeros(...)`, `ones(...)`, `randn(...)`.
- Interop: `.to_numpy()`, `.to_dlpack()`, and zero-copy conversion to Julia via `.to_julia()`.
Project Layout
- `src/supersayan/`: core, layers, remote client/server, Julia backend
- `scripts/`: runnable examples (`run_server.py`)
Troubleshooting
- If Julia setup fails on first import, run `supersayan-setup` manually.
- In CI or headless environments, set `SUPERSAYAN_SKIP_JULIA_SETUP=1` during import and run `supersayan-setup` explicitly in a build step.
File details
Details for the file supersayan-0.1.2.tar.gz.
File metadata
- Download URL: supersayan-0.1.2.tar.gz
- Upload date:
- Size: 22.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.6.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `168351adee5f90f14cc544e05c060ef9b65934ebe736d3f60c61ab4b2c6c4f21` |
| MD5 | `0a222d9e4c4a57404ef883d1e64cf621` |
| BLAKE2b-256 | `a240433eb0ca8bd4fa5236ccc44b5c0547be476a819cba1b6ab251c93edf1b45` |
File details
Details for the file supersayan-0.1.2-py3-none-any.whl.
File metadata
- Download URL: supersayan-0.1.2-py3-none-any.whl
- Upload date:
- Size: 26.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.6.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `9ec7d47ab6987079f938d649ccb502bcd5561c4d62032094a05ce6411c58c063` |
| MD5 | `9bd66fe0c12e74f8fb46a03ea17f43fd` |
| BLAKE2b-256 | `a4292dfcf7719ed296c5b90fddb78252210d9832f9d57836540e9569ae6865da` |