# NISQOptML

NISQ-Optimized Machine Learning: a noise-resilient Quantum Machine Learning library with Federated Learning and Differential Privacy support.
## Overview

This library provides a comprehensive framework for building and training quantum neural networks on NISQ (Noisy Intermediate-Scale Quantum) devices. It addresses key challenges in quantum machine learning, including noise mitigation, distributed execution, and privacy-preserving federated learning.
## Key Features

- 🚀 **Quantum Neural Networks (QNN)** - Build and train variational quantum circuits
- 🛡️ **Noise Mitigation** - Automatic error mitigation and zero-noise extrapolation techniques
- 🌐 **Federated Learning** - Train models across distributed nodes with FedAvg
- 🔒 **Differential Privacy** - Add Gaussian noise for privacy-preserving training
- 📊 **Explainability** - Sensitivity analysis and parameter visualization tools
- ⚡ **Distributed Execution** - MPI-based distributed quantum circuit execution
## Installation

```shell
pip install nisqoptml
```
## Quick Start

### Basic QNN Training

```python
import nisqoptml as nq
import numpy as np

# Create a QNN model
model = nq.QNN(layers=2, qubits=4)

# Prepare training data
X = np.random.rand(10, 4)
y = np.random.rand(10, 4)

# Train the model
model.fit(X, y, epochs=10)
```
### With Error Mitigation

```python
# Enable automatic error mitigation
model = nq.QNN(layers=2, qubits=4, mitigation='auto')
model.fit(X, y, epochs=10)

# Or use zero-noise extrapolation
model = nq.QNN(layers=2, qubits=4, mitigation='zne')
model.fit(X, y, epochs=10)
```
### Federated Learning with Differential Privacy

```python
# Create a federated QNN with DP noise
model = nq.QNN(
    layers=2,
    qubits=4,
    federated=True,
    dp_sigma=0.01,  # DP noise standard deviation
    mitigation='auto'
)

# Train with federated learning
local_X = np.random.rand(5, 4)
local_y = np.random.rand(5, 4)
model.federated_fit(local_X, local_y, rounds=5, local_epochs=2)
```
### Distributed Execution

```python
# Enable distributed circuit execution (requires MPI)
model = nq.QNN(layers=2, qubits=4, distributed=True)
model.fit(X, y, epochs=10, shots=1000)
```
### Explainability

```python
# Analyze parameter sensitivity
model = nq.QNN(layers=2, qubits=4)
model.fit(X, y, epochs=10)

# Generate a sensitivity analysis plot
result = model.explain(noise_impact=True)
print(result)  # Saves sensitivity.png
```
## API Reference

### QNN Class

```python
QNN(
    layers=2,           # Number of quantum layers
    qubits=4,           # Number of qubits
    mitigation='none',  # 'none', 'auto', or 'zne'
    distributed=False,  # Enable distributed execution
    federated=False,    # Enable federated learning
    dp_sigma=0.0        # DP noise standard deviation
)
```
### Methods

- `fit(X, y, epochs=10, shots=1000)` - Train the model on the provided data
- `federated_fit(local_X, local_y, rounds=5, local_epochs=2)` - Federated training with DP support
- `explain(noise_impact=False)` - Generate a sensitivity analysis visualization
## Requirements
- Python >= 3.8
- PennyLane >= 0.30
- Qiskit >= 1.0
- PyTorch >= 2.0
- NumPy, SymPy, Matplotlib
- MPI4Py (optional, for distributed execution)
## Architecture

The library implements a variational quantum circuit architecture with:

- **Data Encoding**: Rotation gates (RY) for classical data embedding
- **Variational Layers**: Parameterized rotations (RZ) with entangling gates (CNOT)
- **Measurement**: Pauli-Z expectation values for all qubits
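To make the circuit structure concrete, here is a tiny 2-qubit statevector simulation of that pattern in plain NumPy — an illustrative toy, not the library's internal implementation (the feature values `x` and parameters `w` are made up):

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(theta):
    """Single-qubit Z-rotation gate."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

x = np.array([0.3, 0.7])   # classical input features (illustrative)
w = np.array([0.1, 0.2])   # variational parameters (illustrative)

state = np.zeros(4, dtype=complex)
state[0] = 1.0                                   # start in |00>
state = np.kron(ry(x[0]), ry(x[1])) @ state      # data encoding (RY)
state = np.kron(rz(w[0]), rz(w[1])) @ state      # variational layer (RZ)
state = CNOT @ state                             # entangling gate

# Pauli-Z expectation values for each qubit
Z, I = np.diag([1.0, -1.0]), np.eye(2)
z0 = np.real(state.conj() @ np.kron(Z, I) @ state)  # <Z> on qubit 0
z1 = np.real(state.conj() @ np.kron(I, Z) @ state)  # <Z> on qubit 1
```

Training a QNN then amounts to adjusting `w` (via a classical optimizer) so that the measured expectation values match the targets.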
### Error Mitigation

Two mitigation strategies are available:

- **Auto**: Neural network-based error prediction and correction
- **ZNE**: Zero-noise extrapolation using simple scaling factors
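The core idea behind ZNE can be sketched in a few lines of NumPy: run the circuit at artificially amplified noise levels, fit a model to the measured expectation values, and extrapolate back to the zero-noise limit. The scale factors and expectation values below are made up for illustration, and a linear fit is just one choice of extrapolant:

```python
import numpy as np

# Hypothetical <Z> estimates measured at amplified noise scales
scales = np.array([1.0, 2.0, 3.0])     # noise scale factors
values = np.array([0.80, 0.65, 0.50])  # noisy expectation values

# Fit E(s) = a*s + b and read off the zero-noise limit E(0) = b
a, b = np.polyfit(scales, values, 1)
zero_noise_estimate = b
```

For this toy data the fit is exact, so the extrapolated value is 0.95 — larger in magnitude than any of the noisy measurements, as expected.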
### Federated Learning

Implements the Federated Averaging (FedAvg) algorithm:

- Local training on each node
- Optional differential privacy noise injection
- Global parameter aggregation via MPI
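A minimal NumPy sketch of one FedAvg round with Gaussian DP noise may help fix the idea — the local "training" here is a toy stand-in, and all names are illustrative rather than the library's API:

```python
import numpy as np

rng = np.random.default_rng(42)
dp_sigma = 0.01                  # DP noise standard deviation
global_params = np.zeros(6)      # shared variational parameters

def local_update(params):
    """Toy stand-in for a node's local training step."""
    grad = rng.standard_normal(params.shape)    # pretend gradient
    updated = params - 0.1 * grad               # local SGD step
    # Gaussian mechanism: perturb the update before sharing it
    return updated + rng.normal(0.0, dp_sigma, params.shape)

# One federated round: each node trains locally, the server averages
client_updates = [local_update(global_params) for _ in range(3)]
global_params = np.mean(client_updates, axis=0)
```

In the library the aggregation step runs over MPI ranks instead of a local list comprehension, but the average-of-noised-updates structure is the same.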
## License

Apache License 2.0
Citation
If you use this library in your research, please cite:
@software{nisqoptml,
author = {Venkata Vikhyat Choppa},
title = {NISQOptML: NISQ-Optimized Machine Learning},
version = {0.1.0},
year = {2025},
url = {https://github.com/yourusername/nisqoptml}
}
## Author

Venkata Vikhyat Choppa

- Email: venkata_choppa@outlook.com
- GitHub: https://github.com/vikhyatchoppa18