
NISQOptML

NISQ-Optimized Machine Learning: A noise-resilient Quantum Machine Learning library with Federated Learning and Differential Privacy support.

Overview

This library provides a framework for building and training quantum neural networks on NISQ (Noisy Intermediate-Scale Quantum) devices. It addresses challenges in quantum machine learning including noise mitigation, distributed execution, and privacy-preserving federated learning.

Key Features

  • Quantum Neural Networks (QNN) - Build and train variational quantum circuits
  • Noise Mitigation - Automatic error mitigation and zero-noise extrapolation techniques
  • Federated Learning - Train models across distributed nodes with FedAvg
  • Differential Privacy - Add Gaussian noise for privacy-preserving training
  • Explainability - Sensitivity analysis and parameter visualization tools
  • Distributed Execution - MPI-based distributed quantum circuit execution

Installation

pip install nisqoptml

Quick Start

Basic QNN Training

import nisqoptml as nq
import numpy as np

# Create a QNN model
model = nq.QNN(layers=2, qubits=4)

# Prepare training data
X = np.random.rand(10, 4)
y = np.random.rand(10, 4)

# Train the model
model.fit(X, y, epochs=10)

With Error Mitigation

# Enable automatic error mitigation
model = nq.QNN(layers=2, qubits=4, mitigation='auto')
model.fit(X, y, epochs=10)

# Or use zero-noise extrapolation
model = nq.QNN(layers=2, qubits=4, mitigation='zne')
model.fit(X, y, epochs=10)

Federated Learning with Differential Privacy

# Create federated QNN with DP noise
model = nq.QNN(
    layers=2, 
    qubits=4, 
    federated=True, 
    dp_sigma=0.01,  # DP noise standard deviation
    mitigation='auto'
)

# Train with federated learning
local_X = np.random.rand(5, 4)
local_y = np.random.rand(5, 4)
model.federated_fit(local_X, local_y, rounds=5, local_epochs=2)

Distributed Execution

# Enable distributed circuit execution (requires MPI)
model = nq.QNN(layers=2, qubits=4, distributed=True)
model.fit(X, y, epochs=10, shots=1000)

Explainability

# Analyze parameter sensitivity
model = nq.QNN(layers=2, qubits=4)
model.fit(X, y, epochs=10)

# Generate sensitivity analysis plot
result = model.explain(noise_impact=True)  # saves sensitivity.png
print(result)

API Reference

QNN Class

QNN(
    layers=2,              # Number of quantum layers
    qubits=4,              # Number of qubits
    mitigation='none',     # 'none', 'auto', or 'zne'
    distributed=False,     # Enable distributed execution
    federated=False,       # Enable federated learning
    dp_sigma=0.0          # DP noise standard deviation
)

Methods

  • fit(X, y, epochs=10, shots=1000) - Train the model on provided data
  • federated_fit(local_X, local_y, rounds=5, local_epochs=2) - Federated training with DP support
  • explain(noise_impact=False) - Generate sensitivity analysis visualization

Requirements

  • Python >= 3.8
  • PennyLane >= 0.30
  • Qiskit >= 1.0
  • PyTorch >= 2.0
  • NumPy, SymPy, Matplotlib
  • MPI4Py (optional, for distributed execution)

Architecture

The library implements a variational quantum circuit architecture with:

  • Data Encoding: Rotation gates (RY) for classical data embedding
  • Variational Layers: Parameterized rotations (RZ) with entangling gates (CNOT)
  • Measurement: Pauli-Z expectation values for all qubits
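The library's own circuit construction is not shown here, but the architecture above can be illustrated with a minimal NumPy statevector sketch (my own code, not the library's internals): RY data encoding, one layer of RZ rotations, a CNOT entangler, and Pauli-Z expectation values, for two qubits.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(phi):
    return np.diag([np.exp(-1j * phi / 2), np.exp(1j * phi / 2)])

def op_on(gate, qubit, n):
    """Embed a single-qubit gate at position `qubit` in an n-qubit register."""
    full = np.array([[1.0 + 0j]])
    for q in range(n):
        full = np.kron(full, gate if q == qubit else I2)
    return full

def circuit(x, params, n=2):
    """RY data encoding, one RZ variational layer, a CNOT entangler,
    then Pauli-Z expectation values on every qubit (n=2 only)."""
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0                          # start in |00>
    for q in range(n):                    # data encoding
        psi = op_on(ry(x[q]), q, n) @ psi
    for q in range(n):                    # variational rotations
        psi = op_on(rz(params[q]), q, n) @ psi
    psi = CNOT @ psi                      # entangling gate
    return [float(np.real(psi.conj() @ op_on(Z, q, n) @ psi)) for q in range(n)]
```

With all angles at zero the state stays |00> and both expectations are +1; encoding π on qubit 0 flips it, and the CNOT propagates the flip to qubit 1.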

Error Mitigation

Two mitigation strategies are available:

  1. Auto: Neural network-based error prediction and correction
  2. ZNE: Zero-noise extrapolation using simple scaling factors
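The README does not spell out the extrapolation procedure, but linear ZNE in its simplest form fits expectation values measured at amplified noise scale factors and takes the zero-noise intercept. A sketch (`zne_linear` is an illustrative helper, not part of the library's API):

```python
import numpy as np

def zne_linear(scale_factors, noisy_values):
    """Fit E(lam) ~ a * lam + b to expectation values measured at
    amplified noise levels and return the zero-noise intercept b."""
    a, b = np.polyfit(scale_factors, noisy_values, 1)
    return b
```

For example, expectation values 0.8, 0.7, 0.6 at noise scales 1, 2, 3 extrapolate to 0.9 at zero noise.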

Federated Learning

Implements the Federated Averaging (FedAvg) algorithm:

  • Local training on each node
  • Optional differential privacy noise injection
  • Global parameter aggregation via MPI
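A single aggregation round of the scheme above can be sketched in one process (the library itself aggregates over MPI; `fedavg_round` is a hypothetical helper, not the library's API):

```python
import numpy as np

def fedavg_round(local_params, dp_sigma=0.0, rng=None):
    """FedAvg step: add Gaussian differential-privacy noise to each
    client's parameter vector, then average across clients."""
    if rng is None:
        rng = np.random.default_rng(0)
    noisy = [p + rng.normal(0.0, dp_sigma, size=np.shape(p))
             for p in local_params]
    return np.mean(noisy, axis=0)
```

With `dp_sigma=0.0` this reduces to a plain parameter average; larger values trade accuracy for stronger privacy.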

License

Apache License 2.0

Citation

If you use this library in your research, please cite:

@software{nisqoptml,
  author = {Venkata Vikhyat Choppa},
  title = {NISQOptML: NISQ-Optimized Machine Learning},
  version = {0.1.0},
  year = {2025},
  url = {https://github.com/VikhyatChoppa18/nisqoptml}
}

Author

Venkata Vikhyat Choppa
Email: venkata_choppa@outlook.com
GitHub: https://github.com/vikhyatchoppa18

Acknowledgments

Built with PennyLane, Qiskit, and PyTorch.
