
QOptLib: Quantum-Inspired Optimizers

A framework of quantum-inspired classical optimizers for machine learning.

Installation

pip install qoptlib
# or for development
pip install -e .

Quick Start

NumPy (Core Optimizers)

import numpy as np
from quantopt import QuantumAdam

params = [np.random.randn(10, 5).astype(np.float32)]
optimizer = QuantumAdam(params, lr=0.001, quantum_strength=0.2)

def get_grads():
    return [np.random.randn(10, 5).astype(np.float32) * 0.1]

for _ in range(100):
    optimizer.step(get_grads)
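
The callable passed to step must return gradients with the same shapes as params. The example above uses random placeholder gradients; below is a hedged sketch of a real gradient function for a least-squares objective, assuming (as the quick start suggests) that the optimizer updates the arrays in params in place. The data X and y are made up for illustration.

import numpy as np
from quantopt import QuantumAdam

# Toy regression data (illustrative only)
X = np.random.randn(100, 10).astype(np.float32)
y = np.random.randn(100, 5).astype(np.float32)

params = [np.zeros((10, 5), dtype=np.float32)]
optimizer = QuantumAdam(params, lr=0.001, quantum_strength=0.2)

def get_grads():
    # Gradient of the mean squared error (1/n) * ||X @ W - y||^2 w.r.t. W
    W = params[0]
    residual = X @ W - y
    return [(2.0 / len(X)) * (X.T @ residual)]

for _ in range(100):
    optimizer.step(get_grads)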

PyTorch (via Adapter)

from quantopt.quantopt import QuantumAdam
from quantopt.adapters import TorchAdapter
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 64),
    nn.ReLU(),
    nn.Linear(64, 1)
)

adapter = TorchAdapter(model)
optimizer = QuantumAdam(lr=0.001, quantum_strength=0.2)

# Run optimization
best_weights, best_loss = adapter.optimize(
    optimizer,
    loss_fn=lambda out, tgt: ((out - tgt) ** 2).mean(),
    dataset=torch.utils.data.TensorDataset(
        torch.randn(100, 10),
        torch.randn(100, 1)
    ),
    iterations=50
)
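
optimize returns the best flat weight vector found along with its loss. Assuming set_weights (part of the adapter API listed below) accepts that vector, the best weights can be restored in one call:

adapter.set_weights(best_weights)  # load the best weights back into the model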

TensorFlow (via Adapter)

from quantopt.quantopt import QuantumAdam
from quantopt.adapters import TensorFlowAdapter
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
    tf.keras.layers.Dense(1)
])

adapter = TensorFlowAdapter(model)
optimizer = QuantumAdam(lr=0.001, quantum_strength=0.2)

best_weights, best_loss = adapter.optimize(
    optimizer,
    loss_fn=lambda y_true, y_pred: tf.keras.losses.mse(y_true, y_pred),
    dataset=tf.data.Dataset.from_tensor_slices((
        tf.random.normal((100, 10)),
        tf.random.normal((100, 1))
    )).batch(32),
    iterations=50
)

Structure

quantopt/
├── quantopt/              # CORE: NumPy implementations
│   ├── __init__.py        # Exports: QuantumSGD, QuantumAdam, QuantumRMSprop, QuantumTunneling
│   ├── base.py            # BaseOptimizer
│   ├── sgd.py             # QuantumSGD
│   ├── adam.py            # QuantumAdam
│   ├── rmsprop.py         # QuantumRMSprop
│   └── tunneling.py       # QuantumTunneling
│
├── adapters/              # Framework bridges
│   ├── __init__.py        # Lazy imports (see sketch below)
│   ├── torch.py           # TorchAdapter
│   └── tensorflow.py      # TensorFlowAdapter
│
├── benchmarks/            # Test functions
├── examples/              # Usage examples
└── tests/                 # Test suite
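
The tree above notes that adapters/__init__.py uses lazy imports, so torch and tensorflow are only imported when the corresponding adapter is actually requested. A hypothetical sketch of that pattern (PEP 562 module-level __getattr__; not necessarily the library's exact code):

# adapters/__init__.py -- hypothetical sketch of the lazy-import pattern
def __getattr__(name):
    if name == "TorchAdapter":
        from .torch import TorchAdapter  # torch is imported only on first access
        return TorchAdapter
    if name == "TensorFlowAdapter":
        from .tensorflow import TensorFlowAdapter  # same for tensorflow
        return TensorFlowAdapter
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")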

Core Optimizers

| Optimizer        | Description             |
|------------------|-------------------------|
| QuantumSGD       | SGD with quantum noise  |
| QuantumAdam      | Adam with quantum phase |
| QuantumRMSprop   | RMSprop with tunneling  |
| QuantumTunneling | Escapes local minima    |
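
To make the descriptions concrete: the common idea is to add a stochastic "quantum" term to an otherwise classical update so the iterate can jump out of shallow basins. A conceptual sketch of a noise-augmented SGD step (illustrative only; the library's actual update rules may differ):

import numpy as np

def quantum_sgd_step(w, grad, lr=0.01, quantum_strength=0.1, rng=None):
    # Classical SGD step plus a noise term scaled by quantum_strength;
    # the noise lets the iterate escape shallow local minima.
    if rng is None:
        rng = np.random.default_rng()
    noise = quantum_strength * lr * rng.standard_normal(w.shape)
    return w - lr * grad + noise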

Parameters

| Parameter        | Description          | Default            |
|------------------|----------------------|--------------------|
| lr               | Learning rate        | optimizer-specific |
| quantum_strength | Quantum effect (0-1) | 0.1                |
| momentum         | Momentum factor      | 0.0                |
| weight_decay     | L2 regularization    | 0.0                |
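
For example, assuming all four parameters can be combined on a single optimizer (values below are illustrative):

import numpy as np
from quantopt.quantopt import QuantumSGD

params = [np.zeros((10, 5), dtype=np.float32)]
opt = QuantumSGD(
    params,
    lr=0.01,               # learning rate
    quantum_strength=0.1,  # quantum effect, in [0, 1]
    momentum=0.9,          # momentum factor
    weight_decay=1e-4,     # L2 regularization
)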

Adapters

TorchAdapter

from quantopt.adapters import TorchAdapter
from quantopt.quantopt import QuantumAdam

adapter = TorchAdapter(model)
optimizer = QuantumAdam(lr=0.001)

best_weights, best_loss = adapter.optimize(
    optimizer,
    loss_fn,
    dataset,
    iterations=100,
    verbose=True
)

TensorFlowAdapter

from quantopt.adapters import TensorFlowAdapter
from quantopt.quantopt import QuantumAdam

adapter = TensorFlowAdapter(model)
optimizer = QuantumAdam(lr=0.001)

best_weights, best_loss = adapter.optimize(
    optimizer,
    loss_fn,
    dataset,
    iterations=100
)

API

Core (NumPy)

from quantopt import QuantumAdam
from quantopt.quantopt import QuantumSGD, QuantumRMSprop, QuantumTunneling

# All have:
opt.step(grad_fn)       # Take a step
opt.state_dict()        # Get state
opt.load_state_dict(d)  # Load state
opt.get_lr()            # Get learning rate
opt.set_lr(lr)          # Set learning rate
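
These methods compose into checkpointing and simple learning-rate schedules. A hedged sketch using only the calls listed above (the gradient function is a stand-in):

import numpy as np
from quantopt import QuantumAdam

params = [np.zeros((10, 5), dtype=np.float32)]
opt = QuantumAdam(params, lr=0.001)

def grad_fn():
    return [0.1 * np.random.randn(10, 5).astype(np.float32)]

# Decay the learning rate by 5% per epoch via get_lr / set_lr
for epoch in range(10):
    opt.set_lr(opt.get_lr() * 0.95)
    for _ in range(20):
        opt.step(grad_fn)

# Checkpoint and restore the optimizer state
state = opt.state_dict()
opt.load_state_dict(state)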

Adapters

from quantopt.adapters import TorchAdapter, TensorFlowAdapter

adapter = Adapter(model)
adapter.get_weights()          # Get flat weights
adapter.set_weights(w)         # Set weights
adapter.get_bounds()           # Get bounds
adapter.optimize(optimizer, loss_fn, dataset)
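
A hedged sketch of the weight round-trip, assuming get_weights returns a flat NumPy vector as the comments suggest:

import torch.nn as nn
from quantopt.adapters import TorchAdapter

model = nn.Linear(10, 1)
adapter = TorchAdapter(model)

w = adapter.get_weights()      # flat weight vector
bounds = adapter.get_bounds()  # search bounds for the optimizer
adapter.set_weights(w)         # write the (possibly modified) vector back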

Tests

pytest tests/ -v

License

MIT

Download files

Source Distribution

qoptlib-0.1.0.tar.gz (21.7 kB)

Built Distribution

qoptlib-0.1.0-py3-none-any.whl (25.0 kB)

File details

Details for the file qoptlib-0.1.0.tar.gz.

File metadata

  • Download URL: qoptlib-0.1.0.tar.gz
  • Size: 21.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for qoptlib-0.1.0.tar.gz

| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | 959323915b0ca5ee143c8b8e79019d397659019901157ebd4a1337cd8118be2b |
| MD5         | 5ed2f8099b7007d163f70b8af3d4ff03                                 |
| BLAKE2b-256 | 75adad3793cdbf792074d1654b1e474ac6b4fcdfb4caace680352d873b8c1032 |

Provenance

The following attestation bundles were made for qoptlib-0.1.0.tar.gz:

Publisher: python-publish.yml on rehanguha/qoptlib


File details

Details for the file qoptlib-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: qoptlib-0.1.0-py3-none-any.whl
  • Size: 25.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for qoptlib-0.1.0-py3-none-any.whl

| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | 0fc71efd56297768d0b17a924d414155eb487d44331c7fe54a0546723c6990f9 |
| MD5         | 9b0a8cf3e2e548995cdfb5823d560e4a                                 |
| BLAKE2b-256 | 4feb0038fc3e6510e930bca7e58b092a35e41a5b1f5ce60dc0f6d852c0384076 |

Provenance

The following attestation bundles were made for qoptlib-0.1.0-py3-none-any.whl:

Publisher: python-publish.yml on rehanguha/qoptlib

