
Generative quantum circuits

Project description

genQC · Generative Quantum Circuits


Code repository for generating quantum circuits with diffusion models.

Generation process for 4-qubit QFT.

📰 News

The codebase

The code in this repo allows sampling from pre-trained diffusion models and includes our pipeline to fine-tune and train models from scratch. Pre-trained weights can be found on [Hugging Face] and are downloaded automatically by our code (see the minimal example). For the text CLIP model weights we use the OpenCLIP library, which downloads (and caches) the CLIP model on first use of our pipeline. If you prefer reading documentation rather than notebooks or code, see the project page under [Documentation].
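The weights described above follow a download-once-then-cache pattern: the first call hits the network, every later call reuses the local copy. The sketch below illustrates that pattern in isolation — the function names are hypothetical, and genQC itself delegates this to huggingface_hub and OpenCLIP rather than implementing it by hand:

```python
from pathlib import Path
import tempfile

def fetch_weights(name: str, cache_dir: Path, download) -> Path:
    """Return a local path for `name`, downloading only on first use."""
    path = Path(cache_dir) / name
    if not path.exists():
        path.write_bytes(download(name))  # network hit happens only once
    return path

# Demo with a stub "download" so the example is self-contained.
calls = []
def fake_download(name):
    calls.append(name)
    return b"weights-bytes"

with tempfile.TemporaryDirectory() as d:
    p1 = fetch_weights("model.safetensors", Path(d), fake_download)
    p2 = fetch_weights("model.safetensors", Path(d), fake_download)
    print(len(calls), p1 == p2)  # → 1 True: downloaded once, same cached path
```

This is why the first run of the pipeline is slower than subsequent runs: both the diffusion-model weights and the CLIP weights are fetched and cached on first use.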

This repo includes:

  1. genQC/ — the full release of our diffusion pipeline.
  2. src/examples/ — examples and tutorials showing how to use the library.
  3. src/ — the source notebooks for nbdev.

Examples

Minimal example

A minimal example that compiles the 4-qubit Quantum Fourier transform (QFT) unitary using parameterized circuits.

import torch
from genQC.pipeline.multimodal_diffusion_pipeline import MultimodalDiffusionPipeline_ParametrizedCompilation
from genQC.inference.sampling import generate_compilation_tensors, decode_tensors_to_backend
from genQC.utils.misc_utils import infer_torch_device, set_seed
from genQC.platform.tokenizer.circuits_tokenizer import CircuitTokenizer
from genQC.benchmark.bench_compilation import SpecialUnitaries
from genQC.platform.simulation import Simulator, CircuitBackendType

device = infer_torch_device()

pipeline = MultimodalDiffusionPipeline_ParametrizedCompilation.from_pretrained(
                                repo_id="Floki00/cirdit_multimodal_compile_3to5qubit_v1.1", 
                                device=device)

pipeline.scheduler.set_timesteps(40) 
pipeline.scheduler_w.set_timesteps(40) 

pipeline.g_h, pipeline.g_w = 0.3, 0.1
pipeline.lambda_h, pipeline.lambda_w = 1.0, 0.35

U = SpecialUnitaries.QFT(num_qubits=4).to(torch.complex64)

out_tensor, params = generate_compilation_tensors(pipeline, 
                          prompt="Compile 4 qubits using: ['h', 'cx', 'ccx', 'swap', 'rx', 'ry', 'rz', 'cp']", 
                          U=U, 
                          samples=8, 
                          system_size=5, 
                          num_of_qubits=4, 
                          max_gates=32)
vocabulary = {g:i+1 for i, g in enumerate(pipeline.gate_pool)}
tokenizer  = CircuitTokenizer(vocabulary)
simulator  = Simulator(CircuitBackendType.CUDAQ)

qc_list, _ = decode_tensors_to_backend(simulator, tokenizer, out_tensor, params)

simulator.backend.draw(qc_list[0], num_qubits=4)
                                                                        »
q0 : ────────────────────────●────────────────────●───────────●───────╳─»
                             │         ╭───╮      │      ╭────┴─────╮ │ »
q1 : ────────────────────────┼───────╳─┤ h ├──────┼──────┤ r1(1.25) ├─┼─»
                             │       │ ╰───╯╭─────┴─────╮╰──────────╯ │ »
q2 : ───────────●────────────┼───────╳──────┤ r1(6.253) ├─────────────┼─»
     ╭───╮╭─────┴─────╮╭─────┴─────╮        ╰───────────╯             │ »
q3 : ┤ h ├┤ r1(1.571) ├┤ r1(7.191) ├──────────────────────────────────╳─»
     ╰───╯╰───────────╯╰───────────╯                                    »

################################################################################

─────────────────

─────●───────────
╭────┴─────╮╭───╮
┤ r1(1.59) ├┤ h ├
╰──┬───┬───╯╰───╯
───┤ h ├─────────
   ╰───╯
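To judge how well a sampled circuit compiles the target, one common metric is the normalized trace fidelity |Tr(U†V)| / 2ⁿ between the target unitary U and the circuit's unitary V, which equals 1 exactly when they agree up to a global phase. This check is not part of the example above; the sketch below is an illustrative, genQC-independent implementation in pure Python for the 4-qubit QFT target:

```python
import cmath

def qft_unitary(n):
    """DFT matrix on 2**n amplitudes: U[j][k] = exp(2*pi*i*j*k/N) / sqrt(N)."""
    N = 2 ** n
    w = cmath.exp(2j * cmath.pi / N)
    return [[w ** (j * k) / N ** 0.5 for k in range(N)] for j in range(N)]

def trace_fidelity(U, V):
    """|Tr(U^dagger V)| / N — 1.0 iff V matches U up to a global phase."""
    N = len(U)
    tr = sum(U[k][j].conjugate() * V[k][j] for j in range(N) for k in range(N))
    return abs(tr) / N

U = qft_unitary(4)
print(round(trace_fidelity(U, U), 6))  # → 1.0 for a perfect compilation
```

In practice you would obtain V from the simulator backend for each decoded circuit in `qc_list` and keep only the samples whose fidelity is (numerically) 1.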

Further examples

A detailed tutorial on using genQC with CUDA-Q is available in the CUDA-Q documentation.

More examples and tutorial notebooks are provided on the project page under [tutorials] and in the directory src/examples/.

Installation

genQC installs via pip in a few minutes, depending on your download speed.

Method 1: pip install

To install genQC just run:

pip install genQC

Note that this will install any missing requirements automatically. You may want to install some of them manually beforehand, e.g. torch for specific CUDA support; see https://pytorch.org/get-started/locally/.

Requirements: genQC depends on Python (minimum version 3.12) and the libraries torch, numpy, matplotlib, scipy, omegaconf, qiskit, tqdm, joblib, open_clip_torch, ipywidgets, pylatexenc, safetensors, tensordict and huggingface_hub. All can be installed with pip install. Specific tested versions are listed in src/RELEASES.md [doc] and in the GitHub release descriptions.
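Since the Python 3.12 minimum is the requirement most likely to trip up an existing environment, a quick pre-install sanity check can save a failed install. This snippet is illustrative and not part of genQC; it only inspects your interpreter and reports which of the heavier dependencies are already present:

```python
import sys
from importlib.util import find_spec

# genQC requires Python >= 3.12
ok = sys.version_info >= (3, 12)
print(f"Python {sys.version_info.major}.{sys.version_info.minor} — "
      f"{'OK' if ok else 'too old for genQC'}")

# Report which of the heavier dependencies are already importable.
# Note: the open_clip_torch package installs under the module name open_clip.
for pkg in ("torch", "qiskit", "open_clip"):
    print(pkg, "found" if find_spec(pkg) else "missing")
```

Installing torch before genQC (with the CUDA build you need) is the main case where doing this manually pays off.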

Method 2: clone the repository

To use the latest GitHub code, you can clone the repository by running:

git clone https://github.com/FlorianFuerrutter/genQC.git
cd genQC

The genQC library is built with jupyter notebooks and nbdev. To install the library, run the following in the cloned directory:

pip install -e .

Test installation

You can test your installation by running the provided src/examples/Quantum circuit synthesis with diffusion models/0_hello_circuit [doc] [notebook] example. On a computer with a moderate GPU, this inference notebook should run in under half a minute.

License

The code and weights in this repository are licensed under the Apache License 2.0.

BibTeX

We kindly ask you to cite our paper if any of this material was useful for your work.

Quantum circuit synthesis with diffusion models

@article{furrutter2024quantum,
  title={Quantum circuit synthesis with diffusion models},
  author={F{\"u}rrutter, Florian and Mu{\~n}oz-Gil, Gorka and Briegel, Hans J},
  journal={Nature Machine Intelligence},
  doi={10.1038/s42256-024-00831-9},
  volume={6},
  pages={515--524},
  year={2024},
  publisher={Nature Publishing Group UK London}
}

Project details


Download files

Download the file for your platform.

Source Distribution

genqc-0.2.4.tar.gz (107.1 kB)

Uploaded Source

Built Distribution


genqc-0.2.4-py3-none-any.whl (129.6 kB)

Uploaded Python 3

File details

Details for the file genqc-0.2.4.tar.gz.

File metadata

  • Download URL: genqc-0.2.4.tar.gz
  • Size: 107.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for genqc-0.2.4.tar.gz:

  • SHA256: 69aedcb533aeba62c0447ad1d20c83711e85ac22121aecfe51dccd4ce2afce61
  • MD5: 7e69b56d45a4dc7cc83cf5a1bb6a903e
  • BLAKE2b-256: 5f23dcaec006138ec56d69eb8ea9579b7ebda22a0cd7422bf50656f568e4c8b4


File details

Details for the file genqc-0.2.4-py3-none-any.whl.

File metadata

  • Download URL: genqc-0.2.4-py3-none-any.whl
  • Size: 129.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for genqc-0.2.4-py3-none-any.whl:

  • SHA256: af2588ed64e7e379946a7ae4ba6b0f525359e57408f25efb28edc668f89e85dc
  • MD5: 96d0dde46816375ee60c5bc240a2da5b
  • BLAKE2b-256: 6703d7e6b0f34dfa0b37430060097fad248beaabda79a12a84a611e6f2df2c88

