SuperGrad: Differentiable Simulator for superconducting quantum processors

SuperGrad is an open-source simulator designed to accelerate the development of superconducting quantum processors by incorporating gradient computation capabilities.

Notice: This package is currently in its early development stages. Please be aware that breaking changes to the API may occur.

Why SuperGrad?

Superconducting processors offer significant design flexibility, including various types of qubits and interactions. With the large number of tunable parameters in a processor, gradient optimization becomes crucial. SuperGrad fills the gap in open-source software by providing a tightly integrated library capable of efficient backpropagation for gradient computation.

Key Features

  • Efficient backpropagation for gradient computation based on JAX.
  • User-friendly interface for constructing Hamiltonians
  • Computation of both static and dynamic properties of composite systems, such as the unitary matrices of simultaneous gates

These features help speed up tasks including:

  • Optimal control
  • Design optimization
  • Experimental data fitting

Installation

We suggest using Python version >= 3.9.

pip install supergrad

Examples

Typical workflow

First, one needs to define an interaction graph that describes the qubits and their connectivity. This is done by creating an instance of the networkx.Graph class. There is a work-in-progress GUI for creating such graphs available at https://github.com/iqubit-org/supergrad-gui.
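
A minimal sketch of such a graph is shown below. The attribute names ("ec", "ej", "pulse_amp", "coupling") are placeholders chosen for illustration; the actual keys expected by SuperGrad's Helper classes may differ.

import networkx as nx

graph = nx.Graph()

# Nodes carry single-qubit parameters and control-pulse parameters.
graph.add_node("q0", ec=1.0, ej=4.0, pulse_amp=0.1)
graph.add_node("q1", ec=1.1, ej=3.9, pulse_amp=0.1)

# Edges carry the coupling parameters between connected qubits.
graph.add_edge("q0", "q1", coupling=0.02)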

We consider Hamiltonians of the form

H_{\mathrm{idle}}\left(\vec{h}\right)+\sum_{k=1}^{N_c}f_{k}\left(\vec{c},t\right)C_{k}\left(\vec{h}\right)

Parameters that pertain to a single qubit are stored in the nodes of the graph. These include superconducting qubit parameters such as $E_C$ and $E_J$, as well as the parameters $\vec{c}$ of the control pulses. The couplings between qubits, which form another subset of $\vec{h}$, are stored in the edges of the graph. Helper classes then parse the graph and create functions that compute the time-evolution unitary $U(\vec{h},\vec{c})$ or the energy spectrum of $H_{\mathrm{idle}}(\vec{h})$. One can construct objective functions from these results; JAX can then compute the gradient of an objective function, which drives gradient-based optimization.
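
The gradient step in this workflow is plain JAX. The sketch below stands in for that step: time_evolution is a hypothetical placeholder for the unitary-computing function produced by the Helper classes (the real name and signature may differ), and the objective is a simple gate infidelity differentiated with jax.grad.

import jax
import jax.numpy as jnp

# Hypothetical stand-in for U(h, c); a 2x2 rotation so the example runs on its own.
def time_evolution(params):
    theta = params["pulse_amp"] * params["duration"]
    return jnp.array([[jnp.cos(theta), -1j * jnp.sin(theta)],
                      [-1j * jnp.sin(theta), jnp.cos(theta)]])

u_target = jnp.array([[0.0, -1j], [-1j, 0.0]])  # target X gate, up to a global phase

def infidelity(params):
    u = time_evolution(params)
    overlap = jnp.trace(u_target.conj().T @ u) / u.shape[0]
    return 1.0 - jnp.abs(overlap) ** 2

params = {"pulse_amp": 0.1, "duration": 10.0}
grads = jax.grad(infidelity)(params)  # gradients w.r.t. every entry of the parameter pytree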

In general, we will use GHz and ns as units for energy and time parameters.

Fluxonium with multi-path coupling (Main example)

The Jupyter notebook

This example is based on Nguyen, L. B. et al., Blueprint for a High-Performance Fluxonium Quantum Processor, PRX Quantum 3, 037001 (2022). We simulate a system of 6 Fluxonium qubits taken from an underlying periodic lattice. The idling Hamiltonian of the system is

H(0) = \sum_i H_{\mathrm{f},i}  + \sum_{\langle i,j \rangle } H_{ij}

The Hamiltonian of a single Fluxonium is

H_{\mathrm{f},i} = 4E_{\mathrm{C},i} n_i^2 + \frac{1}{2} E_{\mathrm{L},i} (\varphi_i +\varphi_{\mathrm{ext},i})^2
    -E_{\mathrm{J},i}\cos \left( \varphi_i \right)

The coupling terms have the form

H_{ij} = J_{\mathrm{C}} n_i n_j - J_{\mathrm{L}} \varphi_i \varphi_j

The couplings are chosen such that the idling $ZZ$-crosstalk is almost zero. We compute the time evolution and the Pauli error rates for simultaneous single-qubit X gates and two-qubit CR gates. More details can be found in Ni, X. et al., Superconducting processor design optimization for quantum error correction performance, arXiv:2312.04186.
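
As a self-contained illustration of the Hamiltonians above (independent of SuperGrad's actual API, with parameter values that are illustrative rather than those of the notebook), a single Fluxonium can be diagonalized on a phase grid and the coupling term built from truncated operators:

import jax.numpy as jnp

# Illustrative parameters in GHz.
EC, EJ, EL = 1.0, 4.0, 1.0
PHI_EXT = jnp.pi          # external flux bias
JC, JL = 0.02, 0.01       # charge and inductive coupling strengths

# Discretize the phase variable on a uniform grid.
npts = 201
phi = jnp.linspace(-6 * jnp.pi, 6 * jnp.pi, npts)
dphi = phi[1] - phi[0]

# n = -i d/dphi, so 4 EC n^2 becomes -4 EC d^2/dphi^2 (three-point stencil).
lap = (jnp.diag(-2.0 * jnp.ones(npts))
       + jnp.diag(jnp.ones(npts - 1), 1)
       + jnp.diag(jnp.ones(npts - 1), -1)) / dphi**2
n_op = -1j * (jnp.diag(jnp.ones(npts - 1), 1)
              - jnp.diag(jnp.ones(npts - 1), -1)) / (2.0 * dphi)
phi_op = jnp.diag(phi)

h_single = (-4.0 * EC * lap
            + 0.5 * EL * jnp.diag((phi + PHI_EXT) ** 2)
            - EJ * jnp.diag(jnp.cos(phi)))

evals, evecs = jnp.linalg.eigh(h_single)
print("lowest transition (GHz):", evals[1] - evals[0])

# Truncate n and phi to the lowest eigenstates and build
# H_ij = JC n_i n_j - JL phi_i phi_j on the two-qubit product space.
dim = 4
basis = evecs[:, :dim]
n_t = basis.conj().T @ n_op @ basis
phi_t = basis.conj().T @ phi_op @ basis
h_coupling = JC * jnp.kron(n_t, n_t) - JL * jnp.kron(phi_t, phi_t)
print("coupling block shape:", h_coupling.shape)  # (16, 16)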

Transmon with tunable couplers

The Jupyter notebook

This example is based on Xu, Y. et al., High-Fidelity, High-Scalability Two-Qubit Gate Scheme for Superconducting Qubits, Phys. Rev. Lett. 125, 240503 (2020). We simulate a 5-transmon system in which 3 are computational qubits and the other 2 are couplers. We compute the time evolution and the Pauli error rates for simultaneous single-qubit X gates.

Fluxonium parameter fitting from experimental spectrum data

The Jupyter notebook

This is a quite different application from the ones above. Here we infer the parameters of the system from experimentally measured spectrum data. We consider the simplest case, fitting the parameters of a single Fluxonium, but the procedure can be applied to more complex systems as well.
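
The fitting idea can be sketched as follows (this is not the notebook's code: the phase-grid model, the parameter names, and the synthetic "measurements" are all illustrative). The model's transition frequencies come from diagonalizing the Fluxonium Hamiltonian, the loss is the squared residual against measured frequencies, and JAX provides the gradient for an optimizer.

import jax
import jax.numpy as jnp

# Phase-grid Fluxonium model (see the sketch in the main example above).
npts = 151
phi = jnp.linspace(-6 * jnp.pi, 6 * jnp.pi, npts)
dphi = phi[1] - phi[0]
lap = (jnp.diag(-2.0 * jnp.ones(npts))
       + jnp.diag(jnp.ones(npts - 1), 1)
       + jnp.diag(jnp.ones(npts - 1), -1)) / dphi**2

def transitions(params, phi_ext):
    """Lowest two transition frequencies (GHz) at a given external flux."""
    h = (-4.0 * params["ec"] * lap
         + 0.5 * params["el"] * jnp.diag((phi + phi_ext) ** 2)
         - params["ej"] * jnp.diag(jnp.cos(phi)))
    evals = jnp.linalg.eigvalsh(h)
    return jnp.array([evals[1] - evals[0], evals[2] - evals[0]])

def loss(params, phi_ext_points, measured):
    predicted = jax.vmap(lambda fe: transitions(params, fe))(phi_ext_points)
    return jnp.sum((predicted - measured) ** 2)

# Synthetic "measurements", generated from known parameters so the example runs.
true_params = {"ec": 1.0, "ej": 4.0, "el": 1.0}
phi_ext_points = jnp.array([0.0, jnp.pi / 2, jnp.pi])
measured = jax.vmap(lambda fe: transitions(true_params, fe))(phi_ext_points)

# Loss and gradient for an initial guess; these would be passed to a
# gradient-based optimizer (e.g. scipy.optimize.minimize or optax).
guess = {"ec": 0.9, "ej": 4.5, "el": 1.1}
val, grads = jax.value_and_grad(loss)(guess, phi_ext_points, measured)
print("loss:", val)
print("gradients:", grads)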

Citation

If this project is helpful to your research, please acknowledge the use of SuperGrad in your publications by citing:

@misc{supergrad_2024,
      title={SuperGrad: a differentiable simulator for superconducting processors},
      author={Ziang Wang and Feng Wu and Hui-Hai Zhao and Xin Wan and Xiaotong Ni},
      year={2024},
      eprint={2406.18155},
      archivePrefix={arXiv},
      primaryClass={quant-ph}
}

The paper is also a good introduction to the simulator.
