
Hypergraph Minimum-Weight Parity Factor (MWPF) Solver for Quantum LDPC Codes

Project description

MWPF

Hypergraph Minimum-Weight Parity Factor Decoder for QEC

Preview claim: We publish the Python package, which may be subject to breaking changes. The source code and the full version will be made publicly available with our upcoming paper.

Hypergraph MWPF is proven to be NP-hard [1]. Our design takes advantage of a clustering technique to lower the average time complexity, reaching almost-linear average time complexity at small physical error rates. Please wait for our paper for more discussion of the speed vs. accuracy trade-off.

Color Code Example (click for YouTube video)

Installation

pip install MWPF

Background

The Most-Likely Error (MLE) decoding problem can be formulated as an MWPF problem.
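
As a rough sketch (the precise formulation is deferred to our paper): for an independent noise model in which each (hyper)edge $e$ represents an error mechanism occurring with probability $p_e$, maximizing the likelihood of an error set $\mathcal{E} \subseteq E$ is equivalent to minimizing its total weight under the standard log-likelihood weights,

$$\max_{\mathcal{E}} \prod_{e \in \mathcal{E}} p_e \prod_{e \notin \mathcal{E}} (1-p_e) \quad\Longleftrightarrow\quad \min_{\mathcal{E}} \sum_{e \in \mathcal{E}} w_e, \qquad w_e = \log\frac{1-p_e}{p_e},$$

where $\mathcal{E}$ ranges over error sets consistent with the measured syndrome (the parity constraint made explicit under Naming below).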

Naming

We named it Minimum-Weight Parity Factor because of a concept called "parity $(g,f)$-factor" in Lovász's 1972 paper [2]. In the context of QEC, the functions $g(v)$ and $f(v)$ are associated with the measured syndrome as shown above. Without ambiguity, we just call it "parity factor".
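
In the simplest reading (omitting the degree bounds of the general $(g,f)$-factor), the syndrome enters as a per-vertex parity constraint: a subset $\mathcal{E} \subseteq E$ is a valid parity factor when every defect vertex touches an odd number of selected (hyper)edges and every other vertex an even number,

$$\big|\{ e \in \mathcal{E} : v \in e \}\big| \equiv d_v \pmod{2} \qquad \forall\, v \in V,$$

where $d_v = 1$ if vertex $v$ is a defect and $d_v = 0$ otherwise; MWPF asks for the minimum-weight such $\mathcal{E}$.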

Relationship with MWPM decoder

The Minimum-Weight Perfect Matching (MWPM) decoder is the traditional decoder for QEC. When the decoding graph is a simple graph, the MLE decoding problem can be reduced to an MWPM problem on a generated complete graph with $O(|V|^2)$ edges. Two recent works [3] and [4] improve the average time complexity to nearly the theoretical bound of $O(|V|)$ by not explicitly generating the complete graph. This motivates our idea to design an algorithm directly on the model graph, hoping to reach the same $O(|V|)$ average time complexity even if the model graph is a hypergraph.
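
In that standard reduction, one vertex is created per defect and the complete-graph edge between defects $u$ and $v$ carries the shortest-path distance between them in the model graph,

$$w(u,v) = \min_{\text{path } P:\ u \to v} \sum_{e \in P} w_e,$$

so explicitly building all $O(|V|^2)$ edges is already super-linear, which is precisely what [3] and [4] avoid.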

Key Idea

We try to extend the blossom algorithm that solves the MWPM problem on simple graphs. An interesting attribute of the blossom algorithm is that it deals with an exponential number of linear constraints in order to relax the integer constraints. The added linear constraints, which we refer to as "blossom constraints", are based on a very simple idea: filtering out invalid solutions. The blossom constraints simply say "any odd cluster cannot be perfectly matched internally", which is obviously true. However, this "filtering" requires an exponential number of linear constraints [5], which is impossible to generate efficiently. Thus, the algorithm must implicitly consider those exponentially many constraints without ever listing them. In fact, the blossom algorithm only keeps track of $O(|V|)$ such constraints out of the exponentially many. Surprisingly, this works! Inspired by this miracle, we now have the courage to use an exponential number of linear constraints to solve MWPF.
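
For reference, on a simple graph the blossom constraints can be written as odd-set inequalities: with $x_e \in \{0,1\}$ indicating whether edge $e$ is matched and $\delta(S)$ denoting the set of edges with exactly one endpoint in $S$,

$$\sum_{e \in \delta(S)} x_e \ge 1 \qquad \forall\, S \subseteq V,\ |S| \text{ odd},$$

i.e., an odd cluster cannot be perfectly matched internally, so at least one matched edge must leave it.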

Rigorous Math

The ILP problem above is very similar to that of the blossom algorithm, except for the more complex "blossom" definition: it is now a subgraph $S=(V_S, E_S), V_S \subseteq V, E_S \subseteq E$ instead of just a subset of vertices $S\subseteq V$ as in the blossom algorithm. We have mathematically proved the equivalence between hypergraph MWPF and this ILP. In the next step, we simply relax the integer constraints.

Clearly, as a relaxation, the minimum objective value of the LP is no larger than that of the ILP. Unfortunately, we have not been able to rigorously prove that they have the same optimal objective value, nor can we find a counterexample. We leave this conjecture as is for now, and do not assume its correctness.

Conjecture: $\min\text{ILP}=\min\text{LP}$.

The dual problem is a transpose of the primal LP problem. According to the duality theorem, they have the same optimal value. The key is that each primal constraint becomes a dual variable, and each primal variable becomes a dual constraint. Clearly, for the dual problem, $y_S = 0, \forall S \in \mathcal{O}$ is a feasible solution (though usually not the optimal one). In this way, we can keep track of only the non-zero dual variables to implicitly consider all of the exponentially many primal constraints. Since the dual LP problem is a maximization problem, we have the whole inequality chain below.
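
Written out, for any feasible dual solution $y$ and any feasible parity factor $\mathcal{E}'$ (by dual feasibility, strong LP duality, relaxation, and primal feasibility, respectively),

$$\mathrm{obj}_{\mathrm{dual}}(y) \;\le\; \max \mathrm{LP}_{\mathrm{dual}} \;=\; \min \mathrm{LP} \;\le\; \min \mathrm{ILP} \;\le\; \sum_{e \in \mathcal{E}'} w_e .$$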

If we can find a pair of feasible primal and dual solutions with the same weight, then this inequality chain collapses to an equality chain and the primal solution is proven to be optimal. Even if they are not equal, we still get a proximity bound.

In fact, the MWPM decoder using the blossom algorithm fits in this framework, where the alternating tree operation guarantees that in the end the primal weight equals the dual weight. The Union-Find decoder and the hypergraph UF decoder can also be explained by this framework, but their proximity bounds are usually not tight.

Usage

The decoding process consists of two steps (shown in Background):

  1. offline: construct the model graph given the QEC code and the noise model
  2. online: compute the most-likely error $\mathcal{E}\subseteq E$ given the syndrome (represented by the defect vertices $D \subseteq V$) and some dynamic weights

from mwpf import HyperEdge, SolverInitializer, SolverSerialJointSingleHair, SyndromePattern

# Offline
vertex_num = 4
weighted_edges = [
    HyperEdge([0, 1], 100),  # [vertices], weight
    HyperEdge([1, 2], 100),
    HyperEdge([2, 3], 100),
    HyperEdge([0], 100),  # boundary vertex
    HyperEdge([0, 1, 2], 60),  # hyperedge
]
initializer = SolverInitializer(vertex_num, weighted_edges)
hyperion = SolverSerialJointSingleHair(initializer)

# Online
syndrome = [0, 1, 3]
hyperion.solve(SyndromePattern(syndrome))
hyperion_subgraph = hyperion.subgraph()
print(hyperion_subgraph)  # out: [2, 4], weighted 160 (edge [2, 3] plus hyperedge [0, 1, 2])
_, bound = hyperion.subgraph_range()
print((bound.lower, bound.upper))  # out: (Fraction(160, 1), Fraction(160, 1))

The example hypergraph is shown below: grey edges are weighted 100 and the purple hyperedge is weighted 60.

In contrast, if we were to solve MWPF with the MWPM decoder, we would have to ignore the hyperedge $e_4$, which leads to a suboptimal result, as shown by the following Python script using the Fusion Blossom library [4].

from fusion_blossom import SolverInitializer, SolverSerial, SyndromePattern

# Offline
vertex_num = 5
weighted_edges = [(0, 4, 100), (0, 1, 100), (1, 2, 100), (2, 3, 100)]
virtual_vertices = [4]
initializer = SolverInitializer(vertex_num, weighted_edges, virtual_vertices)
fusion = SolverSerial(initializer)

# Online
syndrome = [0, 1, 3]
fusion.solve(SyndromePattern(syndrome))
fusion_subgraph = fusion.subgraph()
print(fusion_subgraph)  # out: [0, 2, 3], which is weighted 300 instead of 160

Advanced Usage

When trading off accuracy against decoding time, we provide a timeout parameter for the decoder. One can also specify whether the clusters all grow together or one by one. More parameters are coming as we develop the library.

config = {
    "primal": {
        "timeout": 3.0,  # 3 second timeout for each cluster
    },
    "growing_strategy": "SingleCluster",  # growing from each defect one by one
    # "growing_strategy": "MultipleClusters",  # every defect starts to grow at the same time
}
hyperion = SolverSerialJointSingleHair(initializer, config)
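
With a timeout set, the solver may stop before optimality is certified. A minimal sketch (reusing `initializer` and the syndrome from the Usage section above, and only the API already shown there) checks the returned proximity bound:

# `hyperion` is the timeout-configured solver constructed above;
# `initializer` and the syndrome come from the Usage section.
hyperion.solve(SyndromePattern([0, 1, 3]))
subgraph, bound = hyperion.subgraph_range()
if bound.lower == bound.upper:
    print("proven optimal:", subgraph)  # primal weight equals dual weight
else:
    print("within proximity bound:", subgraph, (bound.lower, bound.upper))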

An embedded visualization tool is coming soon.

Examples

For the surface code with depolarizing noise model $p_x = p_y = p_z = p/3$, the examples below show physical error rates 1%, 2% and 4% (left to right).

Surface Code Example (click for YouTube video)

For the triangle color code with X errors, the examples below show physical error rates 1%, 2% and 4% (left to right).

Triangle Color Code Example (click for YouTube video)

For the circuit-level surface code, the examples below show physical error rates 0.03%, 0.1% and 0.3% (left to right).

Circuit-level Surface Code Example (click for YouTube video)

References

[1] Berlekamp, Elwyn, Robert McEliece, and Henk Van Tilborg. "On the inherent intractability of certain coding problems (corresp.)." IEEE Transactions on Information Theory 24.3 (1978): 384-386.

[2] Lovász, László. "The factorization of graphs. II." Acta Mathematica Academiae Scientiarum Hungarica 23 (1972): 223-246.

[3] Higgott, Oscar, and Craig Gidney. "Sparse Blossom: correcting a million errors per core second with minimum-weight matching." arXiv preprint arXiv:2303.15933 (2023).

[4] Wu, Yue, and Lin Zhong. "Fusion Blossom: Fast MWPM Decoders for QEC." arXiv preprint arXiv:2305.08307 (2023).

[5] Rothvoß, Thomas. "The matching polytope has exponential extension complexity." Journal of the ACM (JACM) 64.6 (2017): 1-19.

