
Floyd Multi-Head Attention: a drop-in variant of PyTorch MHA with module and function APIs


FloydNet


Official implementation of FloydNet.

Figure: Pivotal Attention Mechanism for 2-Floyd/3-Floyd.

This repository serves two audiences:

  • Engineering users: Reusable PyTorch components (functional attention APIs and Transformer-style blocks) under src/.
  • Research users: Scripts/configs to reproduce paper experiments (TSP, Graph Isomorphism, BREC) under example/.

Introduction

This repository is the official PyTorch implementation of FloydNet. It provides:

  1. Reusable components: a drop-in attention/Transformer-block interface intended for integration into existing projects.
  2. Reproduction code: end-to-end training/evaluation pipelines to reproduce the benchmarks reported in the paper.

For algorithmic details, hyperparameter choices, and analysis, please refer to the paper.


Repository Structure

  • src/floydnet/
    Library code for reuse
    Contains the functional attention API and module/block implementations.

  • example/
    Experiment reproduction code
    Includes benchmark-specific scripts, configs, and data preparation utilities.


Installation

Option A: Install from PyPI

pip install floydnet

Option B: Install from source

git clone git@github.com:ocx-lab/FloydNet.git
cd FloydNet
pip install -e .

Requirements: Python >= 3.9, PyTorch >= 2.1 (see pyproject.toml).
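After installing, the documented minimums can be compared programmatically. The helper below is purely illustrative and not part of the package (the `meets_requirements` name is made up here); it only encodes the Python >= 3.9 / PyTorch >= 2.1 floor stated above.

```python
import sys

# Documented minimums from pyproject.toml: Python >= 3.9, PyTorch >= 2.1.
MIN_PYTHON = (3, 9)
MIN_TORCH = (2, 1)

def meets_requirements(python_version, torch_version):
    """Compare (major, minor) version tuples against the documented minimums."""
    return python_version >= MIN_PYTHON and torch_version >= MIN_TORCH

# Example: check the running interpreter against a hypothetical PyTorch 2.1 install.
print(meets_requirements(sys.version_info[:2], (2, 1)))
```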

Public API

FloydNet re-exports the public API from src/floydnet/__init__.py, so you can import from the top-level package:

  • Functional API:
    • pivotal_attention (see src/floydnet/functional.py)
  • Module / block API:
    • PivotalAttentionBlock (see src/floydnet/transformer.py)

from floydnet import pivotal_attention, PivotalAttentionBlock

Minimal usage example

import torch
from floydnet import pivotal_attention, PivotalAttentionBlock

# -------------------------
# Module API (Transformer-style block)
# Input is a 2D grid: (B, N, N, C)
# -------------------------
B, N, C = 2, 16, 64
x = torch.randn(B, N, N, C)

m = PivotalAttentionBlock(embed_dim=C, num_heads=8, dropout=0.0)
out = m(x)  # (B, N, N, C)
print(out.shape)

# -------------------------
# Functional API
# All inputs are 5D: (B, H, N, N, D)
# -------------------------
B, H, N, D = 2, 8, 16, 64
q_ik = torch.randn(B, H, N, N, D)
k_ij = torch.randn(B, H, N, N, D)
k_jk = torch.randn(B, H, N, N, D)
v_ij = torch.randn(B, H, N, N, D)
v_jk = torch.randn(B, H, N, N, D)

y = pivotal_attention(q_ik, k_ij, k_jk, v_ij, v_jk)  # (B, H, N, N, D)
print(y.shape)
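The functional API expects all five tensors to share one (B, H, N, N, D) shape with a square N x N grid. A small pre-flight check like the one below can catch mismatches before the call; it is illustrative only (not part of floydnet) and operates on plain shape tuples such as `tensor.shape`.

```python
def validate_pivotal_shapes(q_ik, k_ij, k_jk, v_ij, v_jk):
    """Check that five shape tuples all equal one (B, H, N, N, D) layout.

    Pass in `tensor.shape` for each argument; returns the common shape.
    """
    shapes = [tuple(s) for s in (q_ik, k_ij, k_jk, v_ij, v_jk)]
    first = shapes[0]
    if len(first) != 5 or first[2] != first[3]:
        raise ValueError(f"expected (B, H, N, N, D) with a square N x N grid, got {first}")
    for s in shapes[1:]:
        if s != first:
            raise ValueError(f"shape mismatch: {s} != {first}")
    return first

# e.g. validate_pivotal_shapes(q_ik.shape, k_ij.shape, k_jk.shape, v_ij.shape, v_jk.shape)
```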

Reproducing Paper Results

This section targets research users who want to reproduce the experiments in the paper.

See example/README.md for a detailed description.

Environment setup

We recommend using uv to create an isolated environment for the reproduction code under example/.

cd /path/to/FloydNet

# 1) Create a uv virtual environment with Python 3.12
uv venv --python 3.12

# 2) Activate it
source .venv/bin/activate

# 3) Install extra dependencies for reproducing paper experiments
uv pip install -r example/requirements.txt

# 4) Install FloydNet (editable) for local development / imports
uv pip install -e .

Changelog (latest)

  • Added softmax_cap parameter to pivotal_attention3 for improved numerical stability.
  • Added LRGB example script.

The full changelog is in CHANGELOG.md.

Citation

If you use this code in your research, please cite the paper:

@misc{yu2026floydnetlearningparadigmglobal,
      title={FloydNet: A Learning Paradigm for Global Relational Reasoning},
      author={Jingcheng Yu and Mingliang Zeng and Qiwei Ye},
      year={2026},
      eprint={2601.19094},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2601.19094},
}

(Alternatively, see CITATION.cff.)


License

This project is licensed under the Apache License 2.0. See LICENSE.
