

PoPE-pytorch (wip)

Efficient implementation of (and explorations into) polar coordinate positional embeddings (PoPE), from Gopalakrishnan et al. out of Schmidhuber's lab
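As a toy illustration of the polar idea (not the library's implementation; the frequency schedule below is made up): content can live in the magnitude of a complex number and position in its phase, so the attention inner product separates "what" from "where".

```python
import torch

# toy sketch of the polar idea -- NOT the library's implementation.
# content ("what") becomes the magnitude, position ("where") the phase.
dim_half = 4
content = torch.rand(dim_half) + 0.5                         # magnitudes
freqs = 1.0 / (10.0 ** (torch.arange(dim_half) / dim_half))  # made-up frequencies
position = 3
phase = position * freqs                                     # angles

embedded = torch.polar(content, phase)                       # content * exp(i * phase)

# the magnitude is untouched by position - "what" and "where" stay decoupled
assert torch.allclose(embedded.abs(), content, atol = 1e-6)
```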

Install

$ pip install PoPE-pytorch

Usage

import torch
from PoPE_pytorch import PoPE

# define pope

pope = PoPE(dim = 64, heads = 8)

# pass in sequence length

pos_embed = pope(1024)

# queries and keys in attention

q = torch.randn(1, 8, 1024, 64)
k = torch.randn(1, 8, 1024, 64)

# training

rotated_q, rotated_k = pope.apply_pope_to_qk(pos_embed, q, k)

# inference

rotated_q, rotated_k = pope.apply_pope_to_qk(pos_embed, q[..., -1:, :], k)
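A plain-torch shape check for the decoding case above (no PoPE involved, just the slicing): only the newest query attends over all cached keys.

```python
import torch

# single-token decoding: the latest query against the full key cache
q = torch.randn(1, 8, 1024, 64)
k = torch.randn(1, 8, 1024, 64)

q_last = q[..., -1:, :]                # (1, 8, 1, 64) - slice keeps the seq dim
sim = q_last @ k.transpose(-2, -1)     # (1, 8, 1, 1024)

assert sim.shape == (1, 8, 1, 1024)
```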

Fused Attention Similarity

import torch
from PoPE_pytorch import PoPE, compute_attn_similarity

# define pope

pope = PoPE(dim = 64, heads = 8).cuda()

# get rotations

pos_emb = pope(1024)

# queries and keys

q = torch.randn(1, 8, 1024, 64).cuda()
k = torch.randn(1, 8, 1024, 64).cuda()

# fused attention similarity, avoiding expanding 64 to 128

sim = compute_attn_similarity(q, k, pos_emb) # (1, 8, 1024, 1024)

attn = sim.softmax(dim = -1) # the usual in attention..
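Continuing in plain torch (the values tensor `v` below is an assumption, not part of the snippet above): the similarity matrix is consumed exactly as in standard attention.

```python
import torch

# standard attention readout from a precomputed similarity matrix
sim = torch.randn(1, 8, 1024, 1024)
v = torch.randn(1, 8, 1024, 64)

attn = sim.softmax(dim = -1)           # each row normalizes to 1
out = attn @ v                         # weighted sum of values

assert out.shape == (1, 8, 1024, 64)
```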

Fused Flash Attention

import torch
from PoPE_pytorch import PoPE, flash_attn_with_pope

# pope

pope = PoPE(dim = 64, heads = 8).cuda()

# queries, keys, values for attention

q = torch.randn(1, 8, 1024, 64).cuda()
k = torch.randn(1, 8, 1024, 64).cuda()
v = torch.randn(1, 8, 1024, 64).cuda()

pope_emb = pope(1024)

out = flash_attn_with_pope(q, k, v, pope = pope_emb, causal = True)

assert out.shape == (1, 8, 1024, 64)
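For reference, the unfused causal attention that the kernel replaces can be sketched with PyTorch's built-in `scaled_dot_product_attention` (without the PoPE rotation):

```python
import torch
import torch.nn.functional as F

# plain causal attention (no positional rotation) for shape reference
q = torch.randn(1, 8, 1024, 64)
k = torch.randn(1, 8, 1024, 64)
v = torch.randn(1, 8, 1024, 64)

out = F.scaled_dot_product_attention(q, k, v, is_causal = True)

assert out.shape == (1, 8, 1024, 64)
```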

Citations

@misc{gopalakrishnan2025decouplingwhatwherepolar,
    title   = {Decoupling the "What" and "Where" With Polar Coordinate Positional Embeddings}, 
    author  = {Anand Gopalakrishnan and Robert Csordás and Jürgen Schmidhuber and Michael C. Mozer},
    year    = {2025},
    eprint  = {2509.10534},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG},
    url     = {https://arxiv.org/abs/2509.10534}, 
}



Download files

Download the file for your platform.

Source Distribution

pope_pytorch-0.0.11.tar.gz (214.2 kB)

Uploaded Source

Built Distribution


pope_pytorch-0.0.11-py3-none-any.whl (13.2 kB)

Uploaded Python 3

File details

Details for the file pope_pytorch-0.0.11.tar.gz.

File metadata

  • Download URL: pope_pytorch-0.0.11.tar.gz
  • Upload date:
  • Size: 214.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for pope_pytorch-0.0.11.tar.gz
Algorithm Hash digest
SHA256 2b0f6cb7e201030186f8f66c05edd87dbbedbee5dbdb7ffc59f5e1fd349f0081
MD5 c91a6bc083c2ece1d82ee55b65c3938a
BLAKE2b-256 72d7423f7e69bb3f50e89ff34d4fb7cb8a4bcff23cfa50eaf98f6d151196c2fc


File details

Details for the file pope_pytorch-0.0.11-py3-none-any.whl.

File metadata

  • Download URL: pope_pytorch-0.0.11-py3-none-any.whl
  • Upload date:
  • Size: 13.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for pope_pytorch-0.0.11-py3-none-any.whl
Algorithm Hash digest
SHA256 f87074d907f078b37cbb99cc13b60e9fb5da2a028b227a01fa7f510f20157860
MD5 3228b0e02b88de636b842c763247e228
BLAKE2b-256 ebbf7f626a9f32c2eac4645d4b57a5d568cb3961f80397073ea1b5bfae512dda

