PoPE

Project description

PoPE-pytorch (wip)

Efficient implementation of, and explorations into, polar coordinate positional embeddings (PoPE) - from Gopalakrishnan et al., out of Schmidhuber's lab

Install

$ pip install PoPE-pytorch

Usage

import torch
from PoPE_pytorch import PoPE

# define pope

pope = PoPE(64, heads = 8)

# pass in sequence length

pos_embed = pope(1024)

# queries and keys in attention

q = torch.randn(1, 8, 1024, 64)
k = torch.randn(1, 8, 1024, 64)

# training - apply rotations to the full sequence of queries and keys

rotated_q, rotated_k = pope.apply_pope_to_qk(pos_embed, q, k)

# inference - rotate only the newest query against all keys (incremental decoding)

rotated_q, rotated_k = pope.apply_pope_to_qk(pos_embed, q[..., -1:, :], k)
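
The inference line above rotates only the newest query while the keys span the whole sequence, which is the shape of incremental decoding. A minimal sketch of one decode step, assuming apply_pope_to_qk broadcasts a length-1 query against longer keys (as that line suggests):

import torch
from PoPE_pytorch import PoPE

pope = PoPE(64, heads = 8)

# keys for all 1024 positions seen so far, query for the newest position only

k_cache = torch.randn(1, 8, 1024, 64)
q_new = torch.randn(1, 8, 1, 64)

# regenerate the positional embedding for the current total length

pos_embed = pope(1024)

# rotate the single new query against all cached keys

rotated_q, rotated_k = pope.apply_pope_to_qk(pos_embed, q_new, k_cache)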

Fused Attention Similarity

import torch
from PoPE_pytorch import PoPE, compute_attn_similarity

# define pope

pope = PoPE(dim = 64, heads = 8).cuda()

# get rotations

pos_emb = pope(1024)

# queries and keys

q = torch.randn(1, 8, 1024, 64).cuda()
k = torch.randn(1, 8, 1024, 64).cuda()

# fused attention similarity, avoiding expanding the head dimension from 64 to 128

sim = compute_attn_similarity(q, k, pos_emb) # (1, 8, 1024, 1024)

attn = sim.softmax(dim = -1) # the usual in attention..
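
The saving comes from a trigonometric identity. If applying PoPE expands each feature x with phase θ into the pair (x cos θ, x sin θ), doubling the head dimension from 64 to 128 (that layout is our assumption, suggested by the comment above), the inner product of two expanded features folds back down: a cos α · b cos β + a sin α · b sin β = ab cos(α − β), so the fused kernel can score queries against keys at dimension 64 directly. A numeric check of the identity:

import torch

d = 64
q, k = torch.randn(d), torch.randn(d)
theta_q, theta_k = torch.rand(d) * 6.28, torch.rand(d) * 6.28

# expanded form - concatenating cos / sin components doubles d to 2d

q_expanded = torch.cat((q * theta_q.cos(), q * theta_q.sin()))
k_expanded = torch.cat((k * theta_k.cos(), k * theta_k.sin()))

# fused form - stays at dimension d, using the cosine of the phase difference

sim_expanded = q_expanded @ k_expanded
sim_fused = (q * k * (theta_q - theta_k).cos()).sum()

assert torch.allclose(sim_expanded, sim_fused, atol = 1e-5)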

Fused Flash Attention

import torch
from PoPE_pytorch import PoPE, flash_attn_with_pope

# pope

pope = PoPE(dim = 64, heads = 8).cuda()

# queries, keys, values for attention

q = torch.randn(1, 8, 1024, 64).cuda()
k = torch.randn(1, 8, 1024, 64).cuda()
v = torch.randn(1, 8, 1024, 64).cuda()

pope_emb = pope(1024)

out = flash_attn_with_pope(q, k, v, pope = pope_emb, causal = True)

assert out.shape == (1, 8, 1024, 64)
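
A sanity-check sketch comparing the fused flash path against the unfused similarity path above. Two assumptions on our part: both paths apply the same softmax scaling, and masking then softmaxing the compute_attn_similarity scores reproduces the causal flash output.

import torch
from PoPE_pytorch import PoPE, compute_attn_similarity, flash_attn_with_pope

pope = PoPE(dim = 64, heads = 8).cuda()
pos_emb = pope(1024)

q = torch.randn(1, 8, 1024, 64).cuda()
k = torch.randn(1, 8, 1024, 64).cuda()
v = torch.randn(1, 8, 1024, 64).cuda()

# unfused reference: similarity -> causal mask -> softmax -> weighted values

sim = compute_attn_similarity(q, k, pos_emb)
causal_mask = torch.ones(1024, 1024, dtype = torch.bool, device = q.device).triu(1)
ref_out = sim.masked_fill(causal_mask, float('-inf')).softmax(dim = -1) @ v

# fused flash path

flash_out = flash_attn_with_pope(q, k, v, pope = pos_emb, causal = True)

# loose tolerance - the fused kernel accumulates in a different order

torch.testing.assert_close(ref_out, flash_out, atol = 1e-2, rtol = 1e-2)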

Citations

@misc{gopalakrishnan2025decouplingwhatwherepolar,
    title   = {Decoupling the "What" and "Where" With Polar Coordinate Positional Embeddings}, 
    author  = {Anand Gopalakrishnan and Robert Csordás and Jürgen Schmidhuber and Michael C. Mozer},
    year    = {2025},
    eprint  = {2509.10534},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG},
    url     = {https://arxiv.org/abs/2509.10534}, 
}

Download files

Download the file for your platform.

Source Distribution

pope_pytorch-0.0.12.tar.gz (213.3 kB)


Built Distribution

pope_pytorch-0.0.12-py3-none-any.whl (13.2 kB)


File details

Details for the file pope_pytorch-0.0.12.tar.gz.

File metadata

  • Download URL: pope_pytorch-0.0.12.tar.gz
  • Size: 213.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for pope_pytorch-0.0.12.tar.gz
  • SHA256: 76278884f88b031981251c15d98a491514ee8ef470adde5925af09f2aa4a9d07
  • MD5: a183e16dff87b16be69d171d32974308
  • BLAKE2b-256: f1139b6a209b190ce6249634b8b3da5b748d6f4944dda39f928ca33fbc4bbd1c


File details

Details for the file pope_pytorch-0.0.12-py3-none-any.whl.

File metadata

  • Download URL: pope_pytorch-0.0.12-py3-none-any.whl
  • Size: 13.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for pope_pytorch-0.0.12-py3-none-any.whl
  • SHA256: b1fc8a97bd1039bb3b74b7e42b4cb21b93339e745170c8b1a8b160f2aaf74440
  • MD5: c62cded93248909077f8ca07d43d3650
  • BLAKE2b-256: ce85124523d10b31ab0830ab1594f9d587035dbacf639006608d9c7bf6bef025

