Lookahead Keys Attention

Project description

Lookahead Keys Attention (wip)

Implementation of Castle (Causal Attention with Lookahead Keys), from the paper by Song et al. (see the citation below).

Installation

pip install lookahead-keys-attention

Usage

import torch
from lookahead_keys_attention import Castle

# Initialize the Castle attention module
model = Castle(
    dim=512,           # input dimension
    heads=8,           # number of attention heads
    dim_head=64,       # dimension per head
    use_triton=None    # auto-detect CUDA for Triton optimization
)

# Example input dimensions
batch_size = 2
seq_len = 128
dim = 512

# Move to CUDA if available
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = model.to(device)

# Input sequence
x = torch.randn(batch_size, seq_len, dim).to(device)

# Forward pass
output = model(x)  # Shape: [batch_size, seq_len, dim]

# For inference with caching (token-by-token generation)
cache = None
for i in range(seq_len):
    token = x[:, i:i+1, :]  # Single token
    output, cache = model(token, cache=cache, return_next_cache=True)
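
As a quick sanity check on the caching path, the token-by-token outputs can be compared against the full parallel forward pass. This is a minimal sketch that assumes the two paths are intended to be numerically equivalent up to floating-point tolerance; it uses only the constructor and call signatures shown above.

import torch
from lookahead_keys_attention import Castle

model = Castle(dim=512, heads=8, dim_head=64).eval()
x = torch.randn(2, 16, 512)

with torch.no_grad():
    # Full parallel pass over the whole sequence
    parallel_out = model(x)

    # Incremental pass, threading the cache through each step
    cache = None
    step_outputs = []
    for i in range(x.shape[1]):
        out, cache = model(x[:, i:i+1, :], cache=cache, return_next_cache=True)
        step_outputs.append(out)
    incremental_out = torch.cat(step_outputs, dim=1)

# If the cached path matches the parallel path, these should agree
assert torch.allclose(parallel_out, incremental_out, atol=1e-5)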

Citations

@inproceedings{Song2025CausalAW,
    title   = {Causal Attention with Lookahead Keys},
    author  = {Zhuoqing Song and Peng Sun and Huizhuo Yuan and Quanquan Gu},
    year    = {2025},
    url     = {https://api.semanticscholar.org/CorpusID:281218151}
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
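
To fetch the exact artifacts listed below without installing them (for example, to verify their hashes), one option is pip's download command:

pip download lookahead-keys-attention==0.1.1 --no-deps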

Source Distribution

lookahead_keys_attention-0.1.1.tar.gz (154.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

lookahead_keys_attention-0.1.1-py3-none-any.whl (11.2 kB)

Uploaded Python 3

File details

Details for the file lookahead_keys_attention-0.1.1.tar.gz.

File metadata

  • Download URL: lookahead_keys_attention-0.1.1.tar.gz
  • Upload date:
  • Size: 154.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.23

File hashes

Hashes for lookahead_keys_attention-0.1.1.tar.gz
  • SHA256: 5be04d91f135d86d2b00901b82c93e8a168894d2097ccf71b745fa2b880f3b13
  • MD5: 3aea98e6f03c4fbdc70bb1975729131e
  • BLAKE2b-256: 4349c79b85da900706f48e3665128ee6d2d50cd8889440e96055c0088322b202

See more details on using hashes here.
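
To verify a downloaded archive against the SHA256 digest above before installing, you can hash it locally. A minimal sketch using Python's standard hashlib module; the file path is an assumption (adjust it to wherever the archive was saved):

import hashlib

# SHA256 digest published above for the 0.1.1 sdist
expected = "5be04d91f135d86d2b00901b82c93e8a168894d2097ccf71b745fa2b880f3b13"

# Assumed path: the archive in the current directory
path = "lookahead_keys_attention-0.1.1.tar.gz"

h = hashlib.sha256()
with open(path, "rb") as f:
    # Hash in chunks to avoid loading the whole file into memory
    for chunk in iter(lambda: f.read(8192), b""):
        h.update(chunk)

assert h.hexdigest() == expected, "hash mismatch: do not install this file"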

File details

Details for the file lookahead_keys_attention-0.1.1-py3-none-any.whl.

File hashes

Hashes for lookahead_keys_attention-0.1.1-py3-none-any.whl
  • SHA256: 24ee0b00b3c986fb76db7bcfd746b755a9b6da4f6ff45d9c048431d4c34d7087
  • MD5: 96f960bb161d1ae31f7b31fe1f6e7c1f
  • BLAKE2b-256: 0add287a34a3c28c3feb3c36eec5f11cb4344e67a6ebefd4b96ab13099bf6e64

See more details on using hashes here.
