Titans - Pytorch
Unofficial implementation of Titans in Pytorch. This repository will also contain some explorations into architectures beyond the paper's simple 1-4 layer MLP for the neural memory module, should they work well to any degree.
Appreciation
- Eryk for sharing his early experimental results with me, positive for a 2-layer MLP
Install
```bash
$ pip install titans-pytorch
```
Usage
```python
import torch
from titans_pytorch import NeuralMemory

mem = NeuralMemory(
    dim = 384,
    chunk_size = 64 # set to a smaller chunk size for better performance on shorter sequence lengths (but more memory usage)
).cuda()

seq = torch.randn(2, 1024, 384).cuda()
retrieved, mem_state = mem(seq)

assert seq.shape == retrieved.shape
```
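The module returns the retrieved memories (same shape as the input) alongside an updated memory state. Below is a minimal sketch of how one might mix those retrieved memories back into a residual stream, using only the forward signature shown above; `MemoryResidualBlock` is a hypothetical wrapper name, not part of the package:

```python
import torch
from torch import nn
from titans_pytorch import NeuralMemory

# hypothetical wrapper: pre-norm the input, retrieve from neural
# memory, and add the retrieval back onto the residual stream,
# as one might inside a transformer block
class MemoryResidualBlock(nn.Module):
    def __init__(self, dim, chunk_size = 64):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.mem = NeuralMemory(dim = dim, chunk_size = chunk_size)

    def forward(self, x):
        retrieved, _ = self.mem(self.norm(x))
        return x + retrieved # residual connection

block = MemoryResidualBlock(384)
out = block(torch.randn(2, 1024, 384))
assert out.shape == (2, 1024, 384)
```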
A transformer with the MAC (memory-as-context) configuration can be used as
```python
import torch
from titans_pytorch import MemoryAsContextTransformer

transformer = MemoryAsContextTransformer(
    num_tokens = 256,
    dim = 256,
    depth = 2,
    segment_len = 128, # local attention window size
    num_persist_mem_tokens = 4,
    num_longterm_mem_tokens = 16,
)

token_ids = torch.randint(0, 256, (1, 1023))

loss = transformer(token_ids, return_loss = True) # scalar cross-entropy loss
loss.backward()

# after much training

sampled = transformer.sample(token_ids[:, :4], 512)
```
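Since `return_loss = True` yields a scalar cross-entropy loss, a minimal training loop only needs an optimizer around the forward and backward passes shown above. A sketch with made-up hyperparameters and random stand-in data; the real experiments live in `train_mac.py`:

```python
import torch
from torch.optim import Adam

optim = Adam(transformer.parameters(), lr = 3e-4) # hypothetical learning rate

for _ in range(100):
    token_ids = torch.randint(0, 256, (1, 1023)) # stand-in for real token batches
    loss = transformer(token_ids, return_loss = True)
    loss.backward()
    optim.step()
    optim.zero_grad()
```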
Experiments
```bash
$ pip install .[examples]
```
Then modify `train_mac.py` and run it to query nature

```bash
$ python train_mac.py
```
Citations
```bibtex
@inproceedings{Behrouz2024TitansLT,
    title  = {Titans: Learning to Memorize at Test Time},
    author = {Ali Behrouz and Peilin Zhong and Vahab S. Mirrokni},
    year   = {2024},
    url    = {https://api.semanticscholar.org/CorpusID:275212078}
}
```

```bibtex
@article{Sun2024LearningT,
    title   = {Learning to (Learn at Test Time): RNNs with Expressive Hidden States},
    author  = {Yu Sun and Xinhao Li and Karan Dalal and Jiarui Xu and Arjun Vikram and Genghan Zhang and Yann Dubois and Xinlei Chen and Xiaolong Wang and Oluwasanmi Koyejo and Tatsunori Hashimoto and Carlos Guestrin},
    journal = {ArXiv},
    year    = {2024},
    volume  = {abs/2407.04620},
    url     = {https://api.semanticscholar.org/CorpusID:271039606}
}
```

```bibtex
@inproceedings{Yang2024GatedDN,
    title  = {Gated Delta Networks: Improving Mamba2 with Delta Rule},
    author = {Songlin Yang and Jan Kautz and Ali Hatamizadeh},
    year   = {2024},
    url    = {https://api.semanticscholar.org/CorpusID:274598177}
}
```

```bibtex
@inproceedings{Nguyen2024TurningUT,
    title  = {Turning Up the Heat: Min-p Sampling for Creative and Coherent LLM Outputs},
    author = {Minh Nguyen and Andrew Baker and Clement Neo and Allen Roush and Andreas Kirsch and Ravid Shwartz-Ziv},
    year   = {2024},
    url    = {https://api.semanticscholar.org/CorpusID:270870613}
}
```

```bibtex
@article{Zhu2024HyperConnections,
    title   = {Hyper-Connections},
    author  = {Defa Zhu and Hongzhi Huang and Zihao Huang and Yutao Zeng and Yunyao Mao and Banggu Wu and Qiyang Min and Xun Zhou},
    journal = {ArXiv},
    year    = {2024},
    volume  = {abs/2409.19606},
    url     = {https://api.semanticscholar.org/CorpusID:272987528}
}
```

```bibtex
@article{Zhou2024ValueRL,
    title   = {Value Residual Learning For Alleviating Attention Concentration In Transformers},
    author  = {Zhanchao Zhou and Tianyi Wu and Zhiyun Jiang and Zhenzhong Lan},
    journal = {ArXiv},
    year    = {2024},
    volume  = {abs/2410.17897},
    url     = {https://api.semanticscholar.org/CorpusID:273532030}
}
```

```bibtex
@software{Kyrylov_Accelerated_Scan_2024,
    author  = {Kyrylov, Volodymyr},
    doi     = {10.5281/zenodo.10600962},
    title   = {Accelerated Scan},
    version = {0.1.2},
    year    = {2024}
}
```

```bibtex
@misc{wang2025testtimeregressionunifyingframework,
    title         = {Test-time regression: a unifying framework for designing sequence models with associative memory},
    author        = {Ke Alexander Wang and Jiaxin Shi and Emily B. Fox},
    year          = {2025},
    eprint        = {2501.12352},
    archivePrefix = {arXiv},
    primaryClass  = {cs.LG},
    url           = {https://arxiv.org/abs/2501.12352},
}
```