
Memory-Augmented Sequence Models in PyTorch


🌌 OpenTitans

The Open-Source Framework for Memory-Augmented Sequence Models

License: MIT · Python 3.10+ · PyTorch 2.0+ · Maintained


Democratizing Test-Time Memorization and Neural Memory Architectures.

Introduction · Features · Quick Start · Usage · Citations

🌟 Introduction

OpenTitans is a modular, high-performance framework designed to implement and explore the next generation of sequence models. While Transformers revolutionized AI, their quadratic attention cost makes long contexts expensive, and memory-augmented architectures offer a way past that limit.

Inspired by groundbreaking research from Google and other top labs, OpenTitans focuses on Memory-Augmented Models that learn to memorize, optimize, and cache their internal states at test time. Our goal is to provide a "HuggingFace-like" experience for researchers and engineers building the future of infinite-context modeling.
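The core idea of test-time memorization can be shown with a toy example: a tiny "memory" parameter is updated by gradient descent on an associative-recall loss while the model consumes its context, so recall improves during inference rather than during training. This is an illustrative sketch only; `ttt_memory` and its signature are hypothetical and not part of the OpenTitans API.

```python
# Toy sketch of test-time memorization (illustrative, not the OpenTitans API).
# A single linear memory weight w is optimized at inference time so that
# w * key ~= value for the (key, value) pairs seen in the context.

def ttt_memory(pairs, lr=0.1, steps=50):
    """Return a weight w fitted online to the streamed (key, value) pairs."""
    w = 0.0
    for key, value in pairs:              # stream of context tokens
        for _ in range(steps):            # inner optimization at test time
            grad = 2 * key * (w * key - value)   # d/dw of (w*k - v)^2
            w -= lr * grad
    return w

w = ttt_memory([(1.0, 3.0)])
print(round(w * 1.0, 2))  # the memory now recalls the value: 3.0
```

The same principle scales up in the papers cited below, where the memory is a small neural network and the inner update is driven by a "surprise" signal rather than plain squared error.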


🚀 Key Features

  • 🧠 Neural Memory Modules: Sophisticated implementations of memory systems that evolve during inference.
  • ⚡ Associative Scan: Optimized support for fast sequence processing, breaking the linear bottleneck.
  • 🛠️ Test-Time Training (TTT): Seamless integration of models that optimize their parameters as they consume context.
  • 🧩 Modular & Extensible: A clean, object-oriented API that allows you to swap memory models, poolers, and norms with ease.
  • 📊 Performance Benchmarks: Built-in tools to measure latency, throughput, and memory efficiency on modern GPUs.

📦 Quick Start

Installation

[!TIP] We recommend using a virtual environment (venv or conda) for the best experience.

pip install open-titans

Development Installation

# Clone the repository
git clone https://github.com/Neeze/OpenTitans.git
cd OpenTitans

# Install in editable mode with dependencies
pip install -e .

🤝 Contributing

We are looking for "Titans" to help us build! 🚀

Whether you want to implement a new paper, optimize a CUDA kernel, or just fix a typo, your contributions are welcome. Check out our CONTRIBUTING.md to get started.


📚 Citations & Acknowledgements

OpenTitans stands on the shoulders of giants. We acknowledge the authors of the following papers for their foundational work:

@misc{behrouz2024titanslearningmemorizetest,
      title={Titans: Learning to Memorize at Test Time}, 
      author={Ali Behrouz and Peilin Zhong and Vahab Mirrokni},
      year={2024},
      url={https://arxiv.org/abs/2501.00663}
}

@misc{behrouz2025atlaslearningoptimallymemorize,
      title={ATLAS: Learning to Optimally Memorize the Context at Test Time}, 
      author={Ali Behrouz and Zeman Li and Praneeth Kacham and Majid Daliri and Yuan Deng and Peilin Zhong and Meisam Razaviyayn and Vahab Mirrokni},
      year={2025},
      url={https://arxiv.org/abs/2505.23735}
}

@misc{behrouz2025itsconnectedjourneytesttime,
      title={It's All Connected: A Journey Through Test-Time Memorization, Attentional Bias, Retention, and Online Optimization}, 
      author={Ali Behrouz and Meisam Razaviyayn and Peilin Zhong and Vahab Mirrokni},
      year={2025},
      url={https://arxiv.org/abs/2504.13173}
}

📄 License

OpenTitans is released under the MIT License. See LICENSE for more details.
