Lifestream data analysis with PyTorch

Project description

pytorch-lifestream, or ptls, is a library built upon PyTorch for building embeddings of discrete event sequences using self-supervision. It can process terabyte-scale volumes of raw events such as game history events, clickstream data, purchase history, or card transactions.

It supports various methods of self-supervised training, adapted for event sequences:

  • Contrastive Learning for Event Sequences (CoLES)
  • Contrastive Predictive Coding (CPC)
  • Replaced Token Detection (RTD) from ELECTRA
  • Next Sequence Prediction (NSP) from BERT
  • Sequences Order Prediction (SOP) from ALBERT
  • Masked Language Model (MLM) from RoBERTa

It supports several types of encoders, including Transformer and RNN, as well as many types of self-supervised losses, including several variants of contrastive losses.
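The core idea behind CoLES-style contrastive pretraining can be illustrated without the library: random contiguous subsequences cut from the same user's event sequence serve as positive pairs, while subsequences from different users serve as negatives. The following is a minimal, library-independent sketch of that sampling step; the function name and parameters are illustrative and not the ptls API.

```python
import random

def sample_subsequences(seq, n_samples=2, min_len=3, max_len=8, rng=None):
    """Sample random contiguous subsequences of one event sequence.

    In CoLES-style training, subsequences drawn from the same sequence
    form positive pairs for the contrastive loss; subsequences drawn
    from other sequences in the batch act as negatives.
    """
    rng = rng or random.Random()
    out = []
    for _ in range(n_samples):
        length = rng.randint(min_len, min(max_len, len(seq)))
        start = rng.randint(0, len(seq) - length)
        out.append(seq[start:start + length])
    return out

events = list(range(20))  # a toy event sequence (e.g. transaction codes)
views = sample_subsequences(events, rng=random.Random(0))
```

Each sampled view is then fed through the sequence encoder, and the contrastive loss pulls embeddings of views from the same sequence together.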

Install from PyPi

pip install pytorch-lifestream

Install from source

# Ubuntu 20.04

sudo apt install python3.8 python3-venv
pip3 install pipenv

pipenv sync --dev # install packages exactly as specified in Pipfile.lock
pipenv shell
pytest

Demo notebooks

We provide demo notebooks, including:

  • Supervised model training notebook
  • Self-supervised training and embeddings for a downstream task notebook
  • Self-supervised embeddings in CatBoost notebook
  • Self-supervised training and fine-tuning notebook
  • Self-supervised TrxEncoder-only training with the Masked Language Model task and fine-tuning notebook
  • Pandas data preprocessing options notebook
  • PySpark and Parquet for data preprocessing notebook
  • Fast inference on a large dataset notebook
  • Supervised multilabel classification notebook
  • CoLES multimodal notebook

We also provide tutorials.

Docs

Documentation

Library description index

Experiments on public datasets

Experiments using pytorch-lifestream on several public event datasets are available in a separate repository.

PyTorch-LifeStream in ML Competitions

How to contribute

  1. Make your changes via a fork and pull request.
  2. Write unit tests for new code in ptls_tests.
  3. Run the unit tests with pytest.
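A unit test contributed to ptls_tests can follow plain pytest conventions: a module whose name starts with test_, containing functions that start with test_ and use bare assert statements. The function under test below is defined inline purely for illustration; it is an assumed example, not existing ptls code, and in a real contribution you would import the code under test from the library.

```python
# test_example.py -- a minimal pytest-style unit test.

def pad_sequence(seq, target_len, pad_value=0):
    """Right-pad a list of event codes to a fixed length (illustrative helper)."""
    return seq + [pad_value] * (target_len - len(seq))

def test_pad_sequence():
    # pytest discovers test_* functions and reports failed asserts.
    assert pad_sequence([1, 2, 3], 5) == [1, 2, 3, 0, 0]
    assert pad_sequence([1, 2, 3], 3) == [1, 2, 3]
```

Running `pytest` from the repository root then collects and executes such tests automatically.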

Citation

We have a paper; you can cite it:

@inproceedings{sakhno2025pytorch,
  title={PyTorch-Lifestream: Learning Embeddings on Discrete Event Sequences},
  author={Sakhno, Artem and Kireev, Ivan and Babaev, Dmitrii and Savchenko, Maxim and Gusev, Gleb and Savchenko, Andrey},
  booktitle={Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence},
  pages={11104--11108},
  year={2025}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch_lifestream-0.7.0.tar.gz (190.6 kB)

File details

Details for the file pytorch_lifestream-0.7.0.tar.gz.

File metadata

  • Download URL: pytorch_lifestream-0.7.0.tar.gz
  • Upload date:
  • Size: 190.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for pytorch_lifestream-0.7.0.tar.gz
Algorithm Hash digest
SHA256 bdc88eea69a7db96a1a8df5055e2da0bbe2cc05944ac95fd48eb600a9002e3fc
MD5 7a5ce7aba176891a88563f2ac13c309a
BLAKE2b-256 b0274bf8c7cbe567223599d442a984935c2422369d57e52c30a9d5204471af11

See more details on using hashes here.
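The published SHA256 digest can be verified locally with Python's standard hashlib module. This sketch assumes the archive has been downloaded to the current directory; the function name is illustrative.

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Compute the SHA256 hex digest of a file, streaming it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "bdc88eea69a7db96a1a8df5055e2da0bbe2cc05944ac95fd48eb600a9002e3fc"
# digest = sha256_of_file("pytorch_lifestream-0.7.0.tar.gz")
# assert digest == expected, "archive does not match the published hash"
```

Alternatively, pip's hash-checking mode (`pip install --require-hashes -r requirements.txt`) performs the same verification during installation.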
