
ToR[e]cSys


News

We are happy to see the release of the new TensorFlow Recommenders package.


ToR[e]cSys is a PyTorch framework for implementing recommendation system algorithms, including but not limited to click-through-rate (CTR) prediction, learning-to-rank (LTR), and matrix/tensor embedding. The project objective is to develop an ecosystem for experimenting with, sharing, reproducing, and deploying recommendation algorithms in the real world in a smooth and easy way (hope it can be done).

Installation

TBU

Documentation

The complete documentation for ToR[e]cSys is available on the ReadTheDocs website.
Thank you, ReadTheDocs! You are the best!

Implemented Models

1. Subsampling

| Model Name | Research Paper | Year |
| --- | --- | --- |
| Word2Vec | Omer Levy et al., 2015. Improving Distributional Similarity with Lessons Learned from Word Embeddings | 2015 |
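
For context, the subsampling scheme analysed in that paper (introduced with word2vec) discards frequent tokens at random: a token with relative corpus frequency f survives with probability about sqrt(t / f) for a threshold t around 1e-5. A minimal sketch of the idea (the function names are illustrative, not the ToR[e]cSys API):

```python
import random

def keep_prob(freq: float, t: float = 1e-5) -> float:
    # survival probability: frequent tokens are kept with probability
    # ~ sqrt(t / freq), while rare tokens are always kept
    return min(1.0, (t / freq) ** 0.5)

def subsample(tokens: list, freqs: dict, t: float = 1e-5) -> list:
    # randomly drop frequent tokens before training the embedding model
    return [w for w in tokens if random.random() < keep_prob(freqs[w], t)]
```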

2. Negative Sampling

| Model Name | Research Paper | Year |
| --- | --- | --- |
| TBU | | |

3. Click-through Rate (CTR) Model

| Model Name | Research Paper | Year |
| --- | --- | --- |
| Logistic Regression | / | / |
| Factorization Machine | Steffen Rendle, 2010. Factorization Machines | 2010 |
| Factorization Machine Supported Neural Network | Weinan Zhang et al., 2016. Deep Learning over Multi-field Categorical Data: A Case Study on User Response Prediction | 2016 |
| Field-aware Factorization Machine | Yuchin Juan et al., 2016. Field-aware Factorization Machines for CTR Prediction | 2016 |
| Product Neural Network | Yanru Qu et al., 2016. Product-based Neural Networks for User Response Prediction | 2016 |
| Attentional Factorization Machine | Jun Xiao et al., 2017. Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks | 2017 |
| Deep and Cross Network | Ruoxi Wang et al., 2017. Deep & Cross Network for Ad Click Predictions | 2017 |
| Deep Factorization Machine | Huifeng Guo et al., 2017. DeepFM: A Factorization-Machine based Neural Network for CTR Prediction | 2017 |
| Neural Collaborative Filtering | Xiangnan He et al., 2017. Neural Collaborative Filtering | 2017 |
| Neural Factorization Machine | Xiangnan He et al., 2017. Neural Factorization Machines for Sparse Predictive Analytics | 2017 |
| eXtreme Deep Factorization Machine | Jianxun Lian et al., 2018. xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems | 2018 |
| Deep Field-aware Factorization Machine | Junlin Zhang et al., 2019. FAT-DeepFFM: Field Attentive Deep Field-aware Factorization Machine | 2019 |
| Deep Matching Correlation Prediction | Wentao Ouyang et al., 2019. Representation Learning-Assisted Click-Through Rate Prediction | 2019 |
| Deep Session Interest Network | Yufei Feng et al., 2019. Deep Session Interest Network for Click-Through Rate Prediction | 2019 |
| Elaborated Entire Space Supervised Multi-Task Model | Hong Wen et al., 2019. Conversion Rate Prediction via Post-Click Behaviour Modeling | 2019 |
| Entire Space Multi-Task Model | Xiao Ma et al., 2018. Entire Space Multi-Task Model: An Effective Approach for Estimating Post-Click Conversion Rate | 2018 |
| Field Attentive Deep Field-aware Factorization Machine | Junlin Zhang et al., 2019. FAT-DeepFFM: Field Attentive Deep Field-aware Factorization Machine | 2019 |
| Position-bias Aware Learning Framework | Huifeng Guo et al., 2019. PAL: A Position-bias Aware Learning Framework for CTR Prediction in Live Recommender Systems | 2019 |
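
Most of this family extends the second-order Factorization Machine (Rendle, 2010), which scores a feature vector x with a global bias, linear weights, and factorized pairwise interactions:

```latex
\hat{y}(x) = w_0 + \sum_{i=1}^{n} w_i x_i
           + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j
```

Each feature i owns a k-dimensional latent vector v_i, so interactions between feature pairs that never co-occur in training can still be estimated.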

4. Embedding Model

| Model Name | Research Paper | Year |
| --- | --- | --- |
| Matrix Factorization | / | / |
| StarSpace | Ledell Wu et al., 2017. StarSpace: Embed All The Things! | 2017 |
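
For reference, plain Matrix Factorization (the first row above) models an interaction as the dot product of latent user and item vectors, typically fitted by minimizing squared error over the observed entries:

```latex
\hat{r}_{ui} = \mathbf{p}_u^{\top} \mathbf{q}_i
```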

5. Learning-to-Rank (LTR) Model

Model Name Research Paper Year
Personalized Re-ranking Model Changhua Pei et al, 2019. Personalized Re-ranking for Recommendation 2019

Getting Started

There are several ways to use ToR[e]cSys to develop a recommendation system. Before discussing them, we first need to introduce the components of ToR[e]cSys.

A model in ToR[e]cSys is mainly constructed from two parts, inputs and model, which are wrapped into a sequential module (torecsys.models.sequential) to be trained by the Trainer (torecsys.trainer.Trainer).

The inputs module (torecsys.inputs) handles most kinds of inputs used in recommendation systems, such as categorical features and images, with several methods, including token embedding and pre-trained image models.

The models module (torecsys.models) implements well-known models in recommendation systems, such as the Factorization Machine family, and I hope to keep making the library richer. To construct these models, in addition to the modules implemented in PyTorch, a number of layers frequently called by models are implemented in torecsys.layers.
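
To make this two-part structure concrete, here is a toy sketch of the same split in plain PyTorch (an illustration, not the ToR[e]cSys API): one module embeds raw categorical ids, the other scores the embedded fields with factorization-machine-style pairwise interactions.

```python
import torch
import torch.nn as nn

class ToyInputs(nn.Module):
    # "inputs" part: maps raw categorical ids to dense embeddings
    def __init__(self, num_items: int, embed_size: int):
        super().__init__()
        self.embedding = nn.Embedding(num_items, embed_size)

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        return self.embedding(ids)  # (batch, num_fields, embed_size)

class ToyModel(nn.Module):
    # "model" part: sums pairwise dot products between field embeddings,
    # using the identity 2 * sum_{i<j} <v_i, v_j> = (sum v)^2 - sum (v^2)
    def forward(self, emb: torch.Tensor) -> torch.Tensor:
        square_of_sum = emb.sum(dim=1) ** 2
        sum_of_square = (emb ** 2).sum(dim=1)
        return 0.5 * (square_of_sum - sum_of_square).sum(dim=1)

inputs, model = ToyInputs(100, 8), ToyModel()
scores = model(inputs(torch.randint(0, 100, (4, 2))))  # batch of 4, 2 fields
```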

With the components explained, let's move on to Getting Started. You can use ToR[e]cSys in the following ways:

1. Run by command-line (in development)

   ```bash
   torecsys build --inputs_config='{}' \
       --model_config='{"method": "FM", "embed_size": 8, "num_fields": 2}' \
       --regularizer_config='{"weight_decay": 0.1}' \
       --criterion_config='{"method": "MSELoss"}' \
       --optimizer_config='{"method": "SGD", "lr": "0.01"}' \
       ...
   ```
    <ol start="2">
    <li>
    <p>Run by class method</p>
    <pre lang="python"><code>
    

import torecsys as trs

build trainer by class method

trainer = trs.trainer.Trainer()
.bind_objective("CTR")
.set_inputs()
.set_model(method="FM", embed_size=8, num_fields=2)
.set_sequential()
.set_regularizer(weight_decay=0.1)
.build_criterion(method="MSELoss")
.build_optimizer(method="SGD", lr="0.01")
.build_loader(name="train", ...)
.build_loader(name="eval", ...)
.set_targets_name("labels")
.set_max_num_epochs(10)
.use_cuda()

start to fit the model

trainer.fit() ```

3. Run like a PyTorch module

   ```python
   import torch
   import torch.nn as nn
   import torecsys as trs

   # some codes here

   inputs = trs.inputs.InputsWrapper(schema=schema)
   model = trs.models.FactorizationMachineModel(embed_size=8, num_fields=2)

   for i in range(epochs):
       optimizer.zero_grad()
       outputs = model(**inputs(batches))
       loss = criterion(outputs, labels)
       loss.backward()
       optimizer.step()
   ```
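
The example above elides its setup ("some codes here"). A minimal sketch of the missing pieces, assuming a mean-squared-error objective and the SGD settings from the class-method example, and that both the inputs wrapper and the model are torch.nn.Module subclasses:

```python
import torch
import torch.nn as nn

# assumed setup for the elided part of the example above: optimize the
# parameters of the inputs wrapper and the model jointly
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(
    list(inputs.parameters()) + list(model.parameters()),
    lr=0.01,
)
epochs = 10  # mirrors set_max_num_epochs(10) in the class-method example
```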
(In development) You can use whichever approach you like to train a recommender system, and then serve it in the following ways:
1. Run by command-line

   ```bash
   torecsys serve --load_from='{}'
   ```

2. Run by class method

   ```python
   import torecsys as trs

   serving = (
       trs.serving.Model()
       .load_from(filepath=filepath)
       .run()
   )
   ```
    <ol start="3">
    <li>
    <p>Serve it yourself</p>
    <pre lang="python"><code>
    

from flask import Flask, request import torecsys as trs

model = trs.serving.Model()
.load_from(filepath=filepath)

@app.route("/predict") def predict(): args = request.json inference = model.predict(args) return inference, 200

if name == "main": app.run() ```
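
A hypothetical client for the route above, assuming the server runs locally on Flask's default port; the payload keys are placeholders and depend on the schema the model was trained with:

```python
import requests

# send a JSON payload to the /predict endpoint defined above
response = requests.post(
    "http://127.0.0.1:5000/predict",
    json={"user_id": 1, "item_id": 42},  # placeholder feature values
)
print(response.status_code, response.text)
```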

For further details, please refer to the examples in the repository or read the documentation. Hope you enjoy~

Examples

TBU

Sample Codes

TBU

Sample of Experiments

TBU

Authors

License

ToR[e]cSys is MIT-style licensed, as found in the LICENSE file.
