
Beta-RecSys: Build, Evaluate and Tune Automated Recommender Systems

Project description

|Installation | Quick Start | Documentation | Contributing | Getting help | Citation|



Beta-RecSys is an open source project for building, evaluating and tuning automated recommender systems. It aims to provide a practical data toolkit for building end-to-end recommendation systems in a standardized way. It provides means for dataset preparation and splitting using common strategies, a generalized model engine for implementing recommender models using PyTorch with many models available out of the box, and a unified training, validation, tuning and testing pipeline. Furthermore, Beta-RecSys is designed to be both modular and extensible, enabling new models to be quickly added to the framework. It is deployable in a wide range of environments via pre-built Docker containers and supports distributed parameter tuning using Ray.

Installation

conda

If you use conda, you can install it with:

conda install beta-rec

pip

If you use pip, you can install it with:

pip install beta-rec
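
After installation, you can confirm that the package is importable (beta_rec is the import name used throughout the Quick Start below):

import beta_rec
print(beta_rec.__file__)  # prints where the package was installed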

Docker

We also provide a Docker image so you can run this project on any platform. To use the image:

  1. Pull image from Docker Hub

    docker pull betarecsys/beta-recsys:latest
    
  2. Start a Docker container from this image (make sure port 8888 is available on your local machine, or change the port mapping in the command)

    docker run -ti --name beta-recsys -p 8888:8888 -d betarecsys/beta-recsys:latest
    
  3. Open Jupyter in a browser at this URL:

    http://localhost:8888
    
  4. Enter root as the password for the notebook.

Quick Start

Downloading and Splitting Datasets

from beta_rec.datasets.movielens import Movielens_100k
from beta_rec.data import BaseData
dataset = Movielens_100k()
split_dataset = dataset.load_leave_one_out(n_test=1)
data = BaseData(split_dataset)
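
As a quick sanity check, you can inspect the resulting splits. This is a minimal sketch assuming the splits are exposed as pandas DataFrames on the BaseData object (the same data.test list is indexed as data.test[0] when testing the model below):

print(data.train.head())  # first few training interactions
print(len(data.test))     # number of test folds (n_test=1 above)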

Training model with MatrixFactorization

config = {
    "config_file":"./configs/mf_default.json"
}
from beta_rec.recommenders import MatrixFactorization
model = MatrixFactorization(config)
model.train(data)
result = model.test(data.test[0])

where a default config JSON file ./configs/mf_default.json will be loaded for training the model.
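
To see which settings the default file contains before training, you can load it with the standard json module (this only uses the Python standard library and prints whatever ./configs/mf_default.json holds):

import json

with open("./configs/mf_default.json") as f:
    mf_config = json.load(f)
print(json.dumps(mf_config, indent=2))  # show the default training settings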

Tuning Model Hyper-parameters

config = {
    "config_file":"../configs/mf_default.json",
    "tune":True,
}
model = MatrixFactorization(config)
tune_result = model.train(data)

where the model will tune the hyper-parameters according to the specified tuning scheme (e.g. the default for MF).

Experiment with multiple models

from beta_rec.recommenders import MatrixFactorization
from beta_rec.experiment.experiment import Experiment

# Initialise recommenders with their default configuration file

config = {
    "config_file":"configs/mf_default.json"
}

mf_1 = MatrixFactorization(config)
mf_2 = MatrixFactorization(config)

# Run experiments of the recommenders on the selected dataset

Experiment(
  datasets=[data],
  models=[mf_1, mf_2],
).run()

where both recommenders are trained and evaluated on the selected dataset.
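
To compare different settings rather than two identically configured recommenders, each model can be given its own config dict. The second config file name below is hypothetical and only illustrates the pattern:

mf_default = MatrixFactorization({"config_file": "configs/mf_default.json"})
mf_custom = MatrixFactorization({"config_file": "configs/mf_custom.json"})  # hypothetical config file

Experiment(
  datasets=[data],
  models=[mf_default, mf_custom],
).run()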

Models

The following is a list of recommender models currently available in the repository, or to be implemented soon.

General Models

Model | Paper | Colab
MF | Neural Collaborative Filtering vs. Matrix Factorization Revisited, arXiv 2020 | Example In Colab
GMF | Generalized Matrix Factorization, in Neural Collaborative Filtering, WWW 2017 |
MLP | Multi-Layer Perceptron, in Neural Collaborative Filtering, WWW 2017 |
NCF | Neural Collaborative Filtering, WWW 2017 | Example In Colab
CMN | Collaborative memory network for recommendation systems, SIGIR 2018 |
NGCF | Neural graph collaborative filtering, SIGIR 2019 | Example In Colab
LightGCN | LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation, SIGIR 2020 | Example In Colab
LCF | Graph Convolutional Network for Recommendation with Low-pass Collaborative Filters |
VAECF | Variational autoencoders for collaborative filtering, WWW 2018 |

Sequential Models

Model | Paper | Colab
NARM | Neural Attentive Session-based Recommendation, CIKM 2017 |
Caser | Personalized Top-N Sequential Recommendation via Convolutional Sequence Embedding, WSDM 2018 |
GRU4Rec | Session-based recommendations with recurrent neural networks, ICLR 2016 |
SASRec | Self-attentive sequential recommendation, ICDM 2018 | Example In Colab
MARank | Multi-Order Attentive Ranking Model for Sequential Recommendation, AAAI 2019 |
NextItNet | A Simple Convolutional Generative Network for Next Item Recommendation, WSDM 2019 |
BERT4Rec | BERT4Rec: Sequential recommendation with bidirectional encoder representations from transformer, CIKM 2019 |
TiSASRec | Time Interval Aware Self-Attention for Sequential Recommendation, WSDM 2020 |

Recommendation Models with Auxiliary information

Baskets/Sessions

Model | Paper | Colab
Triple2vec | Representing and recommending shopping baskets with complementarity, compatibility and loyalty, CIKM 2018 | Example In Colab
VBCAR | Variational Bayesian Context-aware Representation for Grocery Recommendation, arXiv 2019 | Example In Colab
T-VBR | |

Knowledge Graph

If you want your model to be implemented by our maintenance team (or by yourself), please submit an issue following our community instructions.

Recent change logs: see version release.

Contributing

This project welcomes contributions and suggestions. Please make sure to read the Contributing Guide before creating a pull request.

Community meeting

  • Meeting time: Saturday, 1:30–2:30 pm (UTC+0) / 9:30–10:30 pm (UTC+8). Add Event
  • Meeting minutes: notes
  • Meeting recordings: recording links can be found in each meeting note.

Discussion channels

  • Slack
  • Mailing list: TBC

Citation

If you use Beta-RecSys in your research, we would appreciate citations to the following paper:

@inproceedings{meng2020beta,
  title={BETA-Rec: Build, Evaluate and Tune Automated Recommender Systems},
  author={Meng, Zaiqiao and McCreadie, Richard and Macdonald, Craig and Ounis, Iadh and Liu, Siwei and Wu, Yaxiong and Wang, Xi and Liang, Shangsong and Liang, Yucheng and Zeng, Guangtao and others},
  booktitle={Fourteenth ACM Conference on Recommender Systems},
  pages={588--590},
  year={2020}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

beta_rec-0.3.2.tar.gz (114.2 kB)

Uploaded Source

Built Distribution

beta_rec-0.3.2-py3-none-any.whl (178.1 kB)

Uploaded Python 3

File details

Details for the file beta_rec-0.3.2.tar.gz.

File metadata

  • Download URL: beta_rec-0.3.2.tar.gz
  • Upload date:
  • Size: 114.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.0.0.post20200309 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for beta_rec-0.3.2.tar.gz:

Algorithm | Hash digest
SHA256 | 6d8c68163b75602b716cad866355963695bd26f6a12ad5699a96911d9eb7beb5
MD5 | 24591e8a2e7269b35da0c55fb74a4950
BLAKE2b-256 | ca96272cc37c0d1433cf7a8421bc0e6c9c9ff0805b7c1d9c70157443f680785d

See more details on using hashes here.
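
If you download the source distribution manually, you can check it against the published SHA256 digest with Python's standard hashlib module (adjust the file path to wherever the archive was saved):

import hashlib

expected = "6d8c68163b75602b716cad866355963695bd26f6a12ad5699a96911d9eb7beb5"
with open("beta_rec-0.3.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "hash mismatch")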

File details

Details for the file beta_rec-0.3.2-py3-none-any.whl.

File metadata

  • Download URL: beta_rec-0.3.2-py3-none-any.whl
  • Upload date:
  • Size: 178.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.0.0.post20200309 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for beta_rec-0.3.2-py3-none-any.whl:

Algorithm | Hash digest
SHA256 | 4604c6faaa1ab9807f6f52805a051a85f1ecf59adf890cf8c6e86448e775cc9e
MD5 | 77ce8deca054c49c685da5d3192ce866
BLAKE2b-256 | eb7659b27bc21da4ed0ca6f27353f73134de4aa519ea12e23de58cb0511e906d

See more details on using hashes here.
