
A configurable, tunable, and reproducible library for CTR prediction

Project description


Click-through rate (CTR) prediction is a critical task for various industrial applications such as online advertising, recommender systems, and sponsored search. FuxiCTR provides an open-source library for CTR prediction, with key features in configurability, tunability, and reproducibility. We hope this project can promote reproducible research and benefit both researchers and practitioners in this field.

Key Features

  • Configurable: Both data preprocessing and models are modularized and configurable (see the configuration sketch after this list).

  • Tunable: Models can be automatically tuned through easy configurations.

  • Reproducible: All the benchmarks can be easily reproduced.

  • Extensible: New models can be easily added, with support for both PyTorch and TensorFlow frameworks.
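
For illustration, the sketch below shows what a pair of dataset and model configs might look like. The exact keys and values here are assumptions modeled on the config files shipped with the models in model_zoo; consult those files for the authoritative format.

    # dataset_config.yaml (hypothetical dataset_id and file paths)
    tiny_example:
        data_root: ./data/
        data_format: csv
        train_data: ./data/tiny_example/train.csv
        valid_data: ./data/tiny_example/valid.csv
        test_data: ./data/tiny_example/test.csv
        min_categr_count: 1
        feature_cols:
            - {name: user_id, active: True, dtype: str, type: categorical}
            - {name: item_id, active: True, dtype: str, type: categorical}
        label_col: {name: label, dtype: float}

    # model_config.yaml (hypothetical experiment id)
    DeepFM_tiny_example:
        model: DeepFM
        dataset_id: tiny_example
        loss: binary_crossentropy
        metrics: [logloss, AUC]
        task: binary_classification
        optimizer: adam
        learning_rate: 1.e-3
        embedding_dim: 16
        batch_size: 128
        epochs: 10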

Model Zoo

No | Publication | Model | Paper | Benchmark | Version
(A ✓ in the Benchmark column indicates that a reproducible benchmark setting is available; names in brackets mark the industrial affiliation reported with the paper.)

Feature Interaction Models
1 | WWW'07 | LR | Predicting Clicks: Estimating the Click-Through Rate for New Ads [Microsoft] | ✓ | torch
2 | ICDM'10 | FM | Factorization Machines | ✓ | torch
3 | CIKM'13 | DSSM | Learning Deep Structured Semantic Models for Web Search using Clickthrough Data [Microsoft] | ✓ | torch
4 | CIKM'15 | CCPM | A Convolutional Click Prediction Model | ✓ | torch
5 | RecSys'16 | FFM | Field-aware Factorization Machines for CTR Prediction [Criteo] | ✓ | torch
6 | RecSys'16 | DNN | Deep Neural Networks for YouTube Recommendations [Google] | ✓ | torch, tf
7 | DLRS'16 | Wide&Deep | Wide & Deep Learning for Recommender Systems [Google] | ✓ | torch, tf
8 | ICDM'16 | PNN | Product-based Neural Networks for User Response Prediction | ✓ | torch
9 | KDD'16 | DeepCrossing | Deep Crossing: Web-Scale Modeling without Manually Crafted Combinatorial Features [Microsoft] | ✓ | torch
10 | NIPS'16 | HOFM | Higher-Order Factorization Machines | ✓ | torch
11 | IJCAI'17 | DeepFM | DeepFM: A Factorization-Machine based Neural Network for CTR Prediction [Huawei] | ✓ | torch, tf
12 | SIGIR'17 | NFM | Neural Factorization Machines for Sparse Predictive Analytics | ✓ | torch
13 | IJCAI'17 | AFM | Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks | ✓ | torch
14 | ADKDD'17 | DCN | Deep & Cross Network for Ad Click Predictions [Google] | ✓ | torch, tf
15 | WWW'18 | FwFM | Field-weighted Factorization Machines for Click-Through Rate Prediction in Display Advertising [Oath, TouchPal, LinkedIn, Alibaba] | ✓ | torch
16 | KDD'18 | xDeepFM | xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems [Microsoft] | ✓ | torch
17 | CIKM'19 | FiGNN | FiGNN: Modeling Feature Interactions via Graph Neural Networks for CTR Prediction | ✓ | torch
18 | CIKM'19 | AutoInt/AutoInt+ | AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks | ✓ | torch
19 | RecSys'19 | FiBiNET | FiBiNET: Combining Feature Importance and Bilinear feature Interaction for Click-Through Rate Prediction [Sina Weibo] | ✓ | torch
20 | WWW'19 | FGCNN | Feature Generation by Convolutional Neural Network for Click-Through Rate Prediction [Huawei] | ✓ | torch
21 | AAAI'19 | HFM/HFM+ | Holographic Factorization Machines for Recommendation | ✓ | torch
22 | Arxiv'19 | DLRM | Deep Learning Recommendation Model for Personalization and Recommendation Systems [Facebook] | ✓ | torch
23 | NeuralNetworks'20 | ONN | Operation-aware Neural Networks for User Response Prediction | ✓ | torch, tf
24 | AAAI'20 | AFN/AFN+ | Adaptive Factorization Network: Learning Adaptive-Order Feature Interactions | ✓ | torch
25 | AAAI'20 | LorentzFM | Learning Feature Interactions with Lorentzian Factorization [eBay] | ✓ | torch
26 | WSDM'20 | InterHAt | Interpretable Click-through Rate Prediction through Hierarchical Attention [NEC Labs, Google] | ✓ | torch
27 | DLP-KDD'20 | FLEN | FLEN: Leveraging Field for Scalable CTR Prediction [Tencent] | ✓ | torch
28 | CIKM'20 | DeepIM | Deep Interaction Machine: A Simple but Effective Model for High-order Feature Interactions [Alibaba, RealAI] | ✓ | torch
29 | WWW'21 | FmFM | FM^2: Field-matrixed Factorization Machines for Recommender Systems [Yahoo] | ✓ | torch
30 | WWW'21 | DCN-V2 | DCN V2: Improved Deep & Cross Network and Practical Lessons for Web-scale Learning to Rank Systems [Google] | ✓ | torch
31 | CIKM'21 | DESTINE | Disentangled Self-Attentive Neural Networks for Click-Through Rate Prediction [Alibaba] | ✓ | torch
32 | CIKM'21 | EDCN | Enhancing Explicit and Implicit Feature Interactions via Information Sharing for Parallel Deep CTR Models [Huawei] | ✓ | torch
33 | DLP-KDD'21 | MaskNet | MaskNet: Introducing Feature-Wise Multiplication to CTR Ranking Models by Instance-Guided Mask [Sina Weibo] | ✓ | torch
34 | SIGIR'21 | SAM | Looking at CTR Prediction Again: Is Attention All You Need? [BOSS Zhipin] | ✓ | torch
35 | KDD'21 | AOANet | Architecture and Operation Adaptive Network for Online Recommendations [Didi Chuxing] | ✓ | torch
36 | AAAI'23 | FinalMLP | FinalMLP: An Enhanced Two-Stream MLP Model for CTR Prediction [Huawei] | ✓ | torch
37 | SIGIR'23 | FinalNet | FINAL: Factorized Interaction Layer for CTR Prediction [Huawei] | ✓ | torch
38 | SIGIR'23 | EulerNet | EulerNet: Adaptive Feature Interaction Learning via Euler's Formula for CTR Prediction [Huawei] | ✓ | torch
39 | CIKM'23 | GDCN | Towards Deeper, Lighter and Interpretable Cross Network for CTR Prediction [Microsoft] |  | torch
40 | ICML'24 | WuKong | Wukong: Towards a Scaling Law for Large-Scale Recommendation [Meta] |  | torch
41 | Arxiv'24 | DCNv3 | DCNv3: Towards Next Generation Deep Cross Network for Click-Through Rate Prediction | ✓ | torch

Behavior Sequence Modeling
42 | KDD'18 | DIN | Deep Interest Network for Click-Through Rate Prediction [Alibaba] | ✓ | torch
43 | AAAI'19 | DIEN | Deep Interest Evolution Network for Click-Through Rate Prediction [Alibaba] | ✓ | torch
44 | DLP-KDD'19 | BST | Behavior Sequence Transformer for E-commerce Recommendation in Alibaba [Alibaba] | ✓ | torch
45 | CIKM'20 | DMIN | Deep Multi-Interest Network for Click-through Rate Prediction [Alibaba] | ✓ | torch
46 | AAAI'20 | DMR | Deep Match to Rank Model for Personalized Click-Through Rate Prediction [Alibaba] | ✓ | torch
47 | DLP-KDD'22 | ETA | Efficient Long Sequential User Data Modeling for Click-Through Rate Prediction [Alibaba] |  | torch
48 | CIKM'22 | SDIM | Sampling Is All You Need on Modeling Long-Term User Behaviors for CTR Prediction [Meituan] |  | torch
49 | KDD'23 | TransAct | TransAct: Transformer-based Realtime User Action Model for Recommendation at Pinterest [Pinterest] | ✓ | torch

Dynamic Weight Network
50 | NeurIPS'22 | APG | APG: Adaptive Parameter Generation Network for Click-Through Rate Prediction [Alibaba] | ✓ | torch
51 | KDD'23 | PPNet | PEPNet: Parameter and Embedding Personalized Network for Infusing with Personalized Prior Information [KuaiShou] | ✓ | torch

Multi-Task Modeling
52 | Arxiv'17 | ShareBottom | An Overview of Multi-Task Learning in Deep Neural Networks |  | torch
53 | KDD'18 | MMoE | Modeling Task Relationships in Multi-task Learning with Multi-Gate Mixture-of-Experts [Google] |  | torch
54 | RecSys'20 | PLE | Progressive Layered Extraction (PLE): A Novel Multi-Task Learning (MTL) Model for Personalized Recommendations [Tencent] |  | torch

Multi-Domain Modeling
55 | KDD'23 | PEPNet | PEPNet: Parameter and Embedding Personalized Network for Infusing with Personalized Prior Information [KuaiShou] |  | torch

Benchmarking

We have benchmarked FuxiCTR models on a set of open datasets. The detailed results and reproducing steps are available in the BARS benchmark (see step 3 of the Quick Start below).

Dependencies

FuxiCTR has the following dependencies:

  • Python 3.9+
  • PyTorch 1.10+ (required only for PyTorch models)
  • TensorFlow 2.1+ (required only for TensorFlow models)

Please install other required packages via pip install -r requirements.txt.
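
For example, the released package can be installed directly from PyPI, or the dependencies can be installed from a source checkout (the package name below matches this PyPI project):

    pip install fuxictr                 # install the released package from PyPI
    pip install -r requirements.txt     # or install dependencies from a source checkout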

Quick Start

  1. Run the demo examples

    Examples are provided in the demo directory to show the basic usage of FuxiCTR. Users can run these examples to get started quickly and to understand the workflow.

    cd demo
    python example1_build_dataset_to_parquet.py
    python example2_DeepFM_with_parquet_input.py
    
  2. Run a model on tiny data

    Users can run any model in the model zoo following the commands below, which demonstrate running DCN. In addition, users can modify the dataset config and model config files to run on their own datasets or with new hyper-parameters. More details can be found in the README.

    cd model_zoo/DCN/DCN_torch
    python run_expid.py --expid DCN_test --gpu 0
    
    # Replace MODEL_PATH and MODEL_test with the target model name
    cd model_zoo/MODEL_PATH
    python run_expid.py --expid MODEL_test --gpu 0
    
  3. Run a model on benchmark datasets (e.g., Criteo)

    Users can follow the Benchmarking section above to obtain the benchmark datasets and the steps for reproducing the existing results. See an example here: https://github.com/reczoo/BARS/tree/main/ranking/ctr/DCNv2/DCNv2_criteo_x1

  4. Implement a new model

    The FuxiCTR library is designed to be modular, so that every component can be overridden by users according to their needs. In many cases, only the model class needs to be implemented for a new customized model. If the data preprocessing or data loader is not directly applicable, one can also overwrite them through the core APIs. A concrete example is the implementation of our new model FinalMLP, published at AAAI 2023; a minimal sketch of a custom model class is shown below.
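
    The following sketch is modeled on the existing implementations in model_zoo. The helper names used here (BaseModel, FeatureEmbedding, get_inputs, output_activation, compile, reset_parameters, model_to_device) follow the v2 PyTorch API as used by the bundled models, but they are assumptions to be verified against the repository; the model itself (an MLP over concatenated feature embeddings) is illustrative only.

        from torch import nn
        from fuxictr.pytorch.models import BaseModel
        from fuxictr.pytorch.layers import FeatureEmbedding

        class MyMLP(BaseModel):
            """Toy example: concatenate feature embeddings and feed them to an MLP."""
            def __init__(self, feature_map, model_id="MyMLP", gpu=-1,
                         learning_rate=1e-3, embedding_dim=16,
                         hidden_units=[64, 64], **kwargs):
                super(MyMLP, self).__init__(feature_map, model_id=model_id, gpu=gpu, **kwargs)
                self.embedding_layer = FeatureEmbedding(feature_map, embedding_dim)
                input_dim = feature_map.num_fields * embedding_dim
                layers = []
                for units in hidden_units:
                    layers += [nn.Linear(input_dim, units), nn.ReLU()]
                    input_dim = units
                layers.append(nn.Linear(input_dim, 1))
                self.mlp = nn.Sequential(*layers)
                self.compile(kwargs["optimizer"], kwargs["loss"], learning_rate)
                self.reset_parameters()
                self.model_to_device()

            def forward(self, inputs):
                X = self.get_inputs(inputs)            # dict of input feature tensors
                feature_emb = self.embedding_layer(X)  # [batch, num_fields, embedding_dim]
                y_pred = self.mlp(feature_emb.flatten(start_dim=1))
                y_pred = self.output_activation(y_pred)  # sigmoid for binary CTR prediction
                return {"y_pred": y_pred}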

  5. Tune hyper-parameters of a model

    FuxiCTR currently supports fast grid search over a model's hyper-parameters using multiple GPUs. The following example launches a grid search of 8 experiments on 4 GPUs.

    cd experiment
    python run_param_tuner.py --config config/DCN_tiny_parquet_tuner_config.yaml --gpu 0 1 2 3 0 1 2 3
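
    The tuner config referenced above couples a base experiment with a grid of values to sweep. The layout below is an assumed sketch based on the tuner configs shipped under experiment/config; the keys (base_config, base_expid, dataset_id, tuner_space) and all values are illustrative and should be checked against DCN_tiny_parquet_tuner_config.yaml. Note that three swept parameters with two candidate values each expand to 2 x 2 x 2 = 8 experiments, matching the 8 runs spread over 4 GPUs in the command above.

        base_config: ../model_zoo/DCN/DCN_torch/config/
        base_expid: DCN_default
        dataset_id: tiny_parquet

        tuner_space:
            model_root: ./checkpoints/
            embedding_dim: 16
            dnn_hidden_units: [[64, 32], [128, 64]]
            learning_rate: [1.e-3, 5.e-4]
            net_dropout: [0, 0.1]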
    

🔥 Citation

If you find our code or benchmarks helpful in your research, please cite the following papers.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fuxictr-2.3.5.tar.gz (57.8 kB)

Uploaded Source

Built Distribution

fuxictr-2.3.5-py3-none-any.whl (88.1 kB)

Uploaded Python 3

File details

Details for the file fuxictr-2.3.5.tar.gz.

File metadata

  • Download URL: fuxictr-2.3.5.tar.gz
  • Size: 57.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for fuxictr-2.3.5.tar.gz
Algorithm Hash digest
SHA256 72cb298984e7dfa9205f7e270b1b27991104fe74ed6e44a08ccf6f2c9695b80f
MD5 d35450f0c22acd1a1d2e794eed3a0701
BLAKE2b-256 86344e913c632684aa7548a122f8b0fe4aa62eb4eb89795f56adb8e233f8cda8


Provenance

The following attestation bundles were made for fuxictr-2.3.5.tar.gz:

Publisher: pypi.yml on reczoo/FuxiCTR


File details

Details for the file fuxictr-2.3.5-py3-none-any.whl.

File metadata

  • Download URL: fuxictr-2.3.5-py3-none-any.whl
  • Size: 88.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for fuxictr-2.3.5-py3-none-any.whl
Algorithm Hash digest
SHA256 b54b6cc8a05a269ffa7d2727aaf26c6fb6920f0dbed73ff5ae6a3aae917245ba
MD5 e065d5351b7c501a050bb2faf1fd9655
BLAKE2b-256 2065dae503c73201e356d4a8f5c1683ca055d02aa0312e16c01b58862f7e0d75


Provenance

The following attestation bundles were made for fuxictr-2.3.5-py3-none-any.whl:

Publisher: pypi.yml on reczoo/FuxiCTR

