A general deep-learning algorithm toolkit for business, marketing, and advertising

Introduction to gbiz_torch

gbiz_torch is a comprehensive toolkit designed to help you accurately predict key metrics such as click-through rate (CTR), conversion rate (CVR), uplift, and pricing. Built with state-of-the-art algorithms and user-friendly interfaces, the package streamlines forecasting and decision-making so you can make data-driven choices with confidence. Whether you're looking to optimize marketing campaigns, boost sales conversions, or fine-tune your pricing model, the package provides the insights you need to succeed in today's competitive market.

Tutorial

You can learn to use the package by working through the examples in the ./example directory.

More solution examples will be released soon.
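As a quick start, installation and an import check look like the following (a minimal sketch; only the package name is taken from this page, and the actual module layout is shown in the ./example scripts):

```python
# Install from PyPI, then verify that the package imports.
#   pip install gbiz_torch
import gbiz_torch

print(gbiz_torch.__name__)  # submodule layout and real usage are shown in ./example
```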

Useful Evaluation Metrics

The following evaluation metrics have been implemented (a usage sketch follows the table):

| #  | Metric           | Explanation                                                    | Note                  |
|----|------------------|----------------------------------------------------------------|-----------------------|
| 1  | AUC              | Area Under the ROC Curve                                       | For Classification    |
| 2  | Confusion_Matrix | Confusion matrix, a performance measurement for classification | For Classification    |
| 3  | ACC_F1_score     | Accuracy, macro-F1 and weighted-F1                             | For Classification    |
| 4  | Top_K_Acc        | top_k_accuracy_score                                           | For Classification    |
| 5  | Multi_Class_RP   | Multi-class precision, recall and F-beta                       | For Classification    |
| 6  | r2_score         | R² score                                                       | For Regression        |
| 7  | MAE              | Mean Absolute Error                                            | For Regression        |
| 8  | MSE              | Mean Squared Error                                             | For Regression        |
| 9  | MAPE             | Mean Absolute Percentage Error                                 | For Regression        |
| 10 | tsne             | t-distributed stochastic neighbor embedding                    | For Manifold Learning |
| 11 | sp_emb           | Spectral decomposition of the corresponding graph Laplacian    | For Manifold Learning |
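The wrapper signatures of these metrics inside gbiz_torch are not shown on this page, so the sketch below illustrates what each listed metric computes using its scikit-learn counterpart (every name in the table maps to a standard scikit-learn function or estimator):

```python
# A minimal sketch of the metrics listed above via their scikit-learn
# counterparts; the exact gbiz_torch wrapper signatures may differ, so treat
# this only as an illustration of what each metric computes.
import numpy as np
from sklearn.metrics import (
    roc_auc_score, confusion_matrix, accuracy_score, f1_score,
    top_k_accuracy_score, precision_recall_fscore_support,
    r2_score, mean_absolute_error, mean_squared_error,
    mean_absolute_percentage_error,
)
from sklearn.manifold import TSNE, SpectralEmbedding

# Classification metrics on toy labels and scores
y_true = np.array([0, 1, 1, 0, 1])
y_prob = np.array([0.2, 0.8, 0.6, 0.4, 0.9])
y_pred = (y_prob > 0.5).astype(int)
print("AUC:", roc_auc_score(y_true, y_prob))
print("Confusion matrix:\n", confusion_matrix(y_true, y_pred))
print("ACC / macro-F1 / weighted-F1:",
      accuracy_score(y_true, y_pred),
      f1_score(y_true, y_pred, average="macro"),
      f1_score(y_true, y_pred, average="weighted"))
print("Top-1 accuracy:", top_k_accuracy_score(y_true, y_prob, k=1))
print("Precision / recall / F-beta:",
      precision_recall_fscore_support(y_true, y_pred, average="macro"))

# Regression metrics on toy targets
y_reg_true = np.array([1.0, 2.0, 3.0])
y_reg_pred = np.array([1.1, 1.9, 3.3])
print("R2:", r2_score(y_reg_true, y_reg_pred))
print("MAE:", mean_absolute_error(y_reg_true, y_reg_pred))
print("MSE:", mean_squared_error(y_reg_true, y_reg_pred))
print("MAPE:", mean_absolute_percentage_error(y_reg_true, y_reg_pred))

# Manifold embeddings (tsne / sp_emb) on a toy feature matrix
X = np.random.RandomState(0).rand(20, 5)
emb_tsne = TSNE(n_components=2, perplexity=5, random_state=0).fit_transform(X)
emb_spec = SpectralEmbedding(n_components=2, random_state=0).fit_transform(X)
print(emb_tsne.shape, emb_spec.shape)
```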

Universal Models in Commercial Algorithms

The following recommendation models are currently implemented (a training sketch follows the table):

| #  | Model          | Class Name        | Note                           |
|----|----------------|-------------------|--------------------------------|
| 1  | Wide and Deep  | WndModel          | Traditional recommendation     |
| 2  | DNN            | DNNModel          | Traditional recommendation     |
| 3  | DeepFM         | DeepFMModel       | Traditional recommendation     |
| 4  | Deep and Cross | DCNModel          | Traditional recommendation     |
| 5  | NFM            | NFMModel          | Traditional recommendation     |
| 6  | Tower          | TowerModel        | Traditional recommendation     |
| 7  | FLEN           | FLENModel         | Traditional recommendation     |
| 8  | FiBiNet        | FiBiNetModel      | Traditional recommendation     |
| 9  | InterHAt       | InterHAtModel     | Traditional recommendation     |
| 10 | CAN            | CANModel          | Traditional recommendation     |
| 11 | MaskNet        | MaskNetModel      | Traditional recommendation     |
| 12 | ContextNet     | ContextNetModel   | Traditional recommendation     |
| 13 | EDCN           | EDCNModel         | Traditional recommendation     |
| 14 | BertSeq        | Bert4RecModel     | Sequence recommendation        |
| 15 | GRU4Rec        | GRU4RecModel      | Sequence recommendation        |
| 16 | DIN            | DINModel          | Sequence recommendation        |
| 17 | DCAP           | DCAPModel         | Sequence recommendation        |
| 18 | FBAS           | FBASModel         | Sequence recommendation        |
| 19 | ESMM           | ESMMModel         | Multi-objective recommendation |
| 20 | MMoE           | GeneralMMoEModel  | Multi-objective recommendation |
| 21 | Hard Sharing   | HardSharingModel  | Multi-objective recommendation |
| 22 | Cross Sharing  | CrossSharingModel | Multi-objective recommendation |
| 23 | Cross Stitch   | CrossStitchModel  | Multi-objective recommendation |
| 24 | PLE            | PLEModel          | Multi-objective recommendation |
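The constructor signatures of these model classes are not documented on this page, so the following is only a minimal CTR-style training sketch: the gbiz_torch import path is hypothetical and left commented out, and a plain torch stand-in takes its place; ./example shows the real interfaces.

```python
# Minimal CTR-style training-loop sketch. In practice the model would be one of
# the classes listed above (e.g. DNNModel); their constructor signatures are not
# given here, so a plain torch stand-in is used and the gbiz_torch import below
# is a HYPOTHETICAL path -- see ./example for the actual arguments.
import torch
import torch.nn as nn

# from gbiz_torch.model import DNNModel   # hypothetical import path
# model = DNNModel(...)                   # real arguments shown in ./example

# Stand-in with the same assumed interface: dense features in, one logit out.
model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()             # CTR/CVR framed as binary classification

x = torch.randn(256, 16)                     # toy dense features
y = torch.randint(0, 2, (256, 1)).float()    # toy click labels

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    if step % 5 == 0:
        print(f"step {step}: loss={loss.item():.4f}")
```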

Universal Layers in Commercial Algorithms

The following layers are implemented in the toolkit. They can be conveniently called by the higher-level models above, or users can call them directly to assemble their own models (a composition sketch follows the tables).

| # | Graph-based Layer | Note |
|---|-------------------|------|
| 1 | HOMOGNNLayer      | General GNN layers for homogeneous graphs (GCNConv, GATConv, SAGEConv, TransformerConv, ARMAConv) |
| 2 | HETEGNNLayer      | General GNN layers for heterogeneous graphs (HGTConv, HANConv) |

| #  | Layer                        | Note                                                             |
|----|------------------------------|------------------------------------------------------------------|
| 1  | DNNLayer                     | DNN net                                                          |
| 2  | FMLayer                      | FM net in DeepFM and NFM                                         |
| 3  | CrossLayer                   | Cross net in Deep and Cross                                      |
| 4  | CINLayer                     | CIN net in xDeepFM                                               |
| 5  | MultiHeadAttentionLayer      | Multi-head attention in BERT                                     |
| 6  | SelfAttentionLayer           | Scaled dot-product self-attention in BERT                        |
| 7  | LayerNorm                    | Layer normalization in BERT                                      |
| 8  | PositionWiseFeedForwardLayer | Position-wise feed-forward in BERT                               |
| 9  | TransformerLayer             | Transformer block (multi-head attention plus LayerNorm) in BERT  |
| 10 | TransformerEncoder           | Stacked Transformer blocks in BERT                               |
| 11 | AutoIntLayer                 | Similar to TransformerLayer                                      |
| 12 | FuseLayer                    | Local activation unit in DIN                                     |
| 13 | SENETLayer                   | Squeeze-and-Excitation layer                                     |
| 14 | FieldWiseBiInteractionLayer  | FM and MF layers in FLEN                                         |
| 15 | CrossStitchLayer             | Cross-stitch networks for multi-task learning                    |
| 16 | GeneralMMoELayer             | Multi-gate Mixture-of-Experts for modeling task relationships in multi-task learning |
| 17 | Dice                         | Dice activation function                                         |
| 18 | PositionEncodingLayer        | Positional encoding layer in Transformer                         |
| 19 | CGCGatingNetworkLayer        | Task and expert net in PLE                                       |
| 20 | BiLinearInteractionLayer     | Last feature net in FiBiNet                                      |
| 21 | CoActionLayer                | Co-action unit layer in CAN                                      |
| 22 | MaskBlockLayer               | MaskBlock layer in MaskNet                                       |
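Layer constructor signatures are likewise not documented on this page; the sketch below only shows the general composition pattern, with plain torch modules standing in for the listed layers and a hypothetical gbiz_torch import left as a comment.

```python
# Sketch of assembling a custom model from building-block layers. Plain torch
# modules stand in for the gbiz_torch layers; the commented import path and any
# constructor arguments are hypothetical -- see ./example for the real interfaces.
import torch
import torch.nn as nn

# from gbiz_torch.layer import DNNLayer, SENETLayer   # hypothetical path

class MyCTRModel(nn.Module):
    """Compose a field-wise interaction block with a DNN head, the same pattern
    one would follow with e.g. SENETLayer + DNNLayer from the table above."""
    def __init__(self, n_fields: int, emb_dim: int):
        super().__init__()
        # stand-in for a field-reweighting layer (e.g. SENETLayer)
        self.field_gate = nn.Sequential(nn.Linear(n_fields, n_fields), nn.Sigmoid())
        # stand-in for DNNLayer
        self.dnn = nn.Sequential(
            nn.Linear(n_fields * emb_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, field_emb):                              # (batch, n_fields, emb_dim)
        weights = self.field_gate(field_emb.mean(-1))          # (batch, n_fields)
        reweighted = field_emb * weights.unsqueeze(-1)         # gate each field embedding
        return self.dnn(reweighted.flatten(1))                 # (batch, 1) logit

logits = MyCTRModel(n_fields=8, emb_dim=4)(torch.randn(32, 8, 4))
print(logits.shape)  # torch.Size([32, 1])
```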

Citation

If you find this code useful in your research, please cite it using the following BibTeX:

@software{
  Wang_gbiz_torch_A_comprehensive_2023,
  author = {Wang, Haowen},
  doi = {10.5281/zenodo.10222799},
  month = nov,
  title = {{gbiz_torch: A comprehensive toolkit for predicting key metrics in e-commercial fields}},
  url = {https://github.com/whw199833/gbiz_torch},
  version = {2.0.4},
  year = {2023}
}

or the following APA format:

Wang, H. (2023). gbiz_torch: A comprehensive toolkit for predicting key metrics in e-commercial fields (Version 2.0.4) [Computer software]. https://doi.org/10.5281/zenodo.10222799

Contact

If you have questions or suggestions, or want to contribute to this repo, do not hesitate to contact me:

mail: wanghw@zju.edu.cn

wechat: whw199833

