Online Ranking with Multi-Armed-Bandits
A library for Online Ranking with Multi-Armed Bandits. It is useful for finding the best top-N items among a relatively small candidate set. For example:
- Show the Top 5 News Articles among 200 articles every day
- Recommend the Top 10 Trending Products every week
- Rank components on websites to drive more engagement (Whole Page Optimization)
mab-ranking requires the following:
- Python (3.6, 3.7, 3.8)
At the command line:
```
pip install mab-ranking
```
Let's say that you want to recommend the top 5 trending products to your website visitors every week, chosen from your 300 best-selling products. Every Monday at some time T you'll instantiate a new `RankBandit` implementation. For example:

```python
# Import paths assumed from the package layout; adjust if your installed version differs.
from mab_ranking.bandits.rank_bandits import IndependentBandits
from mab_ranking.bandits.bandits import BetaThompsonSampling

num_ranks = 5   # number of slots to fill
num_arms = 300  # number of candidate products
rank_bandit = IndependentBandits(num_ranks, BetaThompsonSampling, num_arms=num_arms)
```
Then, every time a visitor X lands on your home page for the rest of the week, you need to select which 5 products to show them in the "Top Trending Products" section. So, you'll do the following:

```python
selected_arms = rank_bandit.choose()
```
Let's say that the `selected_arms` list is `[30, 2, 200, 42, 100]`. That means you need to show products 30, 2, 200, 42 and 100, in this order. You can keep your own mapping from product UUIDs to integer arm ids in your app's business logic, as in the sketch below.
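Here is a minimal sketch of such a mapping, with made-up product UUIDs; the bandit only ever sees the integer arm ids:

```python
# Hypothetical mapping between product UUIDs and integer arm ids.
# The bandit works with integer ids; your app resolves them back to products.
product_uuids = [
    "0d1f3c2e-8a4b-4f6d-9c3e-111111111111",
    "7b2e9d4a-5c1f-4e8b-a2d6-222222222222",
    "c3a8f1d7-2e6b-4a9c-8f1e-333333333333",
]
arm_id_by_uuid = {uuid: arm_id for arm_id, uuid in enumerate(product_uuids)}
uuid_by_arm_id = {arm_id: uuid for uuid, arm_id in arm_id_by_uuid.items()}

# Resolve the bandit's chosen arm ids back to product UUIDs for rendering.
chosen = [2, 0]
products_to_show = [uuid_by_arm_id[arm] for arm in chosen]
```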
The visitor X viewed the products in the selected order. Let's say (s)he clicked on products 2 and 42. The `rewards` list will then be `[0.0, 1.0, 0.0, 1.0, 0.0]`. So:

```python
rewards = [0.0, 1.0, 0.0, 1.0, 0.0]
# Report the observed rewards for the shown arms back to the bandit.
rank_bandit.update(selected_arms, rewards)
```
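Putting the steps together, here is a minimal sketch of one week's serving loop, continuing from the `rank_bandit` defined above. `get_clicked_arms` is a hypothetical stand-in for your real click tracking:

```python
import random

def get_clicked_arms(shown_arms):
    # Stand-in for real click tracking: pretend each shown product
    # is clicked independently with probability 0.1.
    return {arm for arm in shown_arms if random.random() < 0.1}

# Serve visitors for the rest of the week.
for _ in range(10_000):
    selected_arms = rank_bandit.choose()       # pick 5 products to show
    clicked = get_clicked_arms(selected_arms)  # observe visitor feedback
    rewards = [1.0 if arm in clicked else 0.0 for arm in selected_arms]
    rank_bandit.update(selected_arms, rewards) # learn from the feedback
```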
Supported bandits:
- IndependentBandits: from the Microsoft paper "A Fast Bandit Algorithm for Recommendations to Users with Heterogeneous Response Distributions"
- BetaThompsonSampling: a multi-armed bandit using Beta Thompson sampling
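For intuition, here is a self-contained sketch of the Beta Thompson sampling idea for a single rank position. It illustrates the principle, not the library's internal implementation:

```python
import random

class BetaThompsonArm:
    """One arm with a Beta(successes + 1, failures + 1) posterior over its click rate."""
    def __init__(self):
        self.successes = 0
        self.failures = 0

    def sample(self):
        # Draw a plausible click rate from the arm's current posterior.
        return random.betavariate(self.successes + 1, self.failures + 1)

    def update(self, reward):
        if reward > 0:
            self.successes += 1
        else:
            self.failures += 1

arms = [BetaThompsonArm() for _ in range(10)]

# Thompson sampling step: sample from every arm's posterior, play the best draw.
chosen = max(range(len(arms)), key=lambda i: arms[i].sample())
arms[chosen].update(reward=1.0)  # e.g. the shown item was clicked
```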