Bayesian Statistics conjugate prior distributions
Conjugate Prior
A Python implementation of the conjugate prior table for Bayesian statistics.
See the Wikipedia page:
https://en.wikipedia.org/wiki/Conjugate_prior#Table_of_conjugate_distributions
Installation:
pip install conjugate-prior
Supported Models:
BetaBinomial
 Useful for independent trials such as click-through rate (CTR), web visitor conversion.
BetaBernoulli
 Same as above.
GammaExponential
 Useful for churn-rate analysis, cost, dwell time.
GammaPoisson
 Useful for time passed until an event, as above.
NormalNormalKnownVar
 Useful for modeling a centralized distribution with constant noise.
NormalLogNormalKnownVar
 Useful for modeling the length of a support phone call.
InvGammaNormalKnownMean
 Useful for modeling the effect of noise.
InvGammaWeibullKnownShape
 Useful for reasoning about particle sizes over time.
DirichletMultinomial
 Extension of BetaBinomial to more than 2 types of events (limited support).
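All of the pairs above share the same mechanic: because the prior is conjugate to the likelihood, the posterior stays in the prior's family, so an update is just parameter arithmetic. A minimal sketch of the Beta-Binomial case in plain Python (independent of this package, shown only to illustrate the math):

```python
# Beta-Binomial conjugacy: with a Beta(a, b) prior on a success
# probability p, observing `successes` and `failures` yields a
# Beta(a + successes, b + failures) posterior -- no integration needed.
def beta_binomial_update(a, b, successes, failures):
    return a + successes, b + failures

def beta_mean(a, b):
    return a / (a + b)

# Start from a uniform (uninformative) Beta(1, 1) prior.
a, b = beta_binomial_update(1, 1, successes=30, failures=70)
print(beta_mean(a, b))  # posterior mean: 31/102 ~ 0.304
```

This closed-form update is what makes the models in the table cheap to use online: each new batch of data only shifts a couple of parameters.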
Basic API
model = GammaExponential(a, b)
 A Bayesian model with an Exponential likelihood and a Gamma prior, where a and b are the prior parameters.
model.pdf(x)
 Returns the probability density function of the prior at x.
model.cdf(x)
 Returns the cumulative distribution function of the prior at x.
model.mean()
 Returns the prior mean.
model.plot(l, u)
 Plots the prior distribution between l and u.
model.posterior(l, u)
 Returns the credible interval on (l, u) (equivalent to cdf(u) - cdf(l)).
model.update(data)
 Returns a new model after observing data.
model.predict(x)
 Predicts the likelihood of observing x (if a posterior predictive exists).
model.sample()
 Draws a single sample from the posterior distribution.
Coin flip example:
from conjugate_prior import BetaBinomial
heads = 95
tails = 105
prior_model = BetaBinomial() # Uninformative prior
updated_model = prior_model.update(heads, tails)
credible_interval = updated_model.posterior(0.45, 0.55)
print("There's a {p:.2f}% chance that the coin is fair".format(p=credible_interval*100))
predictive = updated_model.predict(50, 50)
print("The chance of flipping 50 heads and 50 tails in 100 trials is {p:.2f}%".format(p=predictive*100))
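For intuition, the posterior(0.45, 0.55) call above is just the CDF difference of the Beta posterior. A self-contained check of that quantity using only the standard library (Simpson's rule over the Beta density; the parameters 96 and 106 assume the uniform Beta(1, 1) prior used above plus 95 heads and 105 tails):

```python
import math

def beta_pdf(x, a, b):
    # Beta density computed via log-gamma for numerical stability.
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(log_norm + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

def beta_mass(lo, hi, a, b, n=1000):
    # Simpson's rule approximation of cdf(hi) - cdf(lo); n must be even.
    h = (hi - lo) / n
    total = beta_pdf(lo, a, b) + beta_pdf(hi, a, b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * beta_pdf(lo + i * h, a, b)
    return total * h / 3

# Posterior after 95 heads and 105 tails with a uniform Beta(1, 1) prior.
p = beta_mass(0.45, 0.55, a=96, b=106)
print(f"P(0.45 < p < 0.55) ~ {p:.3f}")
```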
Variant selection with a multi-armed bandit
Assume we have 10 creatives (variants) to choose from for our ad campaign. At first we start with an uninformative prior.
After getting feedback (i.e. clicks) from displaying the ads, we update our model.
Then we sample the DirichletMultinomial model for the updated distribution.
from conjugate_prior import DirichletMultinomial
from collections import Counter
# Assuming we have 10 creatives
model = DirichletMultinomial(10)
mle = lambda M: [int(r.argmax()) for r in M]
selections = [v for k, v in sorted(Counter(mle(model.sample(100))).most_common())]
print("Percentage before 1000 clicks: ", selections)
# after a period of time, we got this array of clicks
clicks = [400,200,100,50,20,20,10,0,0,200]
model = model.update(clicks)
selections = [v for k, v in sorted(Counter(mle(model.sample(100))).most_common())]
print("Percentage after 1000 clicks: ", selections)
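Sampling the posterior and playing the arm with the largest sampled value, as above, is Thompson sampling. A dependency-free sketch of the same idea, using the click counts from the example (a Dirichlet draw can be built from stdlib gamma variates; the uniform all-ones prior is an assumption matching the uninformative prior above):

```python
import random
from collections import Counter

def sample_dirichlet(alphas, rng=random.Random(0)):
    # A Dirichlet draw is a vector of Gamma(alpha_i, 1) draws, normalized.
    draws = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(draws)
    return [d / total for d in draws]

clicks = [400, 200, 100, 50, 20, 20, 10, 0, 0, 200]
# Posterior concentration = uniform prior (all ones) + observed clicks.
alphas = [1 + c for c in clicks]

# Thompson sampling: draw from the posterior, play the argmax arm.
picks = Counter(
    max(range(len(alphas)), key=sample_dirichlet(alphas).__getitem__)
    for _ in range(1000)
)
print(picks.most_common(3))
```

With these counts, creative 0 (400 clicks) wins nearly every draw, while arms with overlapping posteriors still get an occasional pick, which is exactly the exploration behavior a bandit needs.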
Naive Recommendation System with UCB
from conjugate_prior import BetaBinomialRanker

ranker = BetaBinomialRanker(prior=0.1) # 10% click-through rate
ranker["cmpgn1"] += (1, 9)   # 1 click, 9 skips
ranker["cmpgn2"] += (10, 90) # 10 clicks, 90 skips
ranker["cmpgn3"] += (1, 2)   # 1 click, 2 skips
Balance exploration and exploitation with UCB:
print(ranker.rank_by_ucb())
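rank_by_ucb orders arms optimistically: an arm's score combines its observed rate with an uncertainty bonus, so rarely shown arms still get explored. The package's exact formula isn't documented here, so the following is only a common UCB1-style sketch over the same click/skip counts, not the library's implementation:

```python
import math

def ucb1_scores(arms):
    # arms: {name: (clicks, skips)}.
    # UCB1 score = empirical mean + sqrt(2 ln N / n), where n is the
    # arm's trial count and N is the total trials across all arms.
    total = sum(c + s for c, s in arms.values())
    scores = {}
    for name, (clicks, skips) in arms.items():
        n = clicks + skips
        scores[name] = clicks / n + math.sqrt(2 * math.log(total) / n)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

arms = {"cmpgn1": (1, 9), "cmpgn2": (10, 90), "cmpgn3": (1, 2)}
for name, score in ucb1_scores(arms):
    print(f"{name}: {score:.3f}")
```

Note how cmpgn3 ranks first despite a modest click rate: with only 3 trials its uncertainty bonus dominates, while the well-measured cmpgn2 falls to the bottom.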