MASCed_bandits
This is a library of multi-armed bandit policies. As of the most recent version, the following policies are included: UCB, UCB-Improved, EXP3, EXP3S, EXP4, EwS, ETC, Discounted UCB, Sliding Window UCB, and e-greedy.
Example
from masced_bandits.bandit_options import initialize_arguments
from masced_bandits.bandits import init_bandit
import numpy as np
initialize_arguments(["Arm1","Arm2"], 0)
ucb_instance = init_bandit(name='UCB')
arms_chosen = []
for i in range(100):
    reward = np.random.random()
    arms_chosen.append(ucb_instance.get_next_arm(reward))
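The example above treats the bandit as a black box: each call passes in the reward observed for the previous pull and gets back the next arm to play. For intuition about what a policy such as UCB computes internally, here is a minimal, self-contained UCB1 sketch. This is illustrative only, not the library's implementation; the `ucb1` helper and the Bernoulli arm means are assumptions made for the demo.

```python
import math
import random

# UCB1 sketch: pick the arm maximizing
# empirical mean + sqrt(2 * ln(t) / n_arm).
def ucb1(reward_sums, counts, t):
    # Play every arm once before applying the index.
    for arm, n in enumerate(counts):
        if n == 0:
            return arm
    return max(
        range(len(counts)),
        key=lambda a: reward_sums[a] / counts[a]
        + math.sqrt(2 * math.log(t) / counts[a]),
    )

random.seed(0)
true_means = [0.3, 0.7]      # hypothetical Bernoulli arms
counts = [0, 0]
reward_sums = [0.0, 0.0]
for t in range(1, 1001):
    arm = ucb1(reward_sums, counts, t)
    reward = 1.0 if random.random() < true_means[arm] else 0.0
    counts[arm] += 1
    reward_sums[arm] += reward
```

After 1000 rounds the exploration bonus shrinks for well-sampled arms, so the empirically better arm (index 1 here) ends up pulled far more often.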
Hashes for masced_bandits-EGAlberts-0.0.1.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 52097e62f72787d82f315e0edd5995fd4d4f32747d53e5bddb91778bfc07382f
MD5 | ff42cdd418ada5084d58c2bacfe37959
BLAKE2b-256 | 7caea56f5eabc6a0b429be188c7b6a3d68fc88a4238cdc1a6f7816d12797f862
Hashes for masced_bandits_EGAlberts-0.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | cdb52818dbf89fb33914aaadb54e014a7aaa481cf0340fe92c636632e46df560
MD5 | 6f82e5bd769c97c85a6f3cc58b8964e7
BLAKE2b-256 | 0fb05aa6ea9d3fd8f12dc13c6c4e49e49bb90af0d68e45d331eba602c71c16c9