multi-arm-bandits topic
multi-arm-bandits repositories
SMPyBandits — 376 stars, 57 forks
🔬 Research Framework for Single and Multi-Players 🎰 Multi-Armed Bandits (MAB) Algorithms, implementing all the state-of-the-art algorithms for single-player (UCB, KL-UCB, Thompson...) and multi-player...
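The UCB algorithm named in the description can be sketched in a few lines. The code below is a minimal, self-contained UCB1 illustration, not the SMPyBandits API; the function and parameter names (`ucb1`, `rewards_fn`, `horizon`) are hypothetical.

```python
import math
import random

def ucb1(rewards_fn, n_arms, horizon, seed=0):
    """Minimal UCB1 sketch (illustrative, not the SMPyBandits API).

    `rewards_fn(arm, rng)` must return a reward in [0, 1].
    Returns the pull count of each arm after `horizon` rounds.
    """
    rng = random.Random(seed)
    counts = [0] * n_arms          # times each arm was pulled
    sums = [0.0] * n_arms          # cumulative reward per arm
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1            # initialize: play each arm once
        else:
            # UCB1 index: empirical mean + exploration bonus
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t) / counts[a]),
            )
        r = rewards_fn(arm, rng)
        counts[arm] += 1
        sums[arm] += r
    return counts
```

On a two-armed Bernoulli bandit with means 0.2 and 0.8, the better arm ends up pulled far more often as the exploration bonus shrinks.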
MAB.jl — 20 stars, 8 forks
A Julia package providing multi-armed bandit experiments
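A typical bandit experiment like those such packages run pairs a policy with simulated arms and tracks pull counts. The sketch below uses Beta-Bernoulli Thompson sampling (one of the standard baselines mentioned above); it is an illustration in Python, not MAB.jl code, and the names (`thompson_bernoulli`, `true_means`) are hypothetical.

```python
import random

def thompson_bernoulli(true_means, horizon, seed=0):
    """Beta-Bernoulli Thompson sampling sketch (illustrative only).

    Each arm keeps a Beta(alpha, beta) posterior over its success
    probability; each round we sample from every posterior and play
    the arm with the largest sample.
    """
    rng = random.Random(seed)
    n = len(true_means)
    alpha = [1.0] * n              # Beta(1, 1) uniform priors
    beta = [1.0] * n
    counts = [0] * n
    for _ in range(horizon):
        samples = [rng.betavariate(alpha[a], beta[a]) for a in range(n)]
        arm = max(range(n), key=samples.__getitem__)
        reward = 1 if rng.random() < true_means[arm] else 0
        alpha[arm] += reward       # posterior update on success
        beta[arm] += 1 - reward    # posterior update on failure
        counts[arm] += 1
    return counts
```

With arms of means 0.1 and 0.9, the posterior for the better arm concentrates quickly and dominates the pull counts.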