BanditEmpirical: Empirical Evaluation of Bandit Algorithms
This project uses the Yahoo! Webscope Today Module article click data to evaluate a variety of bandit algorithms.
Algorithms currently planned for evaluation (a minimal epsilon-greedy sketch follows the list):
- Epsilon-greedy
- UCB (context-less)
- UCB (indexed)
- GLM-UCB
- Thompson Sampling
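
For orientation, here is a minimal sketch of the first algorithm on the list, epsilon-greedy. This is an illustrative example, not this repository's actual implementation; the class name, method names, and the toy click rates in the demo are assumptions made for the sketch.

```python
import random


class EpsilonGreedy:
    """Minimal epsilon-greedy bandit: explore a random arm with probability
    epsilon, otherwise pull the arm with the highest empirical mean reward."""

    def __init__(self, n_arms, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_arms    # number of pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm

    def select_arm(self):
        if random.random() < self.epsilon:
            # explore: pick an arm uniformly at random
            return random.randrange(len(self.counts))
        # exploit: pick the arm with the highest estimated mean reward
        return max(range(len(self.values)), key=self.values.__getitem__)

    def update(self, arm, reward):
        self.counts[arm] += 1
        n = self.counts[arm]
        # incremental update of the running mean reward for this arm
        self.values[arm] += (reward - self.values[arm]) / n


if __name__ == "__main__":
    # toy simulation with hypothetical Bernoulli click rates (not real data)
    true_rates = [0.03, 0.05, 0.08]
    policy = EpsilonGreedy(n_arms=len(true_rates), epsilon=0.1)
    for _ in range(10_000):
        arm = policy.select_arm()
        reward = 1 if random.random() < true_rates[arm] else 0
        policy.update(arm, reward)
    print(policy.counts, [round(v, 3) for v in policy.values])
```

The same select/update interface extends naturally to the UCB and Thompson Sampling variants listed above, with only the arm-selection rule changing.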