HpBandSter
Number of sampled configurations
Hello, I just tried out your nice package for hyperparameter optimization and it works well. I want to understand how many configs are sampled and with which budgets they are executed, so I looked into the paper and the source code. I noticed some differences between the paper and the implementation, which in the end lead to a different number of sampled configs.
For instance, the number of successive halving runs is
self.max_SH_iter = -int(np.log(min_budget/max_budget)/np.log(eta)) + 1
in the code, but in the paper it is
s_max = floor(log(max_budget/min_budget) / log(eta))
(without the +1).
This influences the number of sampled configs:
n0 = int(np.floor((self.max_SH_iter)/(s+1)) * self.eta**s)
Is there a reason for this difference?
Thanks for any help in advance!
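To see the difference concretely, here is a minimal sketch comparing the two formulas; the budgets min_budget=1, max_budget=100, eta=3 are illustrative assumptions, not values from the thread:

```python
import numpy as np

# Illustrative budgets (assumptions, not from the thread)
min_budget, max_budget, eta = 1.0, 100.0, 3.0

# Paper (Li et al.): s_max = floor(log_eta(max_budget / min_budget))
s_max = int(np.floor(np.log(max_budget / min_budget) / np.log(eta)))

# HpBandSter's formula: one more than s_max
max_SH_iter = -int(np.log(min_budget / max_budget) / np.log(eta)) + 1

print(s_max, max_SH_iter)  # 4 5

# Initial number of configs per successive halving run, as in the code:
for s in range(max_SH_iter - 1, -1, -1):
    n0 = int(np.floor(max_SH_iter / (s + 1)) * eta**s)
    print(s, n0)
```

With these budgets the code samples 81, 27, 9, 6, and 5 configs in the brackets s = 4, 3, 2, 1, 0, so the +1 directly changes every n0.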
I think here self.max_SH_iter represents the calculation of s_max + 1.
More precisely, if we look at the original paper, we see that s_max = floor(log_eta(max_budget/min_budget)), and that Hyperband runs the outer loop for s = s_max, s_max - 1, ..., 0, i.e. s_max + 1 successive halving runs in total.
And here,
self.max_SH_iter = -int(np.log(min_budget/max_budget)/np.log(eta)) + 1
computes exactly that s_max + 1.
Thanks for your answer.
Could you please specify where exactly you found the equation for s_max in the original paper? I also searched for it in the supplementary material but cannot find it there.
If you're looking at Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization
It's in the initialization line of Algorithm 1 on page 8.
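That initialization line can be sketched as follows; R corresponds to max_budget/min_budget, and R=100, eta=3 are illustrative values, not from the thread:

```python
import math

# Initialization line of Algorithm 1 in the Hyperband paper (sketch).
R, eta = 100, 3  # illustrative values

s_max = math.floor(math.log(R, eta))  # defined without any +1
B = (s_max + 1) * R                   # total budget across all brackets

# The outer loop then iterates s = s_max, s_max - 1, ..., 0, so the paper
# also executes s_max + 1 successive halving runs in total -- which is the
# quantity HpBandSter stores directly as max_SH_iter.
brackets = list(range(s_max, -1, -1))
print(len(brackets))  # 5, i.e. s_max + 1
```

So the paper's s_max and the code's max_SH_iter differ by exactly one because one counts the largest bracket index and the other counts the brackets themselves.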