
Number of sampled configurations

Open HannahElisa opened this issue 6 years ago • 3 comments

Hello, I just tried out your nice package for hyperparameter optimization and it works well. I want to understand how many configs are sampled and with which budget they are executed, so I looked into the paper and the source code. I noticed some differences between the paper and the implementation, which in the end lead to a different number of sampled configs. For instance, the number of successive halving runs in the code is

`self.max_SH_iter = -int(np.log(min_budget/max_budget)/np.log(eta)) + 1`

but in the paper it is

`s_max = floor(log_eta(max_budget / min_budget))`

(without the +1). This influences the number of sampled configs:

`n0 = int(np.floor((self.max_SH_iter)/(s+1)) * self.eta**s)`

Is there a reason for this difference?
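The off-by-one can be checked numerically; here is a minimal standalone sketch (not code from the package, with illustrative budget values):

```python
import numpy as np

def max_sh_iter_code(min_budget, max_budget, eta):
    # Number of successive halving runs as computed in HpBandSter's source
    return -int(np.log(min_budget / max_budget) / np.log(eta)) + 1

def s_max_paper(min_budget, max_budget, eta):
    # s_max = floor(log_eta(R)) with R = max_budget / min_budget, as in the paper
    return int(np.floor(np.log(max_budget / min_budget) / np.log(eta)))

print(max_sh_iter_code(1, 100, 3))  # 5
print(s_max_paper(1, 100, 3))       # 4
```

For non-integer values of `log_eta(R)` the two expressions always differ by exactly one, since `-int(-x) == floor(x) + 1` for non-integer positive `x`.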

Thanks for any help in advance!

HannahElisa avatar Nov 26 '19 14:11 HannahElisa

I think here `self.max_SH_iter` represents the calculation of `s_max + 1`.

More precisely, if we look at the original paper, we see that `s_max = floor(log_eta(R))`, and Algorithm 1 runs one bracket of successive halving for each `s` in `{s_max, s_max - 1, ..., 0}`, i.e. `s_max + 1` brackets in total.

And here, `self.max_SH_iter = -int(np.log(min_budget/max_budget)/np.log(eta)) + 1` computes exactly that `s_max + 1`, with `R = max_budget / min_budget`.

Rayn2402 avatar Dec 08 '19 22:12 Rayn2402

Thanks for your answer. Could you please specify where exactly you found the equation for `s_max` in the original paper? I also searched for it in the supplementary material but cannot find it there.

HannahElisa avatar Dec 09 '19 12:12 HannahElisa

If you're looking at *Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization*:

It's in the initialization line of Algorithm 1 on page 8.
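For context, the initialization and outer loop of that algorithm can be sketched as follows (a simplified sketch of the paper's pseudocode, not HpBandSter's implementation; `R` and `eta` as in the paper):

```python
import math

def hyperband_brackets(R, eta):
    """Yield (n, r) for each bracket of Hyperband's Algorithm 1 (sketch)."""
    # s_max = floor(log_eta(R)), computed with integer arithmetic to
    # avoid floating-point edge cases when R is an exact power of eta
    s_max = 0
    while eta ** (s_max + 1) <= R:
        s_max += 1
    B = (s_max + 1) * R  # total budget of one Hyperband iteration
    for s in range(s_max, -1, -1):  # s_max + 1 brackets in total
        n = math.ceil((B / R) * eta ** s / (s + 1))  # initial number of configs
        r = R * eta ** (-s)                          # initial budget per config
        yield n, r

brackets = list(hyperband_brackets(81, 3))
print(len(brackets))             # 5 brackets, i.e. s_max + 1
print([n for n, _ in brackets])  # [81, 34, 15, 8, 5]
```

The loop running from `s_max` down to `0` is why the code's `max_SH_iter = s_max + 1` gives the number of successive halving runs.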

Rayn2402 avatar Dec 09 '19 14:12 Rayn2402