BayesianOptimization
Add parameter typing (only float and integer)
Add an optional parameter `ptypes` to the `BayesianOptimization` class to indicate, via a dictionary, the type of each parameter to optimize. For now, every parameter must be typed. The goal is mainly to validate the method. If a parameter is declared as integer, the extremal position is chosen as follows: the outcome (a float) of the minimization function is turned into two candidate positions (the lower and upper integers), and the minimum is the minimal value taken by the acquisition function at those two positions. In the case of multiple integer parameters, the current implementation does not combine every lower and upper integer value; the only combinations tested are all lowers versus all uppers.
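The rounding rule described above can be sketched as follows. Note that `integer_extremum` and its arguments are illustrative names for this sketch, not the PR's actual code:

```python
import math

def integer_extremum(x_continuous, acq):
    """Pick the integer candidate point with the smaller acquisition value.

    Sketch of the rule described above: each coordinate of the continuous
    optimum is rounded both down and up, and the acquisition function `acq`
    is evaluated only at the all-lowers and all-uppers points, not at every
    floor/ceil combination.
    """
    lowers = [math.floor(x) for x in x_continuous]
    uppers = [math.ceil(x) for x in x_continuous]
    # Keep whichever of the two candidates minimizes the acquisition function.
    return min((lowers, uppers), key=acq)
```

For example, with `x_continuous = [2.3, 4.7]` only the candidates `[2, 4]` and `[3, 5]` are compared, which illustrates the all-lowers-versus-all-uppers limitation mentioned above.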
Waiting for comments!
Codecov Report
Merging #131 into master will increase coverage by 0.59%. The diff coverage is 81.96%.
```diff
@@            Coverage Diff             @@
##           master     #131      +/-   ##
==========================================
+ Coverage   70.92%   71.52%    +0.59%
==========================================
  Files           6        6
  Lines         399      446       +47
  Branches       44       61       +17
==========================================
+ Hits          283      319       +36
- Misses        112      118        +6
- Partials        4        9        +5
```
Impacted Files | Coverage Δ |
---|---|
bayes_opt/bayesian_optimization.py | 77.31% <100%> (ø) :arrow_up:
bayes_opt/target_space.py | 95.14% <80%> (-4.86%) :arrow_down:
bayes_opt/util.py | 83.33% <82.35%> (-1.43%) :arrow_down:
Continue to review full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9b8bafb...7c026d7. Read the comment docs.
Has the author updated this?
I am using the updates in my own work. Improvements could be made, but I am waiting for first comments on the proposed solution. I don't know if it should be merged in the end.
May I suggest an alternative indication method? To make it so that typing a single parameter does not require a whole separate dictionary, something like this:
>>> pbounds = {'p1': (0,1), 'p2': (0, 100, int)}
>>> [item[1][:2] for item in pbounds.items()]
[(0, 1), (0, 100)]
>>> [float if len(item[1])<3 else item[1][2] for item in pbounds.items()]
[<class 'float'>, <class 'int'>]
where a third element carrying the type is added to the tuple, and `float` is assumed otherwise. Thoughts?
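A minimal sketch of this parsing, assuming the tuple convention above (`parse_pbounds` is a hypothetical helper name, not part of the library):

```python
def parse_pbounds(pbounds):
    """Split (lower, upper[, type]) tuples into separate bounds and types.

    A third tuple element, if present, declares the parameter type;
    otherwise float is assumed.
    """
    bounds = {name: spec[:2] for name, spec in pbounds.items()}
    ptypes = {name: (spec[2] if len(spec) > 2 else float)
              for name, spec in pbounds.items()}
    return bounds, ptypes
```

This keeps the existing two-element `pbounds` entries valid, so the change would be backward compatible.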
I very much like the added value for BO with typing of the parameters! However, this could still be improved so that the parameters are returned cast to their corresponding type. I mean, even if I define, e.g., `max_depth: int`, the parameter gets fed to the function-to-be-optimized as a `float`. This caused me some headache when I was trying to figure out why my classifier kept throwing fit errors at me.
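The casting issue could be addressed with a small helper that converts each suggested value to its declared type right before the target function is called. `cast_params` is a hypothetical sketch, not an existing function in the library:

```python
def cast_params(params, ptypes):
    """Cast each suggested value to its declared type (float by default).

    Note: int() truncates; use int(round(v)) instead if nearest-integer
    behaviour is wanted.
    """
    return {name: ptypes.get(name, float)(value)
            for name, value in params.items()}
```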
Nice idea. Until now I was converting that float to an int inside the `black_box_function`; it isn't the best, but it happened to be a convenient solution.
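The workaround described here, casting inside the black-box function itself, looks roughly like this (the function body and parameter names are illustrative):

```python
def black_box_function(max_depth, learning_rate):
    # BayesianOptimization suggests floats, so cast manually before use.
    max_depth = int(round(max_depth))
    # ... train a classifier with max_depth here; return a score to maximize.
    # Dummy score so the sketch is runnable:
    return -((max_depth - 5) ** 2) - learning_rate
```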