bwheelz36
Hey @till-m

> "I assume -- based on the name -- that this is an integer counter storing how often points that were already probed have been re-probed?"

Correct!

> ...
Note - we could also increase the alpha parameter to model noise without using a white kernel:

```python
from sklearn.gaussian_process.kernels import Matern

k1 = Matern(length_scale=[3, 0.2, 0.2])
kernel = k1
optimizer.set_gp_params(kernel=kernel, alpha=1.1)
```

However...
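For comparison, the white kernel route mentioned above would look roughly like this (just a sketch; `optimizer` is an existing `BayesianOptimization` instance as in the snippet above, and the `noise_level` value is only illustrative):

```python
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

# Matern kernel for the signal plus a white kernel that explicitly models
# observation noise, instead of inflating alpha.
k1 = Matern(length_scale=[3, 0.2, 0.2])
k2 = WhiteKernel(noise_level=1.0)  # illustrative starting value, tune per problem
kernel = k1 + k2

# alpha can stay near its small default because noise is handled by the kernel.
optimizer.set_gp_params(kernel=kernel, alpha=1e-6)
```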
> "I assume that you need to write your own optimization loop to get around the [caching](https://github.com/fmfn/BayesianOptimization/blob/a3021cc0b1a777c6ec714e545b37d4c1ff029380/bayes_opt/target_space.py#L299-L300) that happens internally?" I hadn't actually thought of that, I always run in...
I will update with a notebook and make the change suggested above once I'm more confident in what I'm doing!
Hey @till-m, I don't think I understand the problem (also, I had to google what idempotent meant 😆). The custom error does sound more appropriate, albeit relatively low priority?
ok @till-m, this should be fixed in #372.

- added keyword `allow_duplicate_points`. The default is False, so the existing behaviour of the code shouldn't change
- added a parameter...
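Once #372 is in, usage should look roughly like this (a sketch only, assuming the keyword is exposed on the `BayesianOptimization` constructor as described above):

```python
from bayes_opt import BayesianOptimization

def black_box(x):
    return -x ** 2

optimizer = BayesianOptimization(
    f=black_box,
    pbounds={"x": (-2, 2)},
    random_state=1,
    allow_duplicate_points=True,  # new keyword; default False keeps the old behaviour
)

# Registering the same point twice previously raised an error; with the new
# keyword it should be accepted (and tracked/warned about) instead.
optimizer.register(params={"x": 0.5}, target=black_box(0.5))
optimizer.register(params={"x": 0.5}, target=black_box(0.5))
```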
Anyone have any ideas on this? Otherwise we can just close the issue, I guess...
Have you read item 2 of this example: https://github.com/fmfn/BayesianOptimization/blob/master/examples/advanced-tour.ipynb? If so, how would what you propose differ? (Sorry, I haven't read the paper!)
Certainly sounds interesting. I was wondering how this works with the acquisition function and I think their figure 1 explains that very nicely. I guess you saw my reply on...
@ptapping

- sorry about the slow response; I'm trying to work through all the open issues and pull requests...
- could you please make any random commit (e.g. add a space...