Eytan Bakshy
Are you using the default model? How are you normalizing your inputs? Is it possible that you are injecting more noise than you think? It might be helpful to...
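A minimal sketch of the input normalization being asked about here, assuming a box-bounded search space (the helper name and bounds are hypothetical; the default BoTorch models generally expect inputs scaled to the unit cube):

```python
def normalize(X, bounds):
    """Min-max scale each column of X from [low, high] to [0, 1]."""
    lows, highs = bounds
    return [
        [(x - lo) / (hi - lo) for x, lo, hi in zip(row, lows, highs)]
        for row in X
    ]

# Two points in a 2-D search space with very different raw scales.
X = [[5.0, 100.0], [10.0, 300.0]]
bounds = ([0.0, 100.0], [10.0, 500.0])
print(normalize(X, bounds))  # -> [[0.5, 0.0], [1.0, 0.5]]
```

Leaving inputs on wildly different raw scales can make the fitted lengthscales (and hence the apparent noise level) misleading.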
That’s a good question. Perhaps the noise level isn’t very high? The standard errors are also quite wide, since there aren’t many replicates. 10 random restarts can also be a...
Eytan from the BoTorch team here. I was just checking out COMBO—it would be great to have COMBO in BoTorch/Ax, as handling of categorical inputs is a commonly requested feature....
You can use approximate HV, which will be a lot faster, but I highly recommend not optimizing so many objectives, since the surface area of the Pareto frontier in 4 dimensions is quite large...
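The "approximate HV" idea can be sketched with a plain Monte Carlo estimate of the dominated hypervolume (this is a generic illustration of the concept, not BoTorch's implementation; the function name and defaults are hypothetical):

```python
import random

def mc_hypervolume(pareto_front, ref_point, n_samples=100_000, seed=0):
    """Monte Carlo estimate of the hypervolume dominated by a
    (maximization) Pareto front, relative to a reference point."""
    rng = random.Random(seed)
    d = len(ref_point)
    # Bounding box: from the reference point up to the coordinate-wise
    # maximum of the front.
    upper = [max(p[i] for p in pareto_front) for i in range(d)]
    box_vol = 1.0
    for lo, hi in zip(ref_point, upper):
        box_vol *= hi - lo
    hits = 0
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in zip(ref_point, upper)]
        # A sample counts if some front point dominates it.
        if any(all(p[i] >= x[i] for i in range(d)) for p in pareto_front):
            hits += 1
    return box_vol * hits / n_samples

# Exact HV here is 0.5 + 0.5 - 0.25 = 0.75.
print(mc_hypervolume([[1.0, 0.5], [0.5, 1.0]], [0.0, 0.0]))
```

The estimate's cost grows only linearly in the number of samples and front points, whereas exact hypervolume computation scales poorly with the number of objectives.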
If you'd like to do active learning, qNegIntegratedPosteriorVariance should perform better than greedily maximizing the posterior variance, since it directly targets reduction in *global* variance. Matthew, have you done any...
My intuition is that it is hard to beat space-filling designs for low-dimensional inputs, and active learning is going to contribute more value once you start hitting > 4 dimensions....
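For context, a "space-filling design" of the kind meant here can be as simple as a Latin hypercube sample (a generic sketch; in practice one would typically use a Sobol or LHS generator from an existing library):

```python
import random

def latin_hypercube(n, d, seed=0):
    """n space-filling points in [0, 1]^d: each coordinate places
    exactly one point in each of n equal-width strata."""
    rng = random.Random(seed)
    cols = []
    for _ in range(d):
        # One sample per stratum, visited in shuffled order.
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return [list(point) for point in zip(*cols)]

points = latin_hypercube(8, 3)
```

Because every 1-D projection is evenly stratified, such designs cover low-dimensional spaces very efficiently, which is why they are hard to beat below ~4 dimensions.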
Hi Eric, this sounds like a very interesting application! If you have 7 objectives, you may wish to come up with some linear scalarization of these objectives, or use preference...
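A linear scalarization, as suggested above, just collapses the objective vector into a single score via fixed weights (a minimal sketch; the weights shown are made up and would need to reflect the actual trade-offs):

```python
def scalarize(objectives, weights):
    """Collapse a vector of (maximization) objectives into one scalar."""
    assert len(objectives) == len(weights)
    return sum(w * y for w, y in zip(weights, objectives))

y = [0.2, 0.9, 0.5]     # three of the objectives, for illustration
w = [0.5, 0.3, 0.2]     # hypothetical weights encoding relative importance
print(scalarize(y, w))  # roughly 0.47
```

The resulting scalar can then be optimized with standard single-objective BO, sidestepping the cost of hypervolume-based methods in high objective dimensions.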
Hi all, I would definitely recommend “reshuffling” (or simply creating a new experiment) for each batch. Otherwise you have carryover effects. Variance reduction is always a good idea. We use...
That is correct @lena-kashtelyan, people should not be using EHVI-based methods for problems with 7+ objectives. I am not sure what the default approximation values are for the approximate HV...
Why not just get rid of the whitespace in the string?

On Wed, May 1, 2024 at 4:13 PM Sterling G. Baird ***@***.***> wrote:

> The constraint:
>
> ...