Sam Daulton
Hi @HBSaddiq, Thanks for raising the issue. A few questions: 1) What are your objective thresholds (i.e., the reference point)? These can significantly influence the optimization behavior. 2) What is...
MVaR is not differentiable, so gradient issues are not terribly surprising. To get unblocked on this, a recommended alternative is to use MARS (https://proceedings.mlr.press/v162/daulton22a.html), which is differentiable and much faster than...
Glad that unblocked you! MARS optimizes MVaR by optimizing the VaR of random Chebyshev scalarizations. Since it scalarizes the problem, it uses a single-objective acquisition function.
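If it helps to see the idea concretely, here is a rough conceptual sketch (not the actual BoTorch `MARS` implementation; the weight sampling, shapes, and quantile convention below are illustrative assumptions):

```python
import torch


def chebyshev_var(Y: torch.Tensor, ref_point: torch.Tensor, alpha: float = 0.8) -> torch.Tensor:
    """VaR of a random Chebyshev scalarization.

    Y: `n_w x m` tensor of objective samples (e.g., under input perturbations).
    ref_point: `m`-dim reference point. Both objectives are assumed to be maximized.
    """
    m = Y.shape[-1]
    # Draw a random weight vector on the simplex (illustrative; MARS chooses
    # the weight distribution more carefully).
    weights = torch.rand(m)
    weights = weights / weights.sum()
    # Chebyshev scalarization relative to the reference point: the worst
    # weighted improvement across objectives.
    scalarized = torch.min(weights * (Y - ref_point), dim=-1).values
    # VaR at level alpha for maximization: the (1 - alpha)-quantile of the
    # scalarized samples, i.e., a value exceeded with probability ~alpha.
    return torch.quantile(scalarized, 1.0 - alpha)


# Dummy usage: 32 perturbed objective samples for 2 objectives.
Y = torch.randn(32, 2)
print(chebyshev_var(Y, ref_point=torch.zeros(2)))
```

In practice you would let the MARS objective handle the weight sampling and plug the scalarized problem into a single-objective acquisition function, as described above.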
@saitcakmak, did the differentiable MVaR version resolve the NaN issue?
Hi @lsassen, Can you share your code including setting up AxClient and the GenerationStrategy for repro? Thanks!
Closing due to lack of activity. Feel free to reopen @lsassen if the fix did not work.
Yes, that sounds quite plausible. qLogNEHVI (https://arxiv.org/abs/2310.20708), which is now the default in Ax for multi-objective optimization, should work quite a bit better when the acquisition function surface is very flat.
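For reference, a minimal BoTorch-level sketch of using qLogNEHVI looks roughly like the following (the toy problem, reference point, and optimizer settings are made up, and the import path may vary slightly across BoTorch versions):

```python
import torch
from botorch.models import SingleTaskGP, ModelListGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import SumMarginalLogLikelihood
from botorch.acquisition.multi_objective.logei import (
    qLogNoisyExpectedHypervolumeImprovement,
)
from botorch.optim import optimize_acqf

# Toy 2-objective problem on [0, 1]^2 (both objectives maximized).
train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = torch.stack(
    [-(train_X**2).sum(dim=-1), -((train_X - 0.5) ** 2).sum(dim=-1)], dim=-1
)

# One GP per objective, wrapped in a ModelListGP.
model = ModelListGP(
    *[SingleTaskGP(train_X, train_Y[:, i : i + 1]) for i in range(train_Y.shape[-1])]
)
fit_gpytorch_mll(SumMarginalLogLikelihood(model.likelihood, model))

acqf = qLogNoisyExpectedHypervolumeImprovement(
    model=model,
    ref_point=[-2.0, -2.0],  # objective thresholds (made-up values)
    X_baseline=train_X,
    prune_baseline=True,
)
candidate, _ = optimize_acqf(
    acq_function=acqf,
    bounds=torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double),
    q=1,
    num_restarts=10,
    raw_samples=128,
)
print(candidate)
```

If you are going through the Ax Service API, the default generation strategy for multi-objective experiments should already use qLogNEHVI, so no changes should be needed there.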
That sounds great! Random Forests make sense (perhaps on the average value from multiple measurements if noise is a concern). Thanks!
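In case a sketch is useful, something along these lines is what I had in mind (the column names and data are hypothetical, using scikit-learn):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical data: one row per measurement, with repeated measurements
# per parameter configuration and a noisy outcome "y".
rng = np.random.default_rng(0)
configs = rng.random((50, 2))
df = pd.DataFrame(
    {
        "x1": np.repeat(configs[:, 0], 3),
        "x2": np.repeat(configs[:, 1], 3),
        "y": rng.normal(size=150),
    }
)

# Average the repeated measurements for each configuration to reduce noise.
averaged = df.groupby(["x1", "x2"], as_index=False)["y"].mean()

# Fit a random forest on the averaged values.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(averaged[["x1", "x2"]], averaged["y"])
```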
Any update on this?
Hi @seabull, Feel free to open a pull request that adds this new parameter to `Models.THOMPSON`! Contributions are always welcome!