Max Balandat

476 comments of Max Balandat

As you said, the standard parameter constraints do not support complex nonlinear constraints. This is for a few reasons, not least that this makes the acquisition function optimization a lot...
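For intuition on what "complex nonlinear constraint" means here, a minimal library-free sketch (the constraint `g` and the rejection-filtering helper are hypothetical illustrations, not Ax API):

```python
import random

def g(x):
    # Hypothetical nonlinear constraint: feasible iff x0^2 + x1^2 <= 1.
    # Ax's standard parameter constraints are linear (a @ x <= b);
    # a quadratic region like this falls outside that class.
    return x[0] ** 2 + x[1] ** 2 - 1.0

def filter_feasible(candidates, constraint):
    """Keep only candidates satisfying constraint(x) <= 0."""
    return [x for x in candidates if constraint(x) <= 0]

random.seed(0)
candidates = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(100)]
feasible = filter_feasible(candidates, g)
print(f"{len(feasible)} of {len(candidates)} candidates are feasible")
```

Rejection filtering like this is only viable for candidate screening; optimizing an acquisition function subject to such a constraint requires a constrained nonlinear optimizer, which is a large part of why this is hard to support generically.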

@StanleyYoo since these are highly nonlinear constraints, it's not straightforward to support them easily via the `AxClient` API. You essentially have two options: 1. Use our low level API and...

One thing to note is that HV-based acquisition functions generally don't scale well to problems with many objectives. 2-3 is generally fine, for 4+ you'll likely see a pretty substantial...
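To see why, recall that the hypervolume of a Pareto front is the volume it dominates relative to a reference point. A crude Monte Carlo estimator (a pure-Python sketch, not the exact box-decomposition algorithms used in practice) makes the dimensionality issue concrete: holding relative accuracy fixed gets rapidly more expensive as the number of objectives grows.

```python
import random

def mc_hypervolume(pareto_front, ref_point, n_samples=20000, seed=0):
    """Monte Carlo estimate of the hypervolume dominated by `pareto_front`
    (maximization) relative to `ref_point`."""
    m = len(ref_point)
    rng = random.Random(seed)
    # Bounding box: from the reference point to the componentwise max of the front.
    upper = [max(p[j] for p in pareto_front) for j in range(m)]
    box_vol = 1.0
    for j in range(m):
        box_vol *= upper[j] - ref_point[j]
    hits = 0
    for _ in range(n_samples):
        x = [rng.uniform(ref_point[j], upper[j]) for j in range(m)]
        # x is dominated if some front point is at least x in every objective.
        if any(all(p[j] >= x[j] for j in range(m)) for p in pareto_front):
            hits += 1
    return box_vol * hits / n_samples

# Single point (1, 1) with reference (0, 0): true hypervolume is exactly 1.0.
print(mc_hypervolume([(1.0, 1.0)], (0.0, 0.0)))
```

Exact hypervolume computation has cost that grows super-polynomially in the number of objectives, which is the scaling issue referenced above.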

So what you're looking at is a hierarchical search space. We have some (basic) support for this (https://github.com/facebook/Ax/blob/fcc51788947832981bead957be61d96ac7b1ee9c/ax/core/parameter.py#L515-L516, https://github.com/facebook/Ax/blob/fcc51788947832981bead957be61d96ac7b1ee9c/ax/core/search_space.py#L437), but I'm not sure how such a setup would interact with...
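As a toy illustration of the idea (this is illustrative pure Python, not the Ax API; the real support lives in the `HierarchicalSearchSpace` / dependents code linked above): in a hierarchical search space, the value of one parameter determines which other parameters are active.

```python
# Hypothetical dependency table: a root parameter's value activates
# different downstream parameters.
DEPENDENTS = {
    "model": {
        "linear": ["l2_reg"],
        "tree": ["max_depth", "min_samples_leaf"],
    }
}

def active_parameters(config):
    """Return the parameter names that are active under `config`."""
    active = ["model"]
    for param, by_value in DEPENDENTS.items():
        active += by_value.get(config.get(param), [])
    return active

print(active_parameters({"model": "tree"}))
# -> ['model', 'max_depth', 'min_samples_leaf']
```

The open question in the comment is how such conditional activation interacts with the rest of the modeling stack, which assumes a flat, fixed-dimensional space.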

It also seems like we should be able to parse the one with spaces just fine... We should probably be doing better string normalization here.
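A normalization pass along those lines might look like this minimal sketch (the helper name and exact rules are hypothetical, not what Ax actually does):

```python
import re

def normalize_name(name):
    """Normalize a user-supplied identifier: trim surrounding whitespace,
    collapse internal whitespace runs to a single underscore, and lowercase,
    so e.g. ' Branin  X1 ' and 'branin_x1' compare equal."""
    return re.sub(r"\s+", "_", name.strip()).lower()

print(normalize_name(" Branin  X1 "))  # -> branin_x1
```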

> but how can I run the main optimization to get multiple suggestions of the parameter values, and not sequentially? Based on the above code I'm still getting one suggestion...

> in this case I do not need to use `choose_generation_strategy` and there is no need to set `use_batch_trials=True`, if I understand you correctly?

Yes.

> do both give the same accuracy...

You can follow the same steps for MOO; there is no need to use `BatchTrial`s for MOO.

> which is easy to understand, but in this case I will not be...

I am not sure why in the case of using a `ConstantDiagLinearOperator` gradients aren't being returned properly (hence the test failure).