[Bug] Arguments passed to BoTorch optimize_acqf don't work (in an intuitive way) or don't make sense
I wanted to use nonlinear inequality constraints, which seems like it should be doable because BoTorch's optimize_acqf supports nonlinear inequality constraints. However, optimize_acqf happens after Ax has applied transforms, so arguments such as nonlinear_inequality_constraints and batch_initial_conditions operate in the transformed space, causing surprising behavior.
Example:
import torch
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.utils.testing.core_stubs import get_branin_data, get_branin_experiment
inequality_constraints = [(lambda x: 3 - (x**2).sum(), True)]
botorch_gen_step = GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={
        "model_gen_options": {
            "optimizer_kwargs": {
                "nonlinear_inequality_constraints": inequality_constraints,
                "batch_initial_conditions": torch.ones((1, 1, 2)),
            }
        }
    },
)
constrained_gs = GenerationStrategy(steps=[botorch_gen_step])
inequality_constraints = [(lambda x: 3 - (x**2).sum(), True)]
botorch_gen_step = GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={
        "model_gen_options": {
            "optimizer_kwargs": {
                "nonlinear_inequality_constraints": inequality_constraints,
                "batch_initial_conditions": torch.ones((1, 1, 2)),
            }
        }
    },
)
constrained_gs = GenerationStrategy(steps=[botorch_gen_step])
# {'x1': 10.0, 'x2': 15.0} -- does not obey the constraints
generator_run.arms[0].parameters
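A sketch of why the returned point can violate the constraint, assuming Ax applies a unit-cube normalization (e.g. a UnitX-style transform) before calling optimize_acqf, and using Branin's raw bounds x1 in [-5, 10], x2 in [0, 15]. A constraint written in raw coordinates is then evaluated on normalized values, where it can look satisfied:

```python
# Assumption: Ax normalizes parameters to the unit cube before optimize_acqf
# runs, so the constraint callable sees normalized values, not raw ones.

def constraint(x):  # intended in raw space: feasible when x1^2 + x2^2 <= 3
    return 3 - sum(v**2 for v in x)

lo, hi = [-5.0, 0.0], [10.0, 15.0]  # Branin bounds for x1, x2
raw = [10.0, 15.0]  # the surprising point returned in the example above
normalized = [(v - l) / (h - l) for v, l, h in zip(raw, lo, hi)]  # [1.0, 1.0]

print(constraint(raw))         # -322.0: clearly infeasible in raw space
print(constraint(normalized))  # 1.0: "feasible" in the transformed space
```

So the optimizer honors the constraint in the space it actually searches, which is not the space the user wrote the constraint in.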
Suggested resolution:
I suggest not surfacing arguments to optimize_acqf to the user, possibly with a few exceptions added as needed. Although some of the arguments can be helpful when constructed by Ax, almost all are nonsensical, redundant with Ax, or will behave surprisingly when passed by the user. Redundant arguments include acq_function, bounds, q, and inequality_constraints. return_best_only is nonsensical when used with Ax. And others, such as nonlinear_inequality_constraints and batch_initial_conditions, operate in the transformed space and thus are nearly impossible to use correctly without a detailed understanding of what Ax does under the hood. Users with such a detailed understanding might as well use BoTorch.
I think this can be achieved by not constructing opt_options here, and instead erroring when optimizer_kwargs are present in model_gen_options.
https://github.com/facebook/Ax/blob/d97a80e86c3398a0174fa7812a7b8c965e4d3827/ax/models/torch/botorch_modular/utils.py#L171
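A minimal sketch of the proposed validation (hypothetical names, not Ax's actual code): reject user-supplied optimizer_kwargs instead of forwarding them to optimize_acqf.

```python
# Hypothetical sketch of the suggested resolution; validate_model_gen_options
# is a made-up helper, not part of Ax.
class UnsupportedError(Exception):
    pass

def validate_model_gen_options(model_gen_options: dict) -> None:
    if "optimizer_kwargs" in model_gen_options:
        raise UnsupportedError(
            "optimizer_kwargs are not supported: optimize_acqf arguments are "
            "constructed by Ax and operate in the transformed space."
        )
```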
Similar considerations arise when passing arguments to the acquisition function constructors, see https://github.com/facebook/Ax/issues/2401#issuecomment-2079566395. It would be great if we could automatically apply the transforms to the arguments for which they are needed, but doing this in a generic fashion seems very challenging.
@esantorella, sounds like you were possibly working on this? I wonder if we want to just error with UnsupportedError if these are passed in Ax, since it seems that they will do nothing but confuse? I understand that there may be a contrived case where we are not using any Ax transforms, but does that ever happen in reality? If not, I think we could just validate against this.
I haven't been working on this. My preferred solution is the one above: not constructing opt_options here, and instead erroring when optimizer_kwargs are present in model_gen_options.
That may allow for more simplification, in that optimizer_kwargs will no longer need to be passed around, and the only allowed key for model_gen_options will then be acqf_kwargs:
# current
GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={
        "model_gen_options": {"acqf_kwargs": {"eta": 1e-2}},
    },
)
# new syntax after removing support for optimizer_kwargs
GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={"acqf_kwargs": {"eta": 1e-2}},
)
I'm not sure I'm completely right, but I believe the following corrections should be made. Redundant code: inequality_constraints, botorch_gen_step, and constrained_gs are defined twice; removing one of the duplicate blocks makes the example cleaner.
Undefined variable: generator_run is used but never defined in the snippet.
I'm not completely sure, but this should be the right code:
import torch
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.utils.testing.core_stubs import get_branin_data, get_branin_experiment

# Define inequality constraint
inequality_constraints = [(lambda x: 3 - (x**2).sum(), True)]

# Define a generation step with BoTorch model
botorch_gen_step = GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={
        "model_gen_options": {
            "optimizer_kwargs": {
                "nonlinear_inequality_constraints": inequality_constraints,
                "batch_initial_conditions": torch.ones((1, 2)),  # adjusted shape if needed
            }
        }
    },
)

# Define generation strategy
constrained_gs = GenerationStrategy(steps=[botorch_gen_step])

# Get experiment and generate arms
experiment = get_branin_experiment()
generator_run = constrained_gs.gen(experiment)

# Print parameters of the generated arm
print(generator_run.arms[0].parameters)
If anything is wrong, please reach out to me.