
[Feature Request] Support inter-point inequality constraints for knowledge gradient [title updated]

Open danielrjiang opened this issue 3 years ago • 8 comments

🚀 Feature Request

Support inequality constraints for knowledge gradient (see Ax issue here: https://github.com/facebook/Ax/issues/961)

Motivation

Requested by @sgbaird for work on multi-fidelity optimization (follow up work to https://doi.org/10.26434/chemrxiv-2022-nz2w8).

danielrjiang avatar May 12 '22 17:05 danielrjiang

@danielrjiang Thanks for porting this over!

cc @ramseyissa

sgbaird avatar May 12 '22 20:05 sgbaird

Here's another (very recent) example motivating the use-case for constrained multi-fidelity optimization:

Khatamsaz, D.; Vela, B.; Singh, P.; Johnson, D. D.; Allaire, D.; Arróyave, R. Multi-Objective Materials Bayesian Optimization with Active Learning of Design Constraints: Design of Ductile Refractory Multi-Principal-Element Alloys. Acta Materialia 2022, 236, 118133. https://doi.org/10.1016/j.actamat.2022.118133.

sgbaird avatar Aug 25 '22 17:08 sgbaird

Also, is this only going to be compatible with single-objective optimization? Or would multi-objective optimization support for knowledge gradient be a separate feature request?

sgbaird avatar Aug 25 '22 17:08 sgbaird

Related:

  • https://github.com/pytorch/botorch/issues/1100

sgbaird avatar Nov 18 '22 04:11 sgbaird

Hmm, I'm not sure how it was determined that this was a BoTorch issue rather than an Ax issue. I tested qKnowledgeGradient in BoTorch with inequality constraints and it worked fine, but Ax KnowledgeGradient won't accept inequality constraints. It doesn't seem like it would be that hard to enable this in Ax, but I'm wondering if there's a BoTorch issue that makes this trickier than it looks.

esantorella avatar May 30 '23 23:05 esantorella

I feel like as long as the constraints are not across the elements of the q-batch ("inter-point") but only across the dimensions of each individual point (in both the q-batch and the fantasy points that we use to compute the maximum posterior mean after conditioning), then we should be OK. If we want fancier constraints that e.g. impose a budget constraint across the elements of the q-batch but not on the fantasy points, then we have to be more careful. I actually have a hacky partial implementation of this for another purpose; I can look into how easy it would be to clean that up.
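
For concreteness, here is a rough sketch of the two constraint forms in BoTorch's (indices, coefficients, rhs) tuples; the cost dimension and budget value are hypothetical and only for illustration:

import torch

# Intra-point: 1-d indices; the constraint x[0] + x[1] >= 0.5 is applied to
# each point (q-batch elements and fantasy points) independently.
intra_constraint = (torch.tensor([0, 1]), torch.ones(2), 0.5)

# Inter-point: 2-d indices of (q-batch index, dimension index) pairs, e.g. a
# budget X[0, d_cost] + X[1, d_cost] <= budget across a q-batch of size 2.
# BoTorch constraints have the form sum_i coeff[i] * X[idx[i]] >= rhs, so a
# "<=" budget is encoded with negated coefficients and right-hand side.
d_cost, budget = 0, 1.5  # hypothetical cost dimension and budget value
inter_indices = torch.tensor([[0, d_cost], [1, d_cost]])
budget_constraint = (inter_indices, -torch.ones(2), -budget)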

Balandat avatar May 30 '23 23:05 Balandat

Oh, I see the problem. TLDR: "inter-point" constraints don't work in BoTorch or in Ax with KG acquisition functions, but non-inter-point (intra-point) constraints ought to work in both.

What's going on and why it doesn't work: two-dimensional index tensors in a constraint define "inter-point" constraints across a q-batch. Although inter-point constraints are supported in functions like get_polytope_samples, they don't work with gen_one_shot_kg_initial_conditions because it treats the problem as if there's one batch of size q * num_fantasies rather than num_fantasies batches of size q.

Here's a repro that fails with an index error:

import torch

from botorch.acquisition import qKnowledgeGradient
from botorch.fit import fit_gpytorch_mll
from botorch.models import SingleTaskGP
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

x = torch.rand((20, 2))
y = x.sum(1, keepdim=True) - 1
model = SingleTaskGP(train_X=x, train_Y=y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
_ = fit_gpytorch_mll(mll)

bounds = torch.tensor([[0, 0], [1, 1]]).to(x)
q = 2
# should have
# samples[:, 0, 0] * 1 + samples[:, 1, 1] >= 0.5
# where samples is n x q x d
indices = torch.tensor([[0, 0], [1, 1]])
inequality_constraints = [(indices, torch.ones(2).to(x), 0.5)]

acqf = qKnowledgeGradient(model, num_fantasies=8)
candidates, acq_vals = optimize_acqf(
    acq_function=acqf,
    bounds=bounds,
    q=q,
    num_restarts=10,
    raw_samples=64,
    inequality_constraints=inequality_constraints,
)
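
For contrast, a minimal sketch of the intra-point form of the same constraint (reusing the model, acqf, and bounds from the repro above); per the discussion above, this is expected to optimize without error:

# 1-d indices make the constraint intra-point: x[0] + x[1] >= 0.5 for each point.
intra_constraints = [(torch.tensor([0, 1]), torch.ones(2).to(x), 0.5)]
candidates, acq_vals = optimize_acqf(
    acq_function=acqf,
    bounds=bounds,
    q=q,
    num_restarts=10,
    raw_samples=64,
    inequality_constraints=intra_constraints,
)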

So

  1. This should be doable by tracking q and the number of fantasies separately
  2. If this remains unsupported, then gen_one_shot_kg_initial_conditions should error when provided inter-point equality_constraints or inequality_constraints (a sketch of such a check appears below this list)
  3. Ax should not error when provided constraints that are not inter-point

I'm going to leave this task open for 1, then create separate tasks for 2 and 3.
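
As a sketch of the check suggested in item 2, something along these lines could be added; the helper name is hypothetical and not part of BoTorch:

from botorch.exceptions import UnsupportedError


def _reject_inter_point_constraints(constraints, arg_name):
    # Hypothetical helper: gen_one_shot_kg_initial_conditions could call this to
    # fail fast on 2-d (inter-point) index tensors instead of raising an opaque
    # index error deeper in the stack.
    for indices, _, _ in constraints or []:
        if indices.dim() > 1:
            raise UnsupportedError(
                f"Inter-point {arg_name} are not supported with one-shot "
                "knowledge gradient initialization."
            )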

esantorella avatar May 31 '23 22:05 esantorella

Ax should not error when provided constraints that are not inter-point

I don't think that Ax currently supports any notion of inter-point constraints, only intra-point ones. It doesn't really make sense to define them at the level of the search space (where intra-point constraints can be specified), so we'd have to add that somewhere else in the API. An obvious contender would be Modelbridge.gen with n > 1, or AxClient.get_next_trials (though we'd have to allow for joint optimization of the candidates in a batch; currently that function is based on sequential greedy generation with conditioning).

Balandat avatar Jun 01 '23 05:06 Balandat