[Feature Request] Add an input constructor for qNegIntegratedPosteriorVariance
🚀 Feature Request
Add an input constructor for qNegIntegratedPosteriorVariance.
Motivation
I'd like to use qNegIntegratedPosteriorVariance with Ax and its BoTorch model bridge, but to do so an input constructor for qNegIntegratedPosteriorVariance needs to be added to botorch/acquisition/input_constructors.py.
Are you willing to open a pull request? (See CONTRIBUTING)
I have a very rough solution to this, but would appreciate some guidance on how to go about it properly. Thanks so much for all your efforts!
Hi @samueljamesbell, thanks for your interest in using qNegIntegratedPosteriorVariance in Ax.
Most of this should be pretty straightforward: you'd essentially add a clone of https://github.com/pytorch/botorch/blob/main/botorch/acquisition/input_constructors.py#L448-L483 to botorch/acquisition/input_constructors.py, but instead of passing through beta you will have to construct a suitable mc_points tensor to serve as the locations for MC integration of the posterior variance; this is the more challenging part. A rough sketch of what such a constructor could look like is below.
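To be clear, this is not existing BoTorch code: the construct_inputs_qNIPV name, the bounds / num_mc_points keyword arguments, and the Sobol-based construction of mc_points are all assumptions for illustration.

```python
# Rough sketch of an input constructor for qNegIntegratedPosteriorVariance.
# Assumption: the caller can supply a `bounds` tensor for the search space.
from typing import Optional

from torch import Tensor

from botorch.acquisition.active_learning import qNegIntegratedPosteriorVariance
from botorch.acquisition.input_constructors import acqf_input_constructor
from botorch.models.model import Model
from botorch.utils.sampling import draw_sobol_samples


@acqf_input_constructor(qNegIntegratedPosteriorVariance)
def construct_inputs_qNIPV(
    model: Model,
    bounds: Tensor,  # 2 x d tensor of lower/upper bounds (assumed to be available)
    mc_points: Optional[Tensor] = None,
    num_mc_points: int = 128,
    X_pending: Optional[Tensor] = None,
    **kwargs,
) -> dict:
    """Construct kwargs for qNegIntegratedPosteriorVariance.

    If `mc_points` is not supplied, draw quasi-random Sobol points over
    `bounds` to serve as the locations for MC integration of the variance.
    """
    if mc_points is None:
        # draw_sobol_samples returns an `n x q x d` tensor; drop the q-dim.
        mc_points = draw_sobol_samples(bounds=bounds, n=num_mc_points, q=1).squeeze(-2)
    return {
        "model": model,
        "mc_points": mc_points,
        "X_pending": X_pending,
    }
```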
The easiest way to do this would probably be to pass these points as acquisition_options to the Ax BoTorchModel, and then just pass that tensor through to your new input constructor (see the sketch below). The catch is that you'll need to know the dimension and bounds of the search space up front (which is [0, 1]^d for basic cases, but things get more complicated, e.g., if there are discrete features).
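For example, assuming the hypothetical construct_inputs_qNIPV above accepts an mc_points keyword, this could look roughly like the following; the import paths and the [0, 1]^2 integration points are illustrative assumptions, not a prescription.

```python
# Sketch: pre-computed integration points passed through acquisition_options.
import torch

from ax.models.torch.botorch_modular.model import BoTorchModel
from ax.models.torch.botorch_modular.surrogate import Surrogate
from botorch.acquisition.active_learning import qNegIntegratedPosteriorVariance
from botorch.models.gp_regression import SingleTaskGP

# Integration points over a known, normalized 2-d search space.
mc_points = torch.rand(128, 2, dtype=torch.double)

model = BoTorchModel(
    surrogate=Surrogate(botorch_model_class=SingleTaskGP),
    botorch_acqf_class=qNegIntegratedPosteriorVariance,
    acquisition_options={"mc_points": mc_points},
)
```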
A more sophisticated version would grab the bounds from the Ax SearchSpaceDigest and pass them through so that they are available in the input constructor. This would require changes on the Ax end, but it seems like this could be generally useful for other cases as well. Instead of passing the SearchSpaceDigest object down (which would create circular import issues between Ax and BoTorch), we would probably pass a dictionary version of the data, roughly as sketched below. @lena-kashtelyan, any thoughts on this?
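A very rough sketch of that idea; the helper name, where such a hook would live in Ax, and the exact import location of SearchSpaceDigest are assumptions, not existing code.

```python
# Sketch: convert a SearchSpaceDigest into plain data that BoTorch input
# constructors can consume without importing any Ax classes.
from dataclasses import asdict

import torch
from ax.core.search_space import SearchSpaceDigest


def digest_to_constructor_kwargs(digest: SearchSpaceDigest) -> dict:
    """Turn the digest into a plain dict of tensors/lists (hypothetical helper)."""
    d = asdict(digest)
    # `bounds` is a list of (lower, upper) tuples, one per feature.
    bounds = torch.tensor(d["bounds"], dtype=torch.double).T  # 2 x d
    return {"bounds": bounds, "feature_names": d["feature_names"]}
```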
One other thing to note: the functionality in Ax (e.g. in the Service API) is mostly geared towards optimization, so things like get_best_trial may be less relevant in an active learning setting. However, if you build a GenerationStrategy using qNegIntegratedPosteriorVariance as the acquisition function in the modular BoTorch model setup (as in the sketch below), the exploration itself should do the right thing.
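For example, something along these lines, assuming the input constructor above has been registered; the Sobol initialization step and the trial counts are just illustrative.

```python
# Sketch: an Ax GenerationStrategy that uses qNIPV via the modular BoTorch model.
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from botorch.acquisition.active_learning import qNegIntegratedPosteriorVariance

gs = GenerationStrategy(
    steps=[
        # Quasi-random initialization to fit the initial GP.
        GenerationStep(model=Models.SOBOL, num_trials=8),
        # Pure-exploration active learning with qNIPV thereafter.
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,
            model_kwargs={"botorch_acqf_class": qNegIntegratedPosteriorVariance},
        ),
    ]
)
```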
Closing as inactive, but let us know if this is still needed, or feel free to put in a PR.