[Bug]: BoTorch changes GPyTorch's default settings on import
What happened?
I think this behaviour is already well known, but I noticed there isn't an issue that actually tracks it.
When BoTorch is imported, it globally changes the behaviour of GPyTorch: https://github.com/pytorch/botorch/blob/07ce3763b3d4336ad6ba9ba02195a53d607fb59b/botorch/__init__.py#L36-L49
In a production context, this makes it complicated to adopt BoTorch as an additional dependency of a package which already depends on GPyTorch, as it will change GPyTorch behaviour in pre-existing parts of the code.
As a workaround, we use a context manager that applies the BoTorch-specific default settings only when actually using BoTorch models, leaving the GPyTorch defaults untouched elsewhere (sketched below). Maybe something like this could be implemented inside BoTorch (when fitting a model or optimizing an acquisition function).
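A minimal sketch of such a workaround context manager. The specific flags toggled here (`debug`, `fast_pred_var`) are illustrative assumptions only; the actual set of settings BoTorch overrides is in the linked `__init__.py`.

```python
from contextlib import ExitStack, contextmanager

import gpytorch.settings as gp_settings


@contextmanager
def botorch_settings():
    """Apply BoTorch-style GPyTorch settings only within this scope."""
    with ExitStack() as stack:
        # Illustrative flags; substitute whatever BoTorch actually changes on import.
        stack.enter_context(gp_settings.debug(False))
        stack.enter_context(gp_settings.fast_pred_var(True))
        yield


# Usage: wrap only the BoTorch-specific work; GPyTorch defaults apply elsewhere.
# with botorch_settings():
#     fit_gpytorch_mll(mll)
```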
Please provide a minimal, reproducible example of the unexpected behavior.
N/A
Please paste any relevant traceback/logs produced by the example provided.
BoTorch Version
0.14.0
Python Version
No response
Operating System
No response
(Optional) Describe any potential fixes you've considered to the issue outlined above.
No response
Pull Request
Yes
Code of Conduct
- [x] I agree to follow BoTorch's Code of Conduct
Yeah, I understand this isn't great behavior.
> Maybe something like this could be implemented inside BoTorch (when fitting a model or optimizing an acquisition function).
This would be an improvement, but it would still cause issues in cases where other GPyTorch models are combined with BoTorch models and you'd want to be more selective about the options.
In an ideal world, this could be scoped quite precisely around the operations where it matters (e.g. computing the log likelihood or the posterior covariance of a particular GP model) - the downside is that we'd likely have to slap a bunch of these context managers around the code in all kinds of places, and the probability of missing at least some of them is presumably somewhere around 1.
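For illustration, a precisely scoped version might look something like the sketch below, applied only around a single operation. The flags and the `posterior` call are placeholders for whichever settings and operations actually matter, not the set BoTorch currently toggles.

```python
import gpytorch


def posterior_with_fast_settings(model, X):
    # Settings apply only around this one posterior evaluation; everything
    # outside this scope keeps the ambient GPyTorch defaults.
    with gpytorch.settings.fast_pred_var(True), gpytorch.settings.debug(False):
        return model.posterior(X)
```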
I'd be happy to change this if we find a good design that has decent tradeoffs in that regard. Curious if @jandylin has any thoughts on this since he's used GPyTorch with iterative solves in the BoTorch context quite a bit.
Also worth noting that a lot of this "approximate math by default" behavior is planned to go away with GPyTorch 2.0 and become opt-in only. I don't know how active 2.0 development is though. cc @JonathanWenger
Currently a bit on pause, but it's on the docket. Definitely should be opt-in in the future.