Max Balandat
Hmm, this should not happen. It might be related to a memory leak we've seen in the past but haven't been able to isolate so far: #641. I don't see any...
Not sure what's going on there, but my first guess would be that this has something to do with the batch-model-to-single-model-to-batch-model conversion that we're doing by default during the fitting:...
> That's a weird issue. I can't seem to replicate it though - might be isolated to linux, I tried on mac.

Huh, that is interesting...
Did you figure out what’s wrong here?
> This seems to be fixed on main because the mins shape is 1 x 1 as expected.

Which main? BoTorch or GPyTorch? I recently landed a fix to gpytorch's...
Hmm, then I don't know what's going on here...
Thanks for flagging this. I *think* I know what's going on; it looks like this issue was introduced in #804. Let me see how hard it is to fix this.
Hi @samueljamesbell, thanks for your interest in using `qNegIntegratedPosteriorVariance` in Ax. Most of this should be pretty straightforward: you'd essentially add a clone of https://github.com/pytorch/botorch/blob/main/botorch/acquisition/input_constructors.py#L448-L483 to `botorch/acquisition/input_constructors.py`, but instead of...
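To illustrate the general shape of what's involved, here is a minimal pure-Python sketch of the decorator-based registry pattern that `input_constructors.py` uses. All names below are illustrative stand-ins, not the actual BoTorch API; the real module registers constructors against the real acquisition function classes.

```python
# Sketch of a decorator-based input-constructor registry, mimicking the
# pattern in botorch/acquisition/input_constructors.py. Names are
# hypothetical; the real registry and decorator live in BoTorch itself.

ACQF_INPUT_CONSTRUCTOR_REGISTRY = {}


def acqf_input_constructor(*acqf_classes):
    """Register a function that builds constructor kwargs for the given
    acquisition function classes."""
    def decorator(func):
        for acqf_cls in acqf_classes:
            ACQF_INPUT_CONSTRUCTOR_REGISTRY[acqf_cls] = func
        return func
    return decorator


class qNegIntegratedPosteriorVariance:
    """Stand-in for the real BoTorch acquisition function class."""


@acqf_input_constructor(qNegIntegratedPosteriorVariance)
def construct_inputs_qNIPV(model, mc_points, **kwargs):
    # A real constructor would derive e.g. `mc_points` and a sampler
    # from the training data / options; this stub just packages kwargs.
    return {"model": model, "mc_points": mc_points}


# Downstream code looks the constructor up by acquisition function class:
inputs = ACQF_INPUT_CONSTRUCTOR_REGISTRY[qNegIntegratedPosteriorVariance](
    model="my_model", mc_points=[[0.5]]
)
```

The point of the registry is that Ax (or any caller) can construct acquisition-function kwargs generically from the class alone, without special-casing each acquisition function.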
> Is it a normal behavior or is the function somehow ill defined?

It seems like you're running into some numerical issues because of very ill-conditioned kernel matrices. This isn't...
Thanks for the observation and for raising this point. I believe that when it comes to modeling constraints (rather than defining test problems), our convention is fairly consistent: a negative value of...
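As a tiny illustration of a "negative means feasible" sign convention (the function and threshold below are hypothetical, not BoTorch's actual constraint API): a constraint callable returns g(x), and a point is feasible iff g(x) <= 0.

```python
# Hypothetical constraint: the sum of the inputs must not exceed 1.
# Written so that g(x) <= 0 means "feasible", matching a negative-is-feasible
# sign convention.
def constraint(x):
    return sum(x) - 1.0


def is_feasible(x):
    return constraint(x) <= 0.0


feasible_point = is_feasible([0.3, 0.4])    # g = -0.3, feasible
infeasible_point = is_feasible([0.8, 0.9])  # g = +0.7, infeasible
```

Keeping one sign convention for constraints everywhere matters because downstream code (e.g. feasibility-weighted acquisition values) only checks the sign of g(x), not what the constraint means.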