Eytan Bakshy

Results: 27 comments by Eytan Bakshy

Yes, batch EI is used under the hood when doing parallel optimization in Ax. In particular, we use qNEI from BoTorch.
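
A minimal sketch of roughly what this looks like at the BoTorch level (Ax wires this up for you); the toy data and bounds below are hypothetical, and exact import paths may vary a bit by BoTorch version:

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import qNoisyExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Hypothetical observations of a 2D objective.
train_X = torch.rand(8, 2, dtype=torch.double)
train_Y = -((train_X.sum(dim=-1, keepdim=True) - 1.0) ** 2)

model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(model.likelihood, model))

# qNEI scores a batch of q candidates jointly, integrating over noisy observations.
qnei = qNoisyExpectedImprovement(model=model, X_baseline=train_X)

# Ask for q=4 points to evaluate in parallel.
candidates, _ = optimize_acqf(
    acq_function=qnei,
    bounds=torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double),
    q=4,
    num_restarts=10,
    raw_samples=128,
)
```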

@bletham, you might find this problem interesting...

Hi Nick, how were you planning on using this gradient information in your model and acquisition function? @lena-kashtelyan might have the best idea of whether there might be convenient ways...

Yes, at some point we eliminated the magic variable in experiment. Returning false is how folks should disable logging.

I know that Max is out for the next week. BoTorch has support for MTGPs with fixed noise... would something like https://botorch.org/v/0.1.0/api/models.html#fixednoisemultitaskgp help?
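
A minimal sketch of what that could look like, assuming the FixedNoiseMultiTaskGP linked above; the data, task encoding, and noise values are hypothetical:

```python
import torch
from botorch.models import FixedNoiseMultiTaskGP

# Hypothetical training data for two tasks; the last column of train_X is the task index.
X = torch.rand(10, 2, dtype=torch.double)
task_idx = torch.randint(0, 2, (10, 1), dtype=torch.double)
train_X = torch.cat([X, task_idx], dim=-1)
train_Y = torch.rand(10, 1, dtype=torch.double)
train_Yvar = torch.full_like(train_Y, 0.01)  # known per-point observation noise

# Multi-task GP with fixed (known) observation noise.
model = FixedNoiseMultiTaskGP(
    train_X=train_X,
    train_Y=train_Y,
    train_Yvar=train_Yvar,
    task_feature=-1,  # column of train_X that encodes the task
)
```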

IIRC @danielrjiang and @Balandat had done some experiments w/ KG + hetGP models using a GP to model the variance. They might be able to give some advice sometime after...
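
To make the hetGP idea concrete, a small sketch assuming a BoTorch version that ships HeteroskedasticSingleTaskGP, which fits an internal GP to the observation variances; the data and noise estimates here are hypothetical:

```python
import torch
from botorch.models import HeteroskedasticSingleTaskGP

# Hypothetical data with per-observation noise estimates (e.g., from repeated measurements).
train_X = torch.rand(20, 3, dtype=torch.double)
train_Y = torch.rand(20, 1, dtype=torch.double)
train_Yvar = 0.05 * torch.rand(20, 1, dtype=torch.double)

# An internal GP models the (log) observation noise, so the noise level
# can vary across the input space instead of being a single scalar.
model = HeteroskedasticSingleTaskGP(train_X, train_Y, train_Yvar)

# Posterior at new points reflects the heteroskedastic noise model.
test_X = torch.rand(5, 3, dtype=torch.double)
posterior = model.posterior(test_X)
```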

Are there any cases where we wouldn't want to sample_around_best? If it's so commonly useful for AF optimization, why hide it in an options blob?
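
For reference, this is roughly how it gets toggled today: a sketch assuming optimize_acqf and an already-built acquisition function `acq_func` (hypothetical here), with sample_around_best buried in the options dict:

```python
import torch
from botorch.optim import optimize_acqf

candidates, value = optimize_acqf(
    acq_function=acq_func,  # any acquisition function built elsewhere
    bounds=torch.tensor([[0.0, 0.0], [1.0, 1.0]]),
    q=1,
    num_restarts=20,
    raw_samples=256,
    options={"sample_around_best": True},  # seed some restart points near the incumbent best
)
```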

Hi @mc-robinson, +1 to what @Balandat said—this is a very nice writeup and it would be awesome to have it in our main BoTorch tutorials. Happy to provide feedback and...

One could add an acquisition function that just maximizes posterior variance. This is often used as a baseline in active learning papers and generally performs the worst. I would highly...
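
A minimal sketch of such a variance-maximizing baseline as a custom analytic acquisition function; the class name PosteriorVariance is made up for illustration:

```python
import torch
from botorch.acquisition.analytic import AnalyticAcquisitionFunction
from botorch.utils.transforms import t_batch_mode_transform


class PosteriorVariance(AnalyticAcquisitionFunction):
    """Hypothetical baseline: score each candidate by its posterior variance."""

    @t_batch_mode_transform(expected_q=1)
    def forward(self, X: torch.Tensor) -> torch.Tensor:
        posterior = self.model.posterior(X)
        # variance has shape batch_shape x q=1 x m=1; collapse to one score per candidate.
        return posterior.variance.squeeze(-1).squeeze(-1)
```

For the integrated variant commonly used in active learning, BoTorch also has qNegIntegratedPosteriorVariance.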