Sait Cakmak

Results: 226 comments of Sait Cakmak

> Is there a built-in way in Ax’s modular BoTorch backend to jointly model multiple outputs?

Technically, yes -- though I haven't tried this in quite a while. If you...

Hi @leolin8806 & @leonardoguilhoto. I spent some time looking into this. There isn't a nice off-the-shelf way of doing this, but with some customization, I got something working. Note that...

I think this is great! What I had done was to take out `linear_operator` and any parts of GPyTorch that we didn't need for ExactGPs and create a bare-bones...

The memory issue might be related to https://github.com/pytorch/botorch/issues/2310. Due to some odd broadcasting within PyTorch / GPyTorch, the ensemble models can consume excessive amounts of memory when evaluated with large...
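To make the failure mode concrete, here is a minimal NumPy sketch of the kind of broadcasting blowup described above. The shapes are hypothetical (not taken from the linked issue): broadcasting an ensemble batch dimension against a large test-point dimension forces a huge intermediate array if it is ever materialized.

```python
import numpy as np

# Hypothetical shapes: 512 ensemble members vs. 100k test points,
# feature dimension 8. Broadcasting the two against each other
# yields a (512, 100000, 8) intermediate.
ensemble = (512, 1, 8)
test_pts = (1, 100_000, 8)

out_shape = np.broadcast_shapes(ensemble, test_pts)
n_elements = int(np.prod(out_shape))
gb_if_materialized = n_elements * 8 / 1e9  # float64 bytes

print(out_shape)            # (512, 100000, 8)
print(gb_if_materialized)   # 3.2768
```

The point is that neither input is large on its own; it is the broadcast product of the batch dimensions that consumes the memory.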

The tensor shapes in that operation, particularly the shape of `x2`, are pretty large. Do you know why the -2 / q-batch dimension is 1611? I am guessing it is...

> 32 × 50 = 1600 fantasy points

I am confused about why these 1600 points end up in the q-batch (-2) dimension. When we look at the tensor shape...
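The shape arithmetic being discussed can be sketched in plain NumPy. This is an illustration with a hypothetical feature dimension, not the actual tensors from the thread: if the leading fantasy batch dimensions get flattened, 32 fantasies × 50 candidate points collapse into a single 1600-row q-batch dimension.

```python
import numpy as np

d = 6  # hypothetical feature dimension
# 32 fantasies, each over 50 candidate points.
fantasies = np.zeros((32, 50, d))

# Flattening the leading batch dimensions moves all 1600 points
# into the -2 (q-batch) dimension.
flattened = fantasies.reshape(-1, d)
print(flattened.shape)  # (1600, 6)
```

If the observed q-batch size is 1611 rather than 1600, the extra 11 rows must come from somewhere else, which is why the decomposition matters.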

After writing this, I noticed that I am looking at the latest code rather than the stable 0.14.0. `@average_over_ensemble_models` used to be part of `@t_batch_mode_transform`, now it is a separate...

Yes, that'd be the correct input batch shape. You're adding additional batch dimensions (due to fantasies) to an already batched model (fully Bayesian), so it is natural to run into...
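The batch-shape stacking described here can be sketched with illustrative sizes (all numbers below are hypothetical, not from the thread): a fully Bayesian model already carries an MCMC-sample batch dimension, and fantasizing prepends a fantasy dimension in front of it, so inputs must broadcast against both.

```python
import numpy as np

mcmc_samples = 16   # model batch: ensemble of posterior samples
num_fantasies = 8   # fantasy batch prepended by fantasization
q, d = 4, 3         # q-batch size and feature dimension

model_batch = (mcmc_samples,)
joint_batch = (num_fantasies,) + model_batch  # (8, 16)

# Inputs carry both batch dimensions ahead of the q-batch and feature dims.
X = np.zeros(joint_batch + (q, d))
print(X.shape)  # (8, 16, 4, 3)
```

Each additional layer of batching (fantasies on top of MCMC samples) multiplies the effective batch size, which is why memory and shape errors tend to surface at this point.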

Hi @suttergustavo. The error goes away if you specify `validate_task_values=False` when constructing the model. However, the behavior might be slightly different from what you had prior to 0.16.0 (https://github.com/meta-pytorch/botorch/pull/2960 specifically)....