Batch (multi-trial) evaluation performance in the Service API
Hello,
As suggested in a previous issue, I'm using trials, is_complete = ax_client.get_next_trials(max_trials=5) in the Service API to get 5 trials at a time and evaluate them (the generation strategy is set to Models.GPEI).
Is this equivalent to using batch Expected Improvement in BoTorch in terms of performance (convergence speed of the optimization)? (Reading the comments in the BoTorch tutorial, it seems not.) If not, how can I use batch Expected Improvement with the Service API?
Thanks
Yes, batch EI is used under the hood when doing parallel optimization in Ax. In particular, we use qNEI (q-Noisy Expected Improvement) from BoTorch.
That's great! Thanks for your answer.
The default in the Service API is actually to use EI with sequential conditioning rather than solving the full d * q-dimensional problem jointly as in qNEI (where d is the parameter dimension and q is the number of trials). This has regret guarantees and usually works better in practice due to the optimization problem being easier.
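To make the sequential-conditioning idea concrete, here is a toy, pure-Python sketch. The acquisition function below is an illustrative stand-in, not Ax's actual qNEI: each of the q points is chosen by a separate search over the input space, with previously chosen points discounting the acquisition nearby (mimicking how fantasized observations reduce EI around them), instead of one joint d * q-dimensional search:

```python
import math

def base_acq(x):
    # Toy single-point acquisition value (stand-in for EI).
    return math.sin(3.0 * x) + 1.0

def conditioned_acq(x, selected):
    # Toy "conditioning": discount candidates close to already-chosen
    # points, so the next pick is drawn away from them.
    penalty = sum(math.exp(-((x - s) ** 2) / 0.01) for s in selected)
    return base_acq(x) * math.exp(-penalty)

def sequential_batch(q, candidates):
    # Greedy sequential conditioning: q separate 1-D searches
    # instead of one joint q-dimensional search.
    selected = []
    for _ in range(q):
        best = max(candidates, key=lambda x: conditioned_acq(x, selected))
        selected.append(best)
    return selected

candidates = [i / 200.0 for i in range(201)]  # grid on [0, 1]
batch = sequential_batch(3, candidates)
print(batch)
```

In BoTorch itself, this corresponds to optimizing the acquisition one candidate at a time (sequential greedy) rather than over all q candidates jointly; the greedy loop above is only meant to show the structure of that choice.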
This seems resolved; please reopen if you have any follow-ups, @Kh-im!