tune
Make Bayesian optimization more fault tolerant
The search currently stops if the GP fit fails or if one of the model fits fails. In the first case we can fall back to random sampling for the next candidates; in the second, we can fit the GP to the complete (non-failed) results only and keep going.
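As a rough sketch of that fault-tolerance logic (not tune's implementation), the loop below uses scikit-learn's `GaussianProcessRegressor`. The random-sampling fallback, the `safe_eval` helper, and the lower-confidence-bound acquisition are illustrative assumptions, not part of the package.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor


def safe_eval(objective, x):
    """Run one model fit; return NaN instead of raising so the search continues."""
    try:
        return float(objective(x))
    except Exception:
        return float("nan")


def propose_candidate(gp, bounds, rng, n_samples=500):
    """Pick the next candidate; degrade to a random sample if the GP is unusable."""
    candidates = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_samples, bounds.shape[0]))
    if gp is None:
        # GP fit failed: keep the search alive with a random candidate.
        return candidates[0]
    mean, std = gp.predict(candidates, return_std=True)
    # Lower-confidence-bound acquisition (minimization).
    return candidates[np.argmin(mean - 1.96 * std)]


def bayes_opt(objective, bounds, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    X, y = [], []
    # Seed with a few random points.
    for _ in range(3):
        x = rng.uniform(bounds[:, 0], bounds[:, 1])
        X.append(x)
        y.append(safe_eval(objective, x))
    for _ in range(n_iter):
        # Fit the GP only to complete results, i.e. evaluations that did not fail.
        ok = [i for i, v in enumerate(y) if np.isfinite(v)]
        gp = None
        if len(ok) >= 2:
            try:
                gp = GaussianProcessRegressor(normalize_y=True).fit(
                    np.asarray(X)[ok], np.asarray(y)[ok]
                )
            except Exception:
                gp = None  # GP fit failed; next candidate is drawn at random.
        x_next = propose_candidate(gp, bounds, rng)
        X.append(x_next)
        y.append(safe_eval(objective, x_next))
    best = int(np.nanargmin(y))
    return X[best], y[best]
```

The point of the sketch is that neither a failed GP fit nor a failed model evaluation raises out of the loop: the former switches one iteration to random sampling, the latter is recorded as NaN and simply excluded from later GP fits.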