Is it possible to use CheckpointSaver with BayesSearchCV?
As this question hasn't generated any answers and the OP didn't provide any details, let me try to elaborate on what I think was meant. I'm trying to optimize the hyperparameters of a classifier. In case the process is interrupted, a checkpoint saver is used, similar to how it's done in the documentation:
from skopt import BayesSearchCV
from skopt.space import Real, Categorical, Integer
from sklearn.datasets import load_iris
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from skopt.callbacks import CheckpointSaver

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                    train_size=0.75,
                                                    random_state=0)

# n_iter is deliberately huge so the search effectively runs until interrupted
opt = BayesSearchCV(
    SVC(),
    {
        'C': Real(1e-6, 1e+6, prior='log-uniform'),
        'gamma': Real(1e-6, 1e+1, prior='log-uniform'),
        'degree': Integer(1, 8),
        'kernel': Categorical(['linear', 'poly', 'rbf']),
    },
    n_iter=1000000,
    random_state=0
)

# the callback dumps a checkpoint to disk after every optimization step
checkpoint_saver = CheckpointSaver("./checkpoint.pkl", compress=9)
opt.fit(X_train, y_train, callback=[checkpoint_saver])
I run this in a notebook, and after a while I interrupt the kernel. The last checkpoint can now be restored:
from skopt import load
res = load('./checkpoint.pkl')
But how can I use this restored checkpoint to continue fitting? The documentation only shows how to do it with the gp_minimize function.
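For reference, this is the resume pattern the documentation describes for gp_minimize: the evaluated points stored in the checkpoint are passed back in through x0/y0 and the search simply continues. The snippet below restates it with a toy 1-D objective purely for illustration; it does not use the BayesSearchCV checkpoint from above, which is exactly the gap I'm asking about.

import numpy as np
from skopt import gp_minimize, load
from skopt.callbacks import CheckpointSaver

# toy 1-D objective, only to illustrate the documented resume mechanism
def obj_fun(x):
    return np.sin(5 * x[0]) * (1 - np.tanh(x[0] ** 2))

checkpoint_saver = CheckpointSaver("./gp_checkpoint.pkl", compress=9)

# first (later interrupted) run, checkpointing after every call
gp_minimize(obj_fun, [(-20.0, 20.0)],
            n_calls=10, n_initial_points=3,
            callback=[checkpoint_saver], random_state=0)

# resume: reload the checkpoint and hand the evaluated points back in
res = load("./gp_checkpoint.pkl")
gp_minimize(obj_fun, [(-20.0, 20.0)],
            x0=res.x_iters,          # points evaluated so far
            y0=list(res.func_vals),  # their objective values
            n_calls=10,              # additional evaluations
            n_initial_points=3,
            callback=[checkpoint_saver],
            random_state=0)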
Thanks for the added detail; that's exactly what I was getting at. Perhaps this functionality simply isn't available.
In case any answers turn up on Stack Overflow rather than here, the question is cross-posted at:
https://stackoverflow.com/questions/73344991/how-to-restart-bayessearchcv-from-a-checkpoint
After looking through the code for a bit, my initial impression is the same as @davidfstein's: the functionality just isn't there. @matthiashhh @davidfstein, if you came up with a solution, could you post it here? It would be helpful for future users.
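For anyone who lands here before this is supported, here is one possible workaround, as an untested sketch rather than an official API: skip BayesSearchCV, drive the search with skopt.Optimizer directly, and do the cross-validation by hand with cross_val_score. Because the loop is explicit, previously checkpointed points can be replayed through tell() before continuing. The cv=3 setting, the 50 extra iterations, and the manual_checkpoint.pkl filename below are arbitrary choices.

from skopt import Optimizer, load
from skopt.callbacks import CheckpointSaver
from skopt.space import Real, Categorical, Integer
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                    train_size=0.75,
                                                    random_state=0)

# same search space as the BayesSearchCV example above, as a plain list of dimensions
dimensions = [
    Real(1e-6, 1e+6, prior='log-uniform', name='C'),
    Real(1e-6, 1e+1, prior='log-uniform', name='gamma'),
    Integer(1, 8, name='degree'),
    Categorical(['linear', 'poly', 'rbf'], name='kernel'),
]

def objective(params):
    C, gamma, degree, kernel = params
    clf = SVC(C=C, gamma=gamma, degree=degree, kernel=kernel)
    # skopt minimizes, so return the negated cross-validated accuracy
    return -cross_val_score(clf, X_train, y_train, cv=3).mean()

optimizer = Optimizer(dimensions, random_state=0)

# if a checkpoint from an earlier run of this loop exists, replay its points
try:
    previous = load('./manual_checkpoint.pkl')
    optimizer.tell(previous.x_iters, list(previous.func_vals))
except FileNotFoundError:
    pass

checkpoint_saver = CheckpointSaver('./manual_checkpoint.pkl', compress=9)
for _ in range(50):                      # however many further iterations you want
    x = optimizer.ask()
    result = optimizer.tell(x, objective(x))
    checkpoint_saver(result)             # CheckpointSaver is callable and dumps `result`

In principle the checkpoint written by BayesSearchCV itself could be replayed the same way, since its callback also receives an OptimizeResult, but the dimension ordering and the sign convention of the stored values would have to be checked against the skopt internals first.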