MetaBO

Requirement for precomputed Gaussian process hyperparameters for test datasets

vamp-ire-tap opened this issue 3 years ago • 2 comments

Dear authors,

Apologies for opening so many issues already; unfortunately, I have another query regarding the method. I would be grateful for your response.

For the evaluation to work, we need to provide the "gp_hyperparameters.pkl" file. This file contains precomputed Gaussian process hyperparameters for the test datasets, namely the lengthscale, variance, and noise variance. These are loaded in the environment as shown below:

    # load gp-hyperparameters
    self.kernel_lengthscale = self.hpo_gp_hyperparameters[dataset]["lengthscale"]
    self.kernel_variance = self.hpo_gp_hyperparameters[dataset]["variance"]
    self.noise_variance = self.hpo_gp_hyperparameters[dataset]["noise_variance"]
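For context, my understanding (an assumption on my part) is that these three values fully specify the surrogate at evaluation time, so a GP could be rebuilt with fixed hyperparameters rather than loaded as a fitted model. A minimal sketch of what I would expect, assuming GPy, with `X_obs`/`y_obs` as hypothetical names for the points observed so far in a BO run:

    import GPy

    # Sketch: rebuild the surrogate from the three stored values;
    # X_obs / y_obs are hypothetical names for the observed points.
    kernel = GPy.kern.RBF(X_obs.shape[1],
                          variance=float(kernel_variance),
                          lengthscale=kernel_lengthscale,
                          ARD=True)
    m = GPy.models.GPRegression(X_obs, y_obs, kernel,
                                noise_var=float(noise_variance))
    m.fix()  # keep all hyperparameters fixed; no re-optimization at test time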

I do not understand why a GP has to be pre-trained on the test dataset beforehand (yielding what is effectively a fitted "model", not just hyperparameters) in order for both MetaBO and TAF to be evaluated on that test dataset.

To generate these GP hyperparameters ("gp_hyperparameters.pkl"), I use the following code for each dataset (both training and test), where "X" and "y" come from "objectives.pkl", the meta-data containing the hyperparameter configurations and their corresponding responses:

    import GPy
    import numpy as np

    kernel = GPy.kern.RBF(X.shape[1], ARD=True)  # ARD RBF kernel
    m = GPy.models.GPRegression(X, y, kernel)
    m.optimize('bfgs', max_iters=200)  # type-II ML fit of kernel and noise hyperparameters
    gp_hyperparameters[dataset] = {'lengthscale': np.array(m['.*lengthscale']),
                                   'variance': np.array(m['rbf.*variance']),
                                   'noise_variance': np.array(m['.*_noise.variance'])}
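I then serialize the resulting dictionary to the expected file, e.g. as follows (whether this matches the exact format the repo expects is again my assumption):

    import pickle

    # Write the per-dataset hyperparameters to the file loaded by the environment.
    with open("gp_hyperparameters.pkl", "wb") as f:
        pickle.dump(gp_hyperparameters, f)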

Could you also confirm whether this is how the "gp_hyperparameters.pkl" file was meant to be generated? From my perspective, the result is a trained GP model, not just the hyperparameters of a GP model.

vamp-ire-tap Mar 08 '21 17:03