
How would you suggest using hyperas to find the best combination of hyperparameters?


Hi @maxpumperla, I have recently tried hyperas for training an LSTM for character-level text generation, but the hyperparameters it returns do not seem to be the best combination. How would you suggest using it? For example, should we group certain parameters to optimize together while keeping the others fixed? I mean the whole process, really. Thanks in advance!

twangnh avatar May 26 '16 13:05 twangnh
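For context, here is a minimal, hypothetical sketch of what such a "whole process" with hyperas can look like for a character-level LSTM: the layer size, dropout rate and optimizer are left to the search via hyperas templates. The data() function below is only a synthetic stand-in for a real corpus loader, and the exact choices are placeholders, not values from this thread.

from hyperopt import Trials, STATUS_OK, tpe
from hyperas import optim
from hyperas.distributions import choice, uniform

from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout
import numpy as np


def data():
    # Placeholder: random one-hot sequences standing in for a real text corpus.
    vocab_size, seq_len = 50, 40
    x = np.random.rand(1000, seq_len, vocab_size)
    y = np.eye(vocab_size)[np.random.randint(0, vocab_size, 1000)]
    return x[:800], y[:800], x[800:], y[800:]


def create_model(x_train, y_train, x_test, y_test):
    model = Sequential()
    # Hyperparameters to be optimized are written as hyperas templates:
    model.add(LSTM({{choice([128, 256, 512])}},
                   input_shape=x_train.shape[1:]))
    model.add(Dropout({{uniform(0, 0.5)}}))
    model.add(Dense(y_train.shape[1], activation='softmax'))
    model.compile(loss='categorical_crossentropy',
                  optimizer={{choice(['rmsprop', 'adam'])}},
                  metrics=['accuracy'])
    model.fit(x_train, y_train, batch_size=64, epochs=1, verbose=0)
    loss, acc = model.evaluate(x_test, y_test, verbose=0)
    # hyperopt minimizes the returned 'loss', so report negative accuracy
    return {'loss': -acc, 'status': STATUS_OK, 'model': model}


best_run, best_model = optim.minimize(model=create_model,
                                      data=data,
                                      algo=tpe.suggest,
                                      max_evals=10,
                                      trials=Trials())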

That's a tough question, open research actually! It's to be expected that hyperas doesn't always find good solutions, since usually the space to explore is very high-dimensional. From a theoretical point of view TPE should yield the best results, followed by random search. Other than that you have to see what your specific problem requires.

I struggle with this type of question for my own projects as well, so at least you're in good company. :)

maxpumperla avatar Jul 26 '16 09:07 maxpumperla
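The search strategy mentioned above is selected via the algo argument of optim.minimize, so comparing TPE against random search is, in a sketch, just a matter of swapping hyperopt's suggest functions. The create_model and data functions and the max_evals budget here are placeholders:

from hyperopt import Trials, tpe, rand
from hyperas import optim

# Tree-structured Parzen Estimator (usually the better-informed search)
best_tpe, _ = optim.minimize(model=create_model, data=data,
                             algo=tpe.suggest, max_evals=30, trials=Trials())

# Plain random search, as a baseline for comparison
best_rand, _ = optim.minimize(model=create_model, data=data,
                              algo=rand.suggest, max_evals=30, trials=Trials())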

Is there a way to return the best_run's actual parameter values instead of their indices, e.g. for {{choice([a, b])}} it currently returns 0 or 1? Unfortunately I cannot use space_eval() from hyperopt's fmin module. @maxpumperla

ben0it8 avatar Apr 20 '17 11:04 ben0it8
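To make the question concrete: with a template such as {{choice(['relu', 'tanh'])}}, best_run reports the index that hyperopt sampled rather than the value itself, roughly like this (hypothetical output):

# model definition contains e.g.:
#   model.add(Activation({{choice(['relu', 'tanh'])}}))
print(best_run)
# -> {'Activation': 1, ...}   # 1 is the index of 'tanh', not the string itself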

@ben0it8 Check the complex.py example. The parameters are returned in best_run; to print them, use e.g.

print("Parameters of best run", best_run)

EDIT: This only works for non-choice cases.

pkainz avatar May 29 '17 10:05 pkainz

@ben0it8 This issue has been addressed in PR #104 and should resolve your problem.

pkainz avatar Jun 19 '17 13:06 pkainz

@pkainz After running the complex.py example, I got this result:

 {'Dropout': 0.03323327852409652, 'Dense': 2, 'Dropout_1': 0.0886198698550964, 'add': 1, 'conditional': 1, 'batch_size': 0, 'optimizer': 0, 'Activation': 1}

Is that correct? My hyperas version is 0.4, and I want to get the real values for Dense and Activation. How can I do that? Thanks!

resuly avatar Aug 03 '17 15:08 resuly

@resuly To get the real value, you need to evaluate the hyperparameter space.

Option 1 - from scratch
Change the call to the optimization function to this one:

from hyperopt import Trials, tpe
from hyperas import optim

best_run, best_model, space = optim.minimize(
    model=model,
    data=data,
    algo=tpe.suggest,
    max_evals=5,
    trials=Trials(),
    eval_space=True,   # <-- this is the line that puts real values into 'best_run'
    return_space=True  # <-- this allows you to save the space for later evaluations
)
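With eval_space=True, the dict in best_run should then contain the sampled values themselves rather than indices; hypothetically, depending on your search space, something like:

print("Parameters of best run", best_run)
# e.g. {'Dense': 512, 'Dropout': 0.0332..., 'Activation': 'relu', ...}
# instead of {'Dense': 2, 'Dropout': 0.0332..., 'Activation': 1, ...}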

Option 2 - after having run the optimizations
You need access to the hyperparameter space that gets created by the hyperopt package, and to the parameter dict (the one you posted in your last comment). Then you can use the function hyperas.utils.eval_hyperopt_space(space, vals) to extract the real values, e.g.:

from hyperas.utils import eval_hyperopt_space
real_param_values = eval_hyperopt_space(space, best_run)
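Put together, a run that leaves eval_space at its default can still be decoded afterwards, as long as the space object from return_space=True is kept around. A sketch under those assumptions (create_model and data are placeholders):

from hyperopt import Trials, tpe
from hyperas import optim
from hyperas.utils import eval_hyperopt_space

# best_run holds raw indices here, since eval_space is not set
best_run, best_model, space = optim.minimize(model=create_model, data=data,
                                             algo=tpe.suggest, max_evals=5,
                                             trials=Trials(), return_space=True)

real_param_values = eval_hyperopt_space(space, best_run)
print(real_param_values)  # choice entries now show the actual values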

pkainz avatar Aug 07 '17 11:08 pkainz

@pkainz Thank you so much.

resuly avatar Aug 08 '17 06:08 resuly