hyperas
use hyperas to tune learning rate
Hi, first of all, this library is awesome!!! Thanks for your contributions. I am just wondering if there is a way to use hyperas to tune the learning rate, since I have not found any related code in the examples.
Thanks!
Hey @pxlong, thanks for the compliment and apologies for not coming back to you earlier. The philosophy is that you can use double-brace templates literally everywhere you want in the model-providing function, in particular the learning rate and other choices for optimisers.
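For example, here is a minimal sketch of what that looks like (the layer sizes, data arguments, and lr range are placeholders, not something prescribed by hyperas):

from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam
from hyperopt import STATUS_OK
from hyperas.distributions import uniform

def create_model(x_train, y_train, x_test, y_test):
    model = Sequential()
    model.add(Dense(64, activation='relu', input_shape=(x_train.shape[1],)))
    model.add(Dense(1, activation='sigmoid'))
    # hyperas substitutes a sampled value for the template on every run
    model.compile(loss='binary_crossentropy',
                  optimizer=Adam(lr={{uniform(0.0001, 0.01)}}),
                  metrics=['accuracy'])
    model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    return {'loss': -acc, 'status': STATUS_OK, 'model': model}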
Hey @pxlong, did you manage to tune the learning rate?
I tried
model.compile(loss='binary_crossentropy', metrics=['accuracy'],
optimizer={{choice([RMSprop(lr={{uniform(0,1)}}), Adam(lr={{uniform(0,1)}}), SGD(lr={{uniform(0, 1)}})])}})
but this returned
File "temp_model.py", line 123 optimizer=space['optimizer']), Adam(lr=space['lr']), SGD(lr=space['dropout_U_2']))}}) ^ SyntaxError: invalid syntax
How can I do better?
Hey @davidlenz , did you find a solution to your problem?
Did you try with only RMSprop or Adam to see if you get the same problem? If you have the same problem with a single optimizer, I would define the optimizers separately and then make the choice with strings.
Hello @pianomanu, I didn't get it to work yet, though I haven't had much time to try. I did
adam = Adam(lr={{uniform(0,1)}})
model.compile(loss='binary_crossentropy', metrics=['accuracy'],
optimizer={{choice([adam])}})
but this says
C:\Users\gcfghh\Notebooks\temp_model.py in get_space()
NameError: name 'adam' is not defined
When I do
adam = Adam(lr={{uniform(0,1)}})
model.compile(loss='binary_crossentropy', metrics=['accuracy'],
optimizer={{choice(['adam'])}})
it works, but I guess this just uses the default learning rate.
I'll keep you updated, let me know if you have any suggestions.
Hey,
I get a similar error (not exactly the same) when I do it like you did:
Codes/temp_model.py in get_space()
NameError: global name 'adam' is not defined
However, this works for me:
adam=keras.optimizers.Adam(lr={{uniform(0,1)}})
model.compile(loss='mean_squared_error', optimizer=adam, metrics=['mean_squared_error'])
I guess it's not quite what you want, because you also want to compare optimizers, but you can do that by hand: compile the model twice, compare the performance of the two best runs, and choose... It isn't very practical, though.
Thanks a lot @pianomanu! This is actually enough of a workaround for my use case.
I have also recently tried to use Hyperas for tuning the learning rate. The runs progress nicely without any issues, but I never see the 'best' lr value together with the other optimal values (units, optimizers, dropout, etc.). Is this on purpose? How can I retrieve the optimal learning rate value?
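One way that should work, assuming the usual hyperas entry point (a data-providing function called data and a model function like create_model above): the sampled values come back in the best_run dict returned by optim.minimize. A sketch; note that the key names such as 'lr' are generated by hyperas from the surrounding code, so yours may differ, and as far as I can tell choice() entries are reported as indices into their option list.

from hyperopt import Trials, tpe
from hyperas import optim

best_run, best_model = optim.minimize(model=create_model,
                                      data=data,
                                      algo=tpe.suggest,
                                      max_evals=20,
                                      trials=Trials())

# best_run is a plain dict of the winning hyperparameter assignments,
# e.g. something like {'lr': 0.0031, 'choiceval': 1, ...}
print(best_run)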
Found this somewhere. Hope it's helpful:
adam = keras.optimizers.Adam(lr={{choice([10**-3, 10**-2, 10**-1])}})
rmsprop = keras.optimizers.RMSprop(lr={{choice([10**-3, 10**-2, 10**-1])}})
sgd = keras.optimizers.SGD(lr={{choice([10**-3, 10**-2, 10**-1])}})
choiceval = {{choice(['adam', 'sgd', 'rmsprop'])}}
if choiceval == 'adam':
    optim = adam
elif choiceval == 'rmsprop':
    optim = rmsprop
else:
    optim = sgd

model.compile(loss='categorical_crossentropy', metrics=['accuracy'], optimizer=optim)
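This pattern works where the earlier attempts failed, presumably because hyperas lifts every {{...}} template into the generated get_space() function in temp_model.py: plain strings and numbers survive that extraction, whereas names like adam are not defined in that generated scope (hence the NameError above), and nesting one template inside another is not valid either (hence the SyntaxError). The if/elif block then maps the chosen string back to the locally defined optimizer at run time.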