
use hyperas to tune learning rate

pxlong opened this issue on Jun 22 '16 · 8 comments

Hi, first of all, this library is awesome! Thanks for your contributions. I am just wondering whether there is a way to use hyperas to tune the learning rate, since I have not found related code in the examples.

Thanks!

pxlong avatar Jun 22 '16 18:06 pxlong

Hey @pxlong, thanks for the compliment and apologies for not coming back to you earlier. The philosophy is that you can use double-brace templates literally anywhere in the model-providing function, in particular for the learning rate and other optimiser settings.
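
For instance, a minimal sketch of a model-providing function (untested; the network, data signature and lr range are just placeholders, not from the hyperas examples):

from hyperopt import STATUS_OK
from hyperas.distributions import uniform
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam

def model(x_train, y_train, x_test, y_test):
    model = Sequential()
    model.add(Dense(64, activation='relu', input_shape=(x_train.shape[1],)))
    model.add(Dense(1, activation='sigmoid'))
    # each {{...}} template becomes one entry in the hyperopt search space
    model.compile(loss='binary_crossentropy',
                  optimizer=Adam(lr={{uniform(0.0001, 0.01)}}),
                  metrics=['accuracy'])
    model.fit(x_train, y_train, epochs=2, batch_size=64, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    return {'loss': -acc, 'status': STATUS_OK, 'model': model}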

maxpumperla avatar Jul 26 '16 07:07 maxpumperla

Hey @pxlong, did you manage to tune the learning rate?

I tried

model.compile(loss='binary_crossentropy', metrics=['accuracy'],
              optimizer={{choice([RMSprop(lr={{uniform(0, 1)}}), Adam(lr={{uniform(0, 1)}}), SGD(lr={{uniform(0, 1)}})])}})

but this returned

File "temp_model.py", line 123 optimizer=space['optimizer']), Adam(lr=space['lr']), SGD(lr=space['dropout_U_2']))}}) ^ SyntaxError: invalid syntax

How can I do better?

davidlenz avatar Feb 18 '17 19:02 davidlenz

Hey @davidlenz , did you find a solution to your problem?

Did you try with only RMSprop or only Adam to see if you get the same problem? If a single optimizer fails the same way, I would define the optimizers separately and then make the choice with strings, something like the sketch below.
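
A rough sketch of what I mean (untested):

from keras.optimizers import SGD, RMSprop

sgd = SGD(lr={{uniform(0, 1)}})
rmsprop = RMSprop(lr={{uniform(0, 1)}})
choiceval = {{choice(['sgd', 'rmsprop'])}}
optim = sgd if choiceval == 'sgd' else rmsprop
model.compile(loss='binary_crossentropy', metrics=['accuracy'], optimizer=optim)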

mpariente avatar Feb 20 '17 15:02 mpariente

Hello @pianomanu, I didn't get it to work yet; haven't had the time for much trying, though. I did

adam = Adam(lr={{uniform(0, 1)}})
model.compile(loss='binary_crossentropy', metrics=['accuracy'],
              optimizer={{choice([adam])}})

but this says

C:\Users\gcfghh\Notebooks\temp_model.py in get_space()

NameError: name 'adam' is not defined

When I do

adam = Adam(lr={{uniform(0, 1)}})
model.compile(loss='binary_crossentropy', metrics=['accuracy'],
              optimizer={{choice(['adam'])}})

it works, but I guess this just takes the default learning rate (passing the string 'adam' makes Keras build a fresh optimizer with default settings, so the templated instance above is never used).

I'll keep you updated, let me know if you have any suggestions.

davidlenz avatar Feb 24 '17 16:02 davidlenz

Hey,

I get this error if I do it like you did, though not exactly the same one:

Codes/temp_model.py in get_space()
NameError: global name 'adam' is not defined

However, this works for me:

adam = keras.optimizers.Adam(lr={{uniform(0, 1)}})
model.compile(loss='mean_squared_error', optimizer=adam, metrics=['mean_squared_error'])

I guess it's not what you want because you also want to compare optimizers, but you can do it by hand: compile the model twice, compare the performance of the two best runs and choose... It isn't very practical, though.
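
As a side note, uniform(0, 1) is a very wide linear range for a learning rate; hyperas also exposes hyperopt's log-scale distributions, so a sketch like this (the bounds are just an example) may search more sensibly:

from hyperas.distributions import loguniform

# loguniform(low, high) samples exp(uniform(low, high)),
# so these bounds cover roughly 1e-4 to 1e-1
adam = keras.optimizers.Adam(lr={{loguniform(-9.2, -2.3)}})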

mpariente avatar Feb 26 '17 12:02 mpariente

Thanks a lot @pianomanu! This is actually enough of a workaround for my use case.

davidlenz avatar Apr 13 '17 11:04 davidlenz

I have also recently tried to use Hyperas for tuning the learning rate. The runs progress nicely without any issues, but I never see the 'best' lr parameter value together with the other optimal values (units, optimizers, dropout etc.). Is this on purpose? How can I retrieve the optimal learning rate value?

Dahlasam avatar Apr 27 '18 10:04 Dahlasam

Found this somewhere; hope it's helpful:

# one lr-tuned instance of each optimizer
adam = keras.optimizers.Adam(lr={{choice([10**-3, 10**-2, 10**-1])}})
rmsprop = keras.optimizers.RMSprop(lr={{choice([10**-3, 10**-2, 10**-1])}})
sgd = keras.optimizers.SGD(lr={{choice([10**-3, 10**-2, 10**-1])}})

# pick an optimizer by name, then map the name back to the instance above
choiceval = {{choice(['adam', 'sgd', 'rmsprop'])}}
if choiceval == 'adam':
    optim = adam
elif choiceval == 'rmsprop':
    optim = rmsprop
else:
    optim = sgd

model.compile(loss='categorical_crossentropy', metrics=['accuracy'], optimizer=optim)
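
To answer @Dahlasam's question above: the first value returned by optim.minimize is a dict with one entry per {{...}} template, so the chosen learning rate can be read from there. A sketch (model and data here are the usual hyperas entry-point functions):

from hyperopt import Trials, tpe
from hyperas import optim

best_run, best_model = optim.minimize(model=model, data=data,
                                      algo=tpe.suggest, max_evals=10,
                                      trials=Trials())
# for choice entries hyperopt records the index of the chosen option
print(best_run)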

alinisarhaider avatar May 14 '19 06:05 alinisarhaider