
No dropout in last hidden layer?

Open j6e opened this issue 8 years ago • 1 comments

I've been working with your code lately and I've noticed that in keras_mlp.py, in both models, the last hidden layer never gets dropout applied:

from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.layers.normalization import BatchNormalization as BatchNorm

model = Sequential()
model.add( Dense( params['layer_1_size'], init = params['init'],
    activation = params['layer_1_activation'], input_dim = input_dim ))

for i in range( int( params['n_layers'] ) - 1 ):

    # extras (dropout/batchnorm) for layer i+1 are added here, before the
    # Dense layer i+2 below -- but the loop runs n_layers - 1 times, so the
    # extras of the last hidden layer are never reached
    extras = 'layer_{}_extras'.format( i + 1 )

    if params[extras]['name'] == 'dropout':
        model.add( Dropout( params[extras]['rate'] ))
    elif params[extras]['name'] == 'batchnorm':
        model.add( BatchNorm())

    model.add( Dense( params['layer_{}_size'.format( i + 2 )], init = params['init'],
        activation = params['layer_{}_activation'.format( i + 2 )]))

model.add( Dense( 1, init = params['init'], activation = 'linear' ))

As can be seen in the code, the last hidden layer can never have dropout: the loop adds each layer's extras before the Dense layer that follows it, and it stops one iteration before the last hidden layer, so that layer's extras are never consulted. Is this intentional, or is it undesired behaviour?
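To make the ordering concrete, here is my own trace of the layer sequence the quoted loop produces (this is an illustration I wrote, not code from the repo), for `n_layers = 2` with dropout requested on both hidden layers:

```python
# Replicate the quoted loop's layer ordering as a plain list of labels,
# without building an actual Keras model.

def trace_original_order(params):
    """Return the layer sequence the original loop would add."""
    plan = [('dense', params['layer_1_size'])]
    for i in range(int(params['n_layers']) - 1):
        extras = params['layer_{}_extras'.format(i + 1)]
        if extras['name'] == 'dropout':
            plan.append(('dropout', extras['rate']))
        elif extras['name'] == 'batchnorm':
            plan.append(('batchnorm', None))
        plan.append(('dense', params['layer_{}_size'.format(i + 2)]))
    plan.append(('dense', 1))  # linear output layer
    return plan

params = {
    'n_layers': 2,
    'layer_1_size': 64, 'layer_1_extras': {'name': 'dropout', 'rate': 0.5},
    'layer_2_size': 32, 'layer_2_extras': {'name': 'dropout', 'rate': 0.25},
}
print(trace_original_order(params))
# [('dense', 64), ('dropout', 0.5), ('dense', 32), ('dense', 1)]
# -> layer_2_extras is never consulted: no dropout after the last hidden layer
```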

j6e avatar Nov 23 '17 13:11 j6e

It is probably a bug, yeah. Keras docs show dropout applied before the last layer: https://keras.io/getting-started/sequential-model-guide/

zygmuntz avatar Nov 23 '17 18:11 zygmuntz
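One possible fix (a sketch of the reordering idea only, not an actual patch from the repo) is to run the loop over all `n_layers` layers and append each layer's extras immediately after its Dense layer, so the last hidden layer is covered too. Again illustrated with plain layer labels rather than a Keras model:

```python
# Sketch of a corrected ordering: extras (dropout/batchnorm) follow the
# Dense layer they belong to, including the last hidden layer.

def build_fixed_order(params):
    """Return the layer sequence with extras applied after every hidden layer."""
    plan = [('dense', params['layer_1_size'])]
    n = int(params['n_layers'])
    for i in range(n):
        extras = params['layer_{}_extras'.format(i + 1)]
        if extras['name'] == 'dropout':
            plan.append(('dropout', extras['rate']))
        elif extras['name'] == 'batchnorm':
            plan.append(('batchnorm', None))
        if i + 1 < n:  # all hidden Dense layers after the first
            plan.append(('dense', params['layer_{}_size'.format(i + 2)]))
    plan.append(('dense', 1))  # linear output layer
    return plan

params = {
    'n_layers': 2,
    'layer_1_size': 64, 'layer_1_extras': {'name': 'dropout', 'rate': 0.5},
    'layer_2_size': 32, 'layer_2_extras': {'name': 'dropout', 'rate': 0.25},
}
print(build_fixed_order(params))
# [('dense', 64), ('dropout', 0.5), ('dense', 32), ('dropout', 0.25), ('dense', 1)]
# -> dropout now follows the last hidden layer as well
```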