
Unable to run tuner when using two model classes through subclassing

Open ShaheenPerveen opened this issue 2 years ago • 1 comments

Bug: Unable to run a keras-tuner search when using two model class objects to separate out model.compile

Reason: To satisfy a code policy I am creating two model classes: one that defines the HyperModel, and another that calls the HyperModel and compiles the model. When I pass the second model class object, the tuner does not throw any error, but it does not run the trials as expected. Please refer to the code below for clarity.

Code:

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, Dense, Flatten
from keras_tuner import HyperModel, HyperParameters, RandomSearch

class BaseHypModel(HyperModel):
    def __init__(self, input_shape):
        self.input_shape = input_shape

    def build(self):
        hp = HyperParameters()
        hidden_units = hp.Choice('units', [10, 5])
        model = Sequential()
        model.add(Conv2D(64, kernel_size=3, activation='relu', input_shape=self.input_shape))
        model.add(Conv2D(32, kernel_size=3, activation='relu'))
        model.add(Flatten())
        model.add(Dense(hidden_units))
        model.add(Dense(1, activation='sigmoid'))
        return model

class HypModel(BaseHypModel):
    def __init__(self, input_shape):
        super().__init__(input_shape)

    def build(self, hp):
        model = super(HypModel, self).build()
        model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
        return model

inp_img_shape = (224, 224, 3)
ModelClass = HypModel(inp_img_shape)
max_trials = 4
tuner_epoch = 3

tuner = RandomSearch(ModelClass,
		objective="val_accuracy",
		max_trials=max_trials,
		executions_per_trial=1,
		overwrite=True,
		directory='myproject',
		project_name='randomsearch')
            
tuner.search(train_generator,
		steps_per_epoch=n_steps,
		epochs=tuner_epoch,
		validation_data=val_generator,
		validation_steps=val_steps)

best_hps=tuner.get_best_hyperparameters(num_trials=1)[0]
print('best hps {}'.format(best_hps.values))

Expected behavior: I was expecting this code to run 4 trials; instead I get this:

Search: Running Trial 1

default configuration
Epoch 1/3
25/25 [==============================] - 5s 203ms/step - loss: 0.6667 - accuracy: 0.6363 - val_loss: 0.6390 - val_accuracy: 0.7042
Epoch 2/3
25/25 [==============================] - 4s 164ms/step - loss: 0.5121 - accuracy: 0.7638 - val_loss: 0.5902 - val_accuracy: 0.7637
Epoch 3/3
25/25 [==============================] - 4s 159ms/step - loss: 0.4129 - accuracy: 0.8500 - val_loss: 0.5417 - val_accuracy: 0.8005
Trial 1 Complete [00h 00m 57s]
val_accuracy: 0.8005022406578064

Best val_accuracy So Far: 0.8005022406578064
best hps {}

Additional context: The search just ends after one trial.

Would you like to help us fix it? Could anyone suggest the correct way to do this? Where am I making the mistake?

ShaheenPerveen avatar Mar 04 '22 06:03 ShaheenPerveen

The way you are declaring the hyperparameters is incorrect. Instead, you should use the hp object that the tuner passes into HypModel.build.

In BaseHypModel instead have

def build(self, hp):
    hidden_units = hp.Choice('units', [10, 5])
    ...

(i.e. do not call HyperParameters() directly here)

and in HypModel have

def build(self, hp):
    model = super(HypModel, self).build(hp)
    ...

Note that for the example you've posted here, the search will still only run for 2 trials, because the search space will be exhausted after two trials (there are only 2 possible hyperparameter combinations to try).
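The essential point is that the hp object the tuner hands to build must flow through to wherever the hyperparameters are declared; a fresh HyperParameters() created inside the base class is invisible to the tuner, so it only sees an empty search space and stops after the default trial. Here is a minimal sketch of the pattern using a plain stand-in class (FakeHP is hypothetical, not the real keras_tuner.HyperParameters API) so it runs without TensorFlow installed:

```python
# Stand-in for keras_tuner.HyperParameters: it records every declared
# hyperparameter, which is how the tuner learns the search space.
class FakeHP:
    def __init__(self):
        self.space = {}

    def Choice(self, name, values):
        self.space[name] = values
        return values[0]  # default value, as used on the first trial


class BaseHypModel:
    def build(self, hp):
        # Declare hyperparameters on the hp object passed in by the
        # caller -- NOT on a freshly created HyperParameters() object.
        hidden_units = hp.Choice('units', [10, 5])
        return {'units': hidden_units}


class HypModel(BaseHypModel):
    def build(self, hp):
        # Forward the same hp object to the parent build.
        model = super().build(hp)
        model['compiled'] = True
        return model


hp = FakeHP()
model = HypModel().build(hp)
print(hp.space)  # {'units': [10, 5]} -- the search space is now visible
print(model)     # {'units': 10, 'compiled': True}
```

Because hp travels through the whole call chain, every hyperparameter declared in the base class is registered on the object the tuner owns, and the tuner can enumerate and try all combinations.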

brydon avatar Mar 19 '22 03:03 brydon