keras-tuner
Conditional Hyperparameters Not Displayed in TensorBoard for First Occurrence
Describe the bug The KerasTuner TensorBoard "HPARAMS" display does not show conditional hyperparameters used for the first trial when they are implemented with a conditional scope. The KerasTuner output also does not show the values chosen for that trial.
To Reproduce
Use any conditional-scope hyperparameter in any hypermodel and look at the TensorBoard log. The attached .zip files verify that the conditional hyperparameters `adv_step_size` and `multiplier` were both set to 0.01 (the lower bounds of their `hp.Float` ranges) but were not recorded at the beginning of the trial, only at the end. This occurred with a random tuner during the initial phase of Bayesian tuning.
Expected behavior KerasTuner and TensorBoard display values for hyperparameters chosen in the first instance where a conditional scope introduces them. The attached picture shows the TensorBoard "HPARAMS" view with the buggy behavior.
Would you like to help us fix it? No, thank you.
Here is a screenshot showing the command-line output of KerasTuner. The buggy behavior is highlighted in red.
Hi, I am currently experiencing the same problem: the dependent hyperparameter is set to the value of its lowest bound (0) and is not registered by KerasTuner.
This is the code I was trying to implement:
# CNN
CNNFilters = hp.Int("CNNFilters", min_value=0, max_value=50, step=10)
CNNKernel = hp.Int("CNNKernel", min_value=0, max_value=self._init.window - 1, step=1)
if CNNFilters > 0 and CNNKernel > 0:
    pass  # do something

# SkipGRU: pass the parent value(s) to the conditional scope as a list
with hp.conditional_scope("CNNKernel", [CNNKernel]):
    skip = hp.Int("SkipGRU", min_value=0, max_value=self._init.window - CNNKernel - 1, step=5)