KerasTuner "Getting started": missing hp.conditional_scope()?
"Getting started with KerasTuner" prominently mentions conditional hyperparameters:
https://github.com/keras-team/keras-io/blob/0be9a38b96e8dd85bc525a74babb614f38b179c7/guides/keras_tuner/getting_started.py#L153
But are these `"units_0"`, `"units_1"`, `"units_2"` really conditional? No `parent_name` is set, directly or via `hp.conditional_scope()`, so what tells the tuner that `f"units_{i}"` should not be explored when `i >= hp.get("num_layers")`?
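For concreteness, this is roughly what I expected the guide to show for the `units_{i}` case, assuming `hp.conditional_scope(parent_name, parent_values)` behaves as described in the `HyperParameters` API reference (the layer sizes and model topology here are just placeholders):

```python
import keras
import keras_tuner
from keras import layers


def build_model(hp):
    model = keras.Sequential()
    model.add(layers.Flatten())
    num_layers = hp.Int("num_layers", min_value=1, max_value=3)
    for i in range(3):
        # Declare units_{i} inside a conditional scope whose parent is
        # num_layers, so it is only active (and explored) when layer i exists,
        # i.e. when num_layers is in {i + 1, ..., 3}.
        with hp.conditional_scope("num_layers", list(range(i + 1, 4))):
            if i < num_layers:
                model.add(
                    layers.Dense(
                        units=hp.Int(f"units_{i}", min_value=32, max_value=512, step=32),
                        activation="relu",
                    )
                )
    model.add(layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model


# Usage is unchanged from the guide; only the conditional scoping differs.
tuner = keras_tuner.RandomSearch(build_model, objective="val_loss", max_trials=3)
```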
Moreover, the next layer (`Dropout`) is also conditioned on a hyperparameter, and its `rate` argument begs to be tuned, yet the code does not show how to condition the dropout rate on the dropout boolean.
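For the `Dropout` case, I expected something along these lines, assuming the `parent_name`/`parent_values` arguments of `hp.Float` work as documented (the rate bounds and the surrounding model are arbitrary):

```python
import keras
from keras import layers


def build_model(hp):
    model = keras.Sequential([layers.Flatten(), layers.Dense(64, activation="relu")])
    if hp.Boolean("dropout"):
        model.add(
            layers.Dropout(
                # Register dropout_rate as a child of the dropout boolean, so
                # the tuner only explores the rate when dropout is enabled.
                rate=hp.Float(
                    "dropout_rate",
                    min_value=0.1,
                    max_value=0.5,
                    step=0.1,
                    parent_name="dropout",
                    parent_values=[True],
                )
            )
        )
    model.add(layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model
```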
In summary, I can read this 800+ line guide and come away feeling confident about conditional hyperparameters, yet never actually learn how to set the parent.
@haifeng-jin