test-tube
Proposal: Independent local hyperparameter optimization module
Right now `HyperOptArgumentParser` contains much of the logic for doing a hyperparameter search on a local machine:
https://github.com/williamFalcon/test-tube/blob/master/test_tube/argparse_hopt.py#L259
Why this is not great:
- It is the opposite of the SLURM code path, where the `HyperOptArgumentParser` object is passed into `SlurmCluster`.
- Code duplication and entanglement.
- It is hard to test `HyperOptArgumentParser` independently of the mechanism of deployment.
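To make the entanglement concrete, here is a minimal stand-in (not test-tube's actual code; all names below are invented for illustration) for the current pattern, where the parser object itself owns the local execution loop:

```python
import itertools

class EntangledParser:
    """Hypothetical stand-in: hyperparameter registration and local
    trial execution live on the same object, mirroring the coupling
    the proposal wants to remove."""

    def __init__(self):
        self.param_grid = {}

    def opt_list(self, name, options):
        # Register a hyperparameter and its candidate values.
        self.param_grid[name] = options

    def optimize_parallel_cpu(self, train_fn, nb_trials):
        # Execution logic on the parser: to unit-test argument
        # handling, you must also carry the deployment mechanism.
        keys = list(self.param_grid)
        combos = itertools.product(*self.param_grid.values())
        trials = itertools.islice(combos, nb_trials)
        return [train_fn(dict(zip(keys, c))) for c in trials]
```

Moving the execution loop into a separate object would let the parser stay a pure argument container.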
Proposed change:
Add a `Local` or `LocalSystem` object that, similar to `SlurmCluster`, accepts a `HyperOptArgumentParser` and can be used to optimize hyperparameters locally:
```python
hyperparams = parser.parse_args()

# Enable local training.
system = LocalSystem(
    hyperparam_optimizer=hyperparams,
    log_path=hyperparams.log_path,
    python_cmd='python3',
    test_tube_exp_name=hyperparams.test_tube_exp_name
)
system.max_cpus = 100
system.max_gpus = 5

# Run the hyperparameter combinations on the local CPUs.
system.optimize_parallel_cpu(
    # Function to execute:
    train,
    # Number of hyperparameter combinations to search:
    nb_trials=24
)
```
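Internally, the CPU path of such an object could be a thin fan-out over a local worker pool. The sketch below is an assumption-laden illustration, not existing test-tube code: the `generate_trials` method and the exact constructor signature are invented for the example.

```python
import multiprocessing

class LocalSystem:
    """Hypothetical sketch of the proposed object. Assumes the
    passed-in optimizer exposes generate_trials(nb_trials), an
    invented method name, yielding one hyperparameter dict per
    trial."""

    def __init__(self, hyperparam_optimizer, log_path=None,
                 python_cmd='python3', test_tube_exp_name=None):
        self.hyperparam_optimizer = hyperparam_optimizer
        self.log_path = log_path
        self.python_cmd = python_cmd
        self.test_tube_exp_name = test_tube_exp_name
        self.max_cpus = multiprocessing.cpu_count()
        self.max_gpus = 0

    def optimize_parallel_cpu(self, train_function, nb_trials,
                              nb_workers=None):
        trials = list(self.hyperparam_optimizer.generate_trials(nb_trials))
        nb_workers = nb_workers or max(1, min(self.max_cpus, len(trials)))
        if nb_workers == 1:
            # Degenerate case: run trials sequentially in-process.
            return [train_function(t) for t in trials]
        # Fan the trials out over a local process pool.
        with multiprocessing.Pool(nb_workers) as pool:
            return pool.map(train_function, trials)
```

With this split, the parser only describes the search space and the `LocalSystem` owns deployment, matching the `SlurmCluster` pattern.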
Downsides:
- Probably breaks backward compatibility.
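One way to soften the break would be to keep the old parser entry point as a deprecated wrapper that delegates to the new object. A sketch, with stand-in classes invented purely for self-containment (neither is real test-tube code):

```python
import warnings

class LocalSystem:
    """Minimal stub of the proposed object, just enough to
    demonstrate delegation."""
    def __init__(self, hyperparam_optimizer):
        self.hyperparam_optimizer = hyperparam_optimizer

    def optimize_parallel_cpu(self, train_function, nb_trials):
        trials = self.hyperparam_optimizer.generate_trials(nb_trials)
        return [train_function(t) for t in trials]

class HyperOptArgumentParser:
    """Stand-in for the existing parser, keeping the old method
    alive as a shim."""
    def generate_trials(self, nb_trials):
        # Invented trial source for the example.
        return [{'trial': i} for i in range(nb_trials)]

    def optimize_parallel_cpu(self, train_function, nb_trials):
        # Old API preserved: warn, then delegate to the new object.
        warnings.warn(
            'optimize_parallel_cpu on the parser is deprecated; '
            'use LocalSystem instead.', DeprecationWarning)
        return LocalSystem(self).optimize_parallel_cpu(
            train_function, nb_trials)
```

Existing user code would keep running, with a `DeprecationWarning` pointing at the new object.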