
Add automated hyperparameter selection

Open viswavi opened this issue 1 year ago • 0 comments

Our current Prompt2Model pipeline uses a fixed set of hyperparameters for all tasks (shown here).

To robustly handle different tasks, we want to implement automated hyperparameter selection by computing metrics on the validation splits of the retrieved and generated datasets for various configurations of a given model. Once this component is in place, our architecture will look like this (unimplemented components are in blue): [Figure: Prompt-to-deployment architecture]

There are two primary design decisions required in implementing this component:

  1. What is the space of hyperparameters to choose from? We could define a default search space (e.g. learning rate between 1e-6 and 1e-3; optimizer one of AdamW, Adam, or SGD with momentum). If we wanted to be more exploratory, we could even ask an LLM to suggest a search space for the given task. We could also treat the choice of base model to finetune as a hyperparameter (e.g. try the top-5 model architectures returned by the model retriever and keep the one with the best validation metrics). A sketch of a default search space appears after this list.
  2. How do we select the best hyperparameters on the validation data? To avoid reimplementing this ourselves, we could use a library like Hyperopt. As a simple, hand-rolled alternative, we could just do random search: sample random configurations from the given space and keep the configuration with the best validation metrics. Both options are sketched below.
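
As a rough illustration of both points, here is a minimal sketch of a default search space plus a hand-rolled random search over it. The `train_and_evaluate` helper is hypothetical (a stand-in for Prompt2Model's trainer and evaluator, not an existing API), and the ranges, optimizer names, and model names are placeholders rather than decided values.

```python
import math
import random

# Hypothetical helper standing in for Prompt2Model's trainer + evaluator:
# finetune the chosen base model with `config` and return a validation
# metric where higher is better. Not an existing prompt2model API.
def train_and_evaluate(config: dict) -> float:
    raise NotImplementedError

# Illustrative default search space; values are placeholders only.
SEARCH_SPACE = {
    "learning_rate": (1e-6, 1e-3),           # sampled log-uniformly
    "optimizer": ["adamw", "adam", "sgd"],    # SGD with momentum
    "base_model": ["model-a", "model-b"],     # e.g. top-5 retrieved models
}

def sample_config(space: dict) -> dict:
    """Draw one random configuration from the search space."""
    low, high = space["learning_rate"]
    lr = 10 ** random.uniform(math.log10(low), math.log10(high))
    return {
        "learning_rate": lr,
        "optimizer": random.choice(space["optimizer"]),
        "base_model": random.choice(space["base_model"]),
    }

def random_search(space: dict, num_trials: int = 10) -> dict:
    """Evaluate random configurations and keep the best one."""
    best_config, best_score = None, float("-inf")
    for _ in range(num_trials):
        config = sample_config(space)
        score = train_and_evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config
```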
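
If we instead lean on a library, the same search could look roughly like this with Hyperopt. The objective again wraps the hypothetical `train_and_evaluate` helper from the sketch above; Hyperopt minimizes its objective, so we negate the validation metric.

```python
import math

from hyperopt import STATUS_OK, Trials, fmin, hp, tpe

# Same illustrative space as above, expressed with Hyperopt primitives.
space = {
    "learning_rate": hp.loguniform("learning_rate", math.log(1e-6), math.log(1e-3)),
    "optimizer": hp.choice("optimizer", ["adamw", "adam", "sgd"]),
    "base_model": hp.choice("base_model", ["model-a", "model-b"]),
}

def objective(config: dict) -> dict:
    # Hyperopt minimizes the loss, so negate the validation metric.
    score = train_and_evaluate(config)  # hypothetical helper from above
    return {"loss": -score, "status": STATUS_OK}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=20, trials=trials)
```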

viswavi · Aug 30 '23 16:08