Add support for model hyperparameter tuning
The model wrappers have a method that is intended for tuning the model hyperparameters and returning the best model. The method has the following signature:
```python
find_best(
    self,
    X: ArrayLike,
    y: ArrayLike,
    parameters: Union[Dict, List[Dict]],
    metric: Union[str, Callable, Sequence, Dict] = None,
    method: Literal["grid", "random"] = "grid",
    **kwargs,
)
```
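Although the internals are not shown here, the `method` argument presumably selects between scikit-learn's grid and randomized search. A minimal sketch of that dispatch, assuming only standard scikit-learn APIs (this is not the actual cyclops implementation, and `_make_searcher` is a hypothetical helper name):

```python
from typing import Dict, List, Union

from sklearn.model_selection import GridSearchCV, RandomizedSearchCV


def _make_searcher(estimator, parameters: Union[Dict, List[Dict]], method: str = "grid", **kwargs):
    """Illustrative helper (not part of cyclops): map `method` to a scikit-learn search object."""
    if method == "grid":
        return GridSearchCV(estimator, parameters, **kwargs)
    if method == "random":
        return RandomizedSearchCV(estimator, parameters, **kwargs)
    raise ValueError(f"Unsupported search method: {method}")
```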
Currently, only the scikit-learn model wrapper `SKModel` implements this method, and that implementation would benefit from the following improvements:
- Support using metrics from `cyclops.evaluate.metrics` in the hyperparameter search, potentially using the `sklearn.metrics.make_scorer` function (see the sketch after this list, which also touches on the `group`/`fit_params` point).
- Handle data splits, e.g. a predefined split, a split by percentage, a cross-validation split, etc.
- Support passing `group` and `fit_params` arguments when calling `clf.fit`.
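A rough sketch of what these improvements could look like on the scikit-learn side. Here `recall_like_metric` is a stand-in for a metric from `cyclops.evaluate.metrics` (the exact metric interface is not shown in this issue), and the data, estimator, and parameter grid are made up for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import make_scorer
from sklearn.model_selection import GridSearchCV, PredefinedSplit

# Stand-in for a cyclops metric: any callable taking (y_true, y_pred)
# can be turned into a scorer with make_scorer.
def recall_like_metric(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.sum((y_true == 1) & (y_pred == 1)) / max(np.sum(y_true == 1), 1)

scorer = make_scorer(recall_like_metric)

# Synthetic data; -1 in test_fold marks rows used only for training,
# 0 marks rows in the single predefined validation fold.
X = np.random.rand(100, 5)
y = np.random.randint(0, 2, size=100)
cv = PredefinedSplit(test_fold=np.r_[np.full(80, -1), np.zeros(20)])

search = GridSearchCV(
    LogisticRegression(max_iter=500),
    {"C": [0.1, 1.0, 10.0]},
    scoring=scorer,
    cv=cv,
)

# groups and fit parameters (here sample_weight) are forwarded to fit; groups
# only affect group-aware splitters such as GroupKFold, but are accepted either
# way. A find_best implementation could accept and pass these through similarly.
groups = np.random.randint(0, 5, size=len(y))
search.fit(X, y, groups=groups, sample_weight=np.ones(len(y)))
best_model = search.best_estimator_
```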
The PyTorch model wrapper (`PTModel`) should implement this method as well, with the same behaviour as the sklearn model wrapper.
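Nothing in this issue specifies how the PyTorch side should be structured, but a random search can be as simple as sampling configurations, training a model for each, and keeping the best one. A self-contained sketch with a made-up model-building step standing in for whatever `PTModel` actually wraps (all names and hyperparameters below are illustrative):

```python
import random

import torch
from torch import nn


def find_best_random(X, y, parameters, n_iter=5, epochs=50):
    """Illustrative random search over hyperparameters for a small PyTorch model."""
    X = torch.as_tensor(X, dtype=torch.float32)
    y = torch.as_tensor(y, dtype=torch.float32)
    best_loss, best_model = float("inf"), None
    for _ in range(n_iter):
        # Sample one candidate configuration from the search space.
        config = {name: random.choice(values) for name, values in parameters.items()}
        model = nn.Sequential(
            nn.Linear(X.shape[1], config["hidden_size"]),
            nn.ReLU(),
            nn.Linear(config["hidden_size"], 1),
        )
        optimizer = torch.optim.Adam(model.parameters(), lr=config["lr"])
        loss_fn = nn.BCEWithLogitsLoss()
        for _ in range(epochs):
            optimizer.zero_grad()
            loss = loss_fn(model(X).squeeze(-1), y)
            loss.backward()
            optimizer.step()
        # A real implementation would score a held-out split with the requested
        # metric instead of comparing final training losses.
        if loss.item() < best_loss:
            best_loss, best_model = loss.item(), model
    return best_model


# Example search space (hypothetical hyperparameter names):
# best = find_best_random(X_train, y_train, {"hidden_size": [16, 32], "lr": [1e-3, 1e-2]})
```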