Hyperactive
An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
It would be nice to have an estimator-level integration with detectors in the `sktime` package. https://github.com/sktime/sktime This would require: * a `BaseExperiment` descendant class `SktimeDetectorExperiment` which carries out a...
It would be nice to have an estimator-level integration with detectors in the `skforecast` package. https://github.com/skforecast/skforecast This would require: * a `BaseExperiment` descendant class `SkforecastExperiment` which carries out a...
Fix for issue #123. Allows GFO to accept numpy-coercible values and sklearn parameter grids (via `sklearn.model_selection.ParameterGrid`) as search spaces.
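A minimal sketch of how a `ParameterGrid` might be coerced into the dict-of-lists form that GFO-style search spaces use. The helper name `grid_to_search_space` is hypothetical and purely illustrative, not Hyperactive's API:

```python
from sklearn.model_selection import ParameterGrid

def grid_to_search_space(grid):
    """Coerce a ParameterGrid (or plain dict) into a dict-of-lists
    search space. Hypothetical helper, for illustration only."""
    if isinstance(grid, ParameterGrid):
        # ParameterGrid keeps its source grid(s) in .param_grid (a list of dicts)
        merged = {}
        for sub_grid in grid.param_grid:
            for key, values in sub_grid.items():
                merged.setdefault(key, []).extend(values)
        # deduplicate while preserving order
        return {k: list(dict.fromkeys(v)) for k, v in merged.items()}
    return {k: list(v) for k, v in grid.items()}

space = grid_to_search_space(
    ParameterGrid({"alpha": [0.1, 1.0], "fit_intercept": [True, False]})
)
```

The same helper passes plain dicts through unchanged, so callers could supply either form.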
This PR addresses #158 by adding callback hooks to `BaseExperiment.evaluate` and `BaseOptimizer.solve`, configurable via `set_config`. Callback hooks: - callbacks_pre / callbacks_post for experiment evaluations - callbacks_pre_solve...
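A sketch of the pre/post callback pattern this PR describes. The hook names mirror the PR (`callbacks_pre` / `callbacks_post`, configured via `set_config`), but the `Experiment` class below is illustrative, not Hyperactive's implementation:

```python
# Hypothetical sketch of pre/post evaluation callbacks, assuming a
# set_config-style configuration mechanism as described in the PR.
class Experiment:
    def __init__(self):
        self._config = {"callbacks_pre": [], "callbacks_post": []}

    def set_config(self, **kwargs):
        self._config.update(kwargs)
        return self

    def evaluate(self, params):
        for cb in self._config["callbacks_pre"]:
            cb(params)                     # fires before the evaluation
        score = sum(params.values())       # stand-in for a real evaluation
        for cb in self._config["callbacks_post"]:
            cb(params, score)              # fires after, with the result
        return score

log = []
exp = Experiment().set_config(
    callbacks_pre=[lambda p: log.append(("pre", p))],
    callbacks_post=[lambda p, s: log.append(("post", s))],
)
exp.evaluate({"x": 1, "y": 2})
```

Keeping the hooks as plain lists of callables lets users attach logging, checkpointing, or early-stopping logic without subclassing.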
The `TorchExperiment` assumes by default that all metrics should be minimized (lower is better); see [this line](https://github.com/SimonBlanke/Hyperactive/blob/main/src/hyperactive/experiment/integrations/torch_lightning_experiment.py#L114). However, many metrics should be maximized (accuracy, F1, AUC),...
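One common way to handle this is to normalize metric direction: keep a registry of higher-is-better metric names and negate their values so a minimize-only optimizer handles both directions uniformly. A minimal sketch (the registry and function names are illustrative, not Hyperactive's API):

```python
# Hypothetical sketch: metrics known to be higher-is-better get negated
# so that a single "lower is better" convention holds everywhere.
MAXIMIZE_METRICS = {"accuracy", "f1", "auc", "precision", "recall"}

def to_minimize(metric_name, value):
    """Return a value under a uniform lower-is-better convention."""
    if metric_name.lower() in MAXIMIZE_METRICS:
        return -value
    return value
```

The sign flip is lossless, so the original metric value can always be recovered for reporting.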
The metadata dictionary in `TorchExperiment._evaluate()` was always empty. This fix populates it with useful training information: - num_epochs_trained: the number of epochs the model was trained for - all_metrics: all...
The `TorchExperiment._evaluate()` method returns a metadata dictionary, but it's always empty `{}`. See [this line](https://github.com/SimonBlanke/Hyperactive/blob/main/src/hyperactive/experiment/integrations/torch_lightning_experiment.py#L180) (the empty dict is never updated). This is inconsistent with other Hyperactive integrations (sklearn, sktime)...
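A sketch of the fix described above: rather than returning a never-updated `{}`, the evaluation can collect training facts and surface them as metadata. The `build_metadata` helper and the `trainer_state` dict are illustrative stand-ins for the Lightning trainer, not Hyperactive's actual code; the metadata keys (`num_epochs_trained`, `all_metrics`) come from the fix description:

```python
# Hypothetical sketch (names illustrative): populate the metadata dict
# returned alongside the score instead of leaving it empty.
def build_metadata(trainer_state):
    return {
        "num_epochs_trained": trainer_state["current_epoch"],
        "all_metrics": dict(trainer_state["logged_metrics"]),
    }

score = 0.42
metadata = build_metadata(
    {"current_epoch": 10, "logged_metrics": {"val_loss": 0.42}}
)
```

Returning `(score, metadata)` in this shape would match what the sklearn and sktime integrations already do.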
I think there is some content missing in the README feature table: * middle column: the AI/ML integrations are missing the `torch` experiment and optimization * I think the third/right column...
Resolves #136. This PR adds automatic categorical feature handling for GFO-based optimizers by internally encoding non-numeric search-space dimensions as consecutive integers, then decoding them back when evaluating the...
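The encode/decode round-trip described above can be sketched in a few lines. The function name `encode_dimension` is illustrative, not GFO's internal API:

```python
# Minimal sketch: map a non-numeric dimension to consecutive integers
# 0..n-1 for the optimizer, and keep a decoder back to the originals
# for use at evaluation time. (Illustrative, not GFO's internals.)
def encode_dimension(values):
    decoder = {i: v for i, v in enumerate(values)}
    return list(range(len(values))), decoder

int_space, decoder = encode_dimension(["adam", "sgd", "rmsprop"])
# the optimizer searches over int_space; each candidate integer is
# decoded back to its categorical value before the objective is called
chosen = decoder[int_space[1]]
```

Because the mapping is a bijection, the encoding is transparent to the objective function: it always sees the original categorical values.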
The v4 `hyperactive` wrappers of GFO encode categorical features as consecutive integers; this kind of encoding is a desirable feature, potentially as a default....