Hyperactive
An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
I would like to introduce a new feature to Hyperactive that **chains together** multiple optimization algorithms. This feature will be called an **Optimization Strategy**. The API for this...
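A minimal sketch of what chaining could mean in practice: run a coarse global phase first, then hand its best parameters to a local refinement phase as a warm start. All names and signatures below are hypothetical stand-ins, not the actual Hyperactive API.

```python
import random

random.seed(0)

def random_phase(objective, space, n_iter, start=None):
    """Coarse global phase: plain random search over the space."""
    best_params = start or {k: random.choice(v) for k, v in space.items()}
    best_score = objective(best_params)
    for _ in range(n_iter):
        params = {k: random.choice(v) for k, v in space.items()}
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params

def hill_climb_phase(objective, space, n_iter, start=None):
    """Local phase: refine the warm start from the previous phase."""
    keys = list(space)
    best_params = dict(start)
    best_score = objective(best_params)
    for _ in range(n_iter):
        candidate = dict(best_params)
        k = random.choice(keys)
        values = space[k]
        i = values.index(candidate[k]) + random.choice((-1, 1))
        candidate[k] = values[max(0, min(len(values) - 1, i))]
        score = objective(candidate)
        if score > best_score:
            best_params, best_score = candidate, score
    return best_params

def run_strategy(objective, space, phases):
    """Chain phases: each phase receives the previous best as warm start."""
    best = None
    for phase, n_iter in phases:
        best = phase(objective, space, n_iter, start=best)
    return best

space = {"x": list(range(-10, 11))}
objective = lambda p: -(p["x"] - 3) ** 2  # maximum at x = 3

best = run_strategy(objective, space, [(random_phase, 50), (hill_climb_phase, 50)])
```

The key design point is that each phase only needs to accept a `start` argument, so arbitrary optimizers can be composed into one strategy.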
## Explanation

It would be very useful if Hyperactive had the ability to **save** the optimization backend (via pickle, dill, cloudpickle, ...) to disk and **load** it later into Hyperactive...
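A sketch of how this could work with plain `pickle`, using a hypothetical `OptimizerState` class as a stand-in for the actual backend object (dill or cloudpickle would be needed if the state contains lambdas or closures that plain pickle cannot serialize):

```python
import os
import pickle
import tempfile

# Hypothetical stand-in for an optimizer backend's internal state;
# the real Hyperactive backend would be serialized the same way.
class OptimizerState:
    def __init__(self):
        self.best_params = None
        self.best_score = float("-inf")
        self.n_evals = 0

    def tell(self, params, score):
        """Record one evaluated parameter set."""
        self.n_evals += 1
        if score > self.best_score:
            self.best_params, self.best_score = params, score

state = OptimizerState()
state.tell({"x": 1}, -4.0)
state.tell({"x": 3}, 0.0)

# save the backend to disk ...
path = os.path.join(tempfile.mkdtemp(), "optimizer.pkl")
with open(path, "wb") as f:
    pickle.dump(state, f)

# ... and load it later to continue the run where it left off
with open(path, "rb") as f:
    restored = pickle.load(f)
```

After loading, `restored` carries the full search history, so a later run can warm-start from it instead of searching from scratch.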
An interesting example of gradient-free optimization is fitting one or more Gaussian functions to data. The data can be generated with NumPy for this example. A "real world" example of this...
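A sketch of such an example: generate noisy data from a known Gaussian with NumPy, then recover its parameters by maximizing a negative-MSE objective. The true parameter values and search ranges are assumptions for illustration, and plain random search stands in for a Hyperactive optimizer.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss(x, amp, mu, sigma):
    """Single Gaussian: amp * exp(-(x - mu)^2 / (2 sigma^2))."""
    return amp * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

# synthetic data: one Gaussian plus noise (true params chosen for the example)
x = np.linspace(-5, 5, 200)
true = dict(amp=2.0, mu=1.0, sigma=0.8)
y = gauss(x, **true) + rng.normal(0, 0.05, x.size)

# objective for a gradient-free optimizer: negative mean squared error
def objective(params):
    y_fit = gauss(x, params["amp"], params["mu"], params["sigma"])
    return -np.mean((y - y_fit) ** 2)

# plain random search as a stand-in for a Hyperactive optimizer
best_params, best_score = None, -np.inf
for _ in range(2000):
    params = {
        "amp": rng.uniform(0.5, 3.0),
        "mu": rng.uniform(-2.0, 2.0),
        "sigma": rng.uniform(0.3, 1.5),
    }
    score = objective(params)
    if score > best_score:
        best_params, best_score = params, score
```

Fitting multiple Gaussians works the same way: the objective sums several `gauss` terms and the search space gains three parameters per component.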
Since Hyperactive allows a lot of flexibility in creating the objective function, optimizing the hyperparameters, cost function, or structure of a Siamese network should be possible.
Since Hyperactive is able to perform neural architecture search, it would be interesting to have an example of the optimization of a residual neural network. It would then be possible...
The popular Python package [ray](https://github.com/ray-project/ray) has a multiprocessing feature that could be used to run optimization processes in parallel:

```python
from ray.util.multiprocessing import Pool

def f(index):
    return index

pool = Pool()
...
```
[As discussed in this PR](https://github.com/SimonBlanke/Hyperactive/pull/58): It would be interesting to see how the newest Python versions (v3.11) speed up different optimization tasks.
This adds a way to change the parameters of the optimization algorithms during runtime (e.g. `epsilon` from the hill-climbing optimizer). My idea is to enable this within the objective function....
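One way the idea could look: the objective function receives a handle to the running optimizer and adjusts its parameters mid-run. The `HillClimber` class and the `optimizer` argument below are hypothetical stand-ins, not Hyperactive's actual interface; the sketch shrinks `epsilon` (the step size) once the search gets close to the optimum.

```python
import random

random.seed(0)

# Minimal hill-climbing sketch; `epsilon` controls the step size and is
# the runtime-tunable parameter from the proposal.
class HillClimber:
    def __init__(self, epsilon):
        self.epsilon = epsilon

    def run(self, objective, x0, n_iter):
        x, score = x0, objective(x0, self)
        for _ in range(n_iter):
            candidate = x + random.uniform(-self.epsilon, self.epsilon)
            cand_score = objective(candidate, self)
            if cand_score > score:
                x, score = candidate, cand_score
        return x, score

def objective(x, optimizer):
    score = -(x - 3.0) ** 2
    # change the optimizer's parameter during runtime: once the search is
    # close to the optimum, switch to small refinement steps
    if score > -1.0:
        optimizer.epsilon = 0.1
    return score

opt = HillClimber(epsilon=1.0)
best_x, best_score = opt.run(objective, x0=0.0, n_iter=500)
```

Passing the optimizer into the objective keeps the schedule logic in user code, so each objective function can define its own adaptation rule.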