
[ENH] Hyperband algorithm

SimonBlanke opened this issue 7 months ago

Hyperband is a hyperparameter optimization algorithm that improves efficiency by dynamically allocating computational resources. It is based on the Successive Halving Algorithm (SHA) but introduces an adaptive mechanism to balance exploration (trying many configurations) and exploitation (allocating resources to promising candidates).
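
For reference, here is a minimal sketch of the bracket schedule from the Hyperband paper (Li et al., 2018); `R` and `eta` follow the paper's notation, and the function only computes the per-bracket (number of configs, resource) pairs, not the evaluations themselves:

```python
import math

def hyperband_schedule(R=81, eta=3):
    """Yield (bracket index, [(n_configs, resource), ...]) per bracket.

    Each bracket runs successive halving: start n configs at resource r,
    keep the best 1/eta after each round, and grow the resource by eta.
    """
    s_max = int(math.log(R) / math.log(eta) + 1e-9)  # floor(log_eta(R)), float-safe
    for s in range(s_max, -1, -1):
        n = math.ceil((s_max + 1) * eta**s / (s + 1))  # initial number of configs
        r = R / eta**s                                  # initial resource per config
        yield s, [(int(n / eta**i), int(r * eta**i)) for i in range(s + 1)]

for s, rounds in hyperband_schedule():
    print(f"bracket s={s}: {rounds}")
```

For `R=81` and `eta=3` this reproduces the five brackets from the paper, from the most exploratory one (81 configs at resource 1) down to the most exploitative one (5 configs at the full resource 81).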

SimonBlanke avatar May 16 '25 06:05 SimonBlanke

I would advise thinking about what type of algorithm this is - it is not just a simple optimization algorithm, since it is applied to an incremental learner, i.e., the objective changes over time.

I also cannot immediately place it - it might be some variant of active or reinforcement learning.

fkiraly avatar May 17 '25 07:05 fkiraly

@SimonBlanke Sir, I applied for this project under ESoC but haven’t received any reply yet. I wanted to ask whether it has already been allotted.

aryan0931 avatar May 18 '25 10:05 aryan0931

Hello @aryan0931, yes, I have assigned this issue. There are multiple contributors interested in the same projects, and I am now working on providing fitting projects to everyone. But this is a discussion that should take place on our Discord server.

SimonBlanke avatar May 21 '25 04:05 SimonBlanke

Hi @SimonBlanke @fkiraly, I'd like to implement Hyperband as a new optimizer in Hyperactive with the following approach:

Extending BaseOptimizer directly (rather than the GFO adapter) for better control over multi-fidelity evaluation:

```python
# src/hyperactive/opt/_hyperband.py
class Hyperband(BaseOptimizer):
    def __init__(self, max_resource=81, eta=3, **kwargs):
        self.max_resource = max_resource  # R: maximum resource budget
        self.eta = eta                    # elimination factor
        super().__init__(**kwargs)

    def _solve(self, experiment, **search_config):
        # Run multiple successive halving brackets with different n/r trade-offs.
        # Each bracket: sample configs → evaluate with increasing resources
        # → eliminate the worst performers.
        return best_config_across_all_brackets  # placeholder: best params over all brackets
```

For multi-fidelity support, we can use resource parameter injection:

```python
# Inject the resource as a parameter (e.g., n_epochs, max_iter)
params_with_resource = params.copy()
params_with_resource['n_epochs'] = int(resource_budget)
score = experiment.score(params_with_resource)
```
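
Building on that snippet, a single successive-halving round might look like the sketch below; `successive_halving_round` is a hypothetical helper, `'n_epochs'` stands in for whatever resource key the experiment exposes, and `experiment.score` is assumed to return a scalar, as in the snippet above:

```python
def successive_halving_round(experiment, configs, resource_budget, eta=3):
    """Score each config at the given budget and keep the top 1/eta."""
    scored = []
    for params in configs:
        params_with_resource = params.copy()
        params_with_resource['n_epochs'] = int(resource_budget)  # assumed resource key
        scored.append((experiment.score(params_with_resource), params))
    # Sort on the score alone, so tied scores never trigger a dict comparison
    scored.sort(key=lambda pair: pair[0], reverse=True)  # higher score = better
    n_keep = max(1, len(scored) // eta)
    return [params for _, params in scored[:n_keep]]
```

Hyperband's `_solve` would then call such a round repeatedly within each bracket, multiplying the budget by eta each time.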

This approach would maintain Hyperactive's experiment-based architecture while adding efficient resource allocation for expensive hyperparameter optimization. If this issue is open to be taken up, I would like to work on it. Thanks!

rohansen856 avatar Nov 18 '25 10:11 rohansen856

The implementation of the Hyperband algorithm is a very challenging task. It will require a conclusive interface design, which has to be discussed with @fkiraly and me beforehand. So the next step would be to establish a task group that finds an API design fitting into Hyperactive's unified interface. I will take care of this within the next few weeks.

SimonBlanke avatar Nov 27 '25 07:11 SimonBlanke

Maybe a silly question: could you kindly remind me, @SimonBlanke, why this does not fit the BaseOptimizer design?

Is it an active learning algorithm?

It is maybe worth noting that any active learning algorithm can be run as a pure optimization algorithm if it accepts a fully discrete domain, since the latter can be chosen to be the training dataset.

fkiraly avatar Nov 28 '25 00:11 fkiraly