
[BUG]: BaseGA parameters don't contain all available parameters

Open · muraci opened this issue 1 year ago

Description of the bug

When using BaseGA from the Genetic Algorithm (GA) module for hyperparameter tuning, the set_parameters call in GA.py does not declare all of the optimizer's available parameters. As a result, extended parameters such as selection, crossover, and mutation cannot be included in a tuning grid.

Including these parameters aligns with functionality provided by similar libraries and improves the usability of the BaseGA optimizer for advanced hyperparameter tuning tasks.

Steps To Reproduce

from opfunu.cec_based.cec2017 import F52017
from mealpy import FloatVar, GA, Tuner  

f1 = F52017(30, f_bias=0)

p1 = {
    "bounds": FloatVar(lb=f1.lb, ub=f1.ub),
    "obj_func": f1.evaluate,
    "minmax": "min",
    "name": "F5",
    "log_to": "console",
}

paras_ga_grid = {
    "epoch": [50],
    "pop_size": [50],
    "pc": [0.85, 0.90],
    "pm": [0.01, 0.02],
    "selection": ["tournament", "roulette", "random"],
    "crossover": ["one_point", "multi_points", "uniform", "arithmetic"],
    "mutation": ["flip", "swap"]
}

if __name__ == "__main__":
    model = GA.BaseGA()
    tuner = Tuner(model, paras_ga_grid)

    tuner.execute(problem=p1, termination=None, n_trials=5, verbose=True)

    print(tuner.best_row)

Additional Information

Observed Behavior

The following error occurs: ValueError: Invalid input parameters: {'crossover', 'pop_size', 'epoch', 'pc', 'pm', 'selection', 'mutation'} for BaseGA optimizer. Valid parameters are: {'pc', 'pm', 'epoch', 'pop_size'}.

In the GA.py file, line 82 currently declares the parameters as:

self.set_parameters(["epoch", "pop_size", "pc", "pm"])

This line could be updated to include the additional parameters:

self.set_parameters(["epoch", "pop_size", "pc", "pm", "selection", "crossover", "mutation", "k_way"])
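For illustration, the check that produces this error can be sketched in a few lines. This is a minimal stand-in, not mealpy's actual set_parameters/Tuner code: the tuner cross-checks the grid keys against the names the optimizer has declared and raises ValueError on any mismatch, so extending the declared list is what lets the extra grid keys pass validation.

```python
# Minimal stand-in for the parameter check (NOT mealpy's real implementation):
# an optimizer declares its tunable parameter names, and the tuner rejects
# any grid keys outside that set, mirroring the ValueError reported above.

class SketchOptimizer:
    def __init__(self, valid_params):
        self.valid_params = set(valid_params)

    def validate_grid(self, grid):
        # Any grid key not in the declared set is invalid.
        invalid = set(grid) - self.valid_params
        if invalid:
            raise ValueError(
                f"Invalid input parameters: {invalid}. "
                f"Valid parameters are: {self.valid_params}."
            )
        return True

grid = {
    "epoch": [50], "pop_size": [50], "pc": [0.85], "pm": [0.01],
    "selection": ["tournament"], "crossover": ["uniform"], "mutation": ["flip"],
}

# Current declaration (only four names) -> the grid above is rejected.
current = SketchOptimizer(["epoch", "pop_size", "pc", "pm"])
try:
    current.validate_grid(grid)
except ValueError as e:
    print("rejected:", e)

# Proposed extended declaration -> the same grid passes validation.
proposed = SketchOptimizer(
    ["epoch", "pop_size", "pc", "pm", "selection", "crossover", "mutation", "k_way"]
)
print("accepted:", proposed.validate_grid(grid))
```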

muraci avatar Dec 16 '24 14:12 muraci

Hi @muraci,

Sorry for the late reply. It will be fixed in the next updated version. At the first designation, I thought other parameters should not be the official parameters of GA. That is why I did not add them to the dict variable. I'm considering making GA more flexible by divided it into multiple components like I did with multi-objective GA version (https://github.com/thieu1995/MetaMoo/blob/main/examples/exam_nsga2.py). But i'm worrying that it will be hard to do the hyper-parameter tuning process. Or I can just add these parameters to variables set like you request, that will solve the problem. But it will not be flexible as above mthod. I'm really hesitating between these two options.

thieu1995 avatar Jan 05 '25 11:01 thieu1995

@muraci, I added another class, OriginalGA, in the latest version. You can use this class to tune all hyperparameters.

thieu1995 avatar Aug 16 '25 22:08 thieu1995