
[GENERAL SUPPORT]: SEBO with parameter constraints

souravdey94 opened this issue 1 year ago

Question

I am trying to predict chemical reaction rates in different solvent combinations. I want to use SEBO because the parameter space can contain up to 30 solvents, and in most cases only 3 to 4 of them are important. Since it is a composition problem, I need to use parameter constraints, but SEBO with parameter constraints is not implemented in Ax. Can you suggest a workaround?

I have added a code snippet of the generation strategy and experiment section.

Please provide any relevant code snippet if applicable.

import torch
from botorch.models.fully_bayesian import SaasFullyBayesianSingleTaskGP

length = len(solvent_names_minus1)
print('length', length)

torch.manual_seed(12345)  # To always get the same Sobol points
tkwargs = {
    "dtype": torch.double,
    "device": torch.device("cuda" if torch.cuda.is_available() else "cpu"),
}

# Target point for the SEBO sparsity penalty: the all-zero composition.
target_point = torch.zeros(length, **tkwargs)
print('target_point', target_point)

SURROGATE_CLASS = SaasFullyBayesianSingleTaskGP


from ax.service.ax_client import ObjectiveProperties

ax_client.create_experiment(
    name="solventproject",
    parameters=[
        {
            "name": solvent_names_minus1[i],
            "type": "range",
            "bounds": [float(range_min_minus1[i]), float(range_max_minus1[i])],
            "value_type": "float",  # Optional, defaults to inference from type of "bounds".
            "log_scale": False,  # Optional, defaults to False.
        }
        for i in range(len(solvent_names_minus1))
    ],
    objectives={"blend_score": ObjectiveProperties(minimize=False)},
    parameter_constraints=[sum_str],  # Optional.
    outcome_constraints=["lnorm <= 0.00"],  # Optional.
)
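
For context, `sum_str` is not defined in the snippet. Ax accepts linear parameter constraints as inequality strings over parameter names, so a hypothetical construction for a composition whose fractions must sum to at most 1 could look like this (the bound of 1.0 is an assumption):

# Hypothetical: build "solventA + solventB + ... <= 1.0" from the parameter names.
sum_str = " + ".join(solvent_names_minus1) + " <= 1.0"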


from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.models.torch.botorch_modular.sebo import SEBOAcquisition
from ax.models.torch.botorch_modular.surrogate import Surrogate
from botorch.acquisition.multi_objective.monte_carlo import (
    qNoisyExpectedHypervolumeImprovement,
)

gs = GenerationStrategy(
    name="SEBO_L0",
    steps=[
        GenerationStep(  # BayesOpt step
            model=Models.BOTORCH_MODULAR,
            # No limit on how many generator runs will be produced
            num_trials=-1,
            model_kwargs={  # Kwargs to pass to `BoTorchModel.__init__`
                "surrogate": Surrogate(botorch_model_class=SURROGATE_CLASS),
                "acquisition_class": SEBOAcquisition,
                "botorch_acqf_class": qNoisyExpectedHypervolumeImprovement,
                "acquisition_options": {
                    "penalty": "L0_norm",  # can be "L0_norm" or "L1_norm"
                    "target_point": target_point,
                    "sparsity_threshold": length,
                },
            },
        )
    ],
)
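
A strategy like this is typically passed to the AxClient before create_experiment is called; a minimal sketch:

from ax.service.ax_client import AxClient

ax_client = AxClient(generation_strategy=gs)  # then call create_experiment as above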

Code of Conduct

  • [X] I agree to follow this project's Code of Conduct

souravdey94 avatar Sep 26 '24 17:09 souravdey94

in most cases there are only 3 to 4 important solvents

Is it bad if the suggested arms include more than 3-4, or is this just prior knowledge you want to include? Note: using a SAAS model already encodes the prior that only a few parameters are relevant, so unless you specifically want to avoid generating arms that change many parameters, sparse BO is probably not needed.
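
For illustration, if sparse suggestions are not strictly required, a plain SAASBO strategy supports parameter constraints out of the box. A minimal sketch, reusing the names from the question's snippet:

from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models

# The SAAS prior shrinks the lengthscales of irrelevant parameters, so only a
# few parameters end up mattering, without an explicit sparsity penalty.
gs_saas = GenerationStrategy(
    name="SAASBO",
    steps=[GenerationStep(model=Models.SAASBO, num_trials=-1)],
)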

Regarding using sparse BO: it looks like optimizing the L0 objective using homotopy does not support parameter constraints. There isn't a fundamental reason why one couldn't, though. Some options would be:

  1. Use L1_norm instead of L0_norm (see the sketch after this list). This may not lead to the most sparse results, but can be used out of the box. (https://github.com/facebook/Ax/blob/a144287172e74bb5b10376548d0482d5a0ff3507/ax/models/torch/botorch_modular/sebo.py#L241-L265)
  2. Implement support for parameter constraints in optimize_with_homotopy (https://github.com/facebook/Ax/blob/a144287172e74bb5b10376548d0482d5a0ff3507/ax/models/torch/botorch_modular/sebo.py#L277).
  3. Allow setting a fixed parameter value in the differentiable relaxation of the L0 norm, and optimize without homotopy. This would require adding another argument that allows one to differentiate which norm to use from how to optimize it, since currently these are coupled.
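
A minimal sketch of option 1, reusing the imports and variables from the question's generation strategy and changing only the penalty:

gs_l1 = GenerationStrategy(
    name="SEBO_L1",
    steps=[
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,
            model_kwargs={
                "surrogate": Surrogate(botorch_model_class=SURROGATE_CLASS),
                "acquisition_class": SEBOAcquisition,
                "botorch_acqf_class": qNoisyExpectedHypervolumeImprovement,
                "acquisition_options": {
                    # L1 is differentiable, so no homotopy is needed and the
                    # standard optimizer can handle the parameter constraints.
                    "penalty": "L1_norm",
                    "target_point": target_point,
                    "sparsity_threshold": length,
                },
            },
        )
    ],
)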

sdaulton avatar Sep 26 '24 21:09 sdaulton

https://github.com/pytorch/botorch/pull/2588/files extends _optimize_with_homotopy to include the constraints typically available in optimize_acqf, which I believe addresses suggestion 2 here.

CompRhys avatar Oct 22 '24 15:10 CompRhys

https://github.com/pytorch/botorch/pull/2588/files extends _optimize_with_homotopy to include the constraints typically available in optimize_acqf, which I believe addresses suggestion 2 here.

Has it been implemented in BoTorch? I am currently using outcome constraints as a workaround to enforce the composition constraints.
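
For readers following along, the question's snippet hints at this workaround via outcome_constraints=["lnorm <= 0.00"]. A hypothetical evaluation function that reports the composition violation as the "lnorm" outcome (run_reaction is a placeholder for the actual experiment):

def evaluate(parameters):
    # `parameters` is the dict returned by ax_client.get_next_trial().
    fractions = [parameters[name] for name in solvent_names_minus1]
    lnorm = sum(fractions) - 1.0  # <= 0 iff the fractions sum to at most 1
    return {
        "blend_score": run_reaction(fractions),  # hypothetical experiment call
        "lnorm": lnorm,
    }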

souravdey94 avatar Oct 22 '24 18:10 souravdey94

@souravdey94 This is now merged in #2938! Hope you can also make use of it.

CompRhys avatar Dec 14 '24 16:12 CompRhys

@CompRhys Thanks for the implementation. Is it already available in the latest Ax version?

souravdey94 avatar Dec 16 '24 19:12 souravdey94

No, you'll have to install from git:

pip install git+https://github.com/facebook/Ax.git
pip install git+https://github.com/pytorch/botorch.git

You need the dev versions of both Ax and BoTorch.
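
A quick check that the dev builds are active (both packages expose __version__):

python -c "import ax, botorch; print(ax.__version__, botorch.__version__)"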

CompRhys avatar Dec 16 '24 23:12 CompRhys

Thank you. This works now.

souravdey94 avatar Mar 17 '25 18:03 souravdey94