Fully Bayesian model error
When I run the templated script for any of the "Fully Bayesian" methods, I receive the following AttributeError:
AttributeError                            Traceback (most recent call last)
Cell In[1], line 32
     13 y = float(
     14     (x2 - 5.1 / (4 * np.pi**2) * x1**2 + 5.0 / np.pi * x1 - 6.0) ** 2
     15     + 10 * (1 - 1.0 / (8 * np.pi)) * np.cos(x1)
     16     + 10
     17 )
     19 return y
     22 gs = GenerationStrategy(
     23     steps=[
     24         GenerationStep(
     25             model=Models.SOBOL,
     26             num_trials=4,  # how many sobol trials to perform (rule of thumb: 2 * number of params)
     27             min_trials_observed=3,
     28             max_parallelism=5,
     29             model_kwargs={"seed": 999},
     30         ),
     31         GenerationStep(
---> 32             model=Models.FULLYBAYESIAN,
     33             num_trials=-1,
     34             max_parallelism=3,
     35             model_kwargs={"num_samples": 512, "warmup_steps": 512},
     36         ),
     37     ]
     38 )
     40 ax_client = AxClient(generation_strategy=gs)
     42 ax_client.create_experiment(
     43     parameters=[
     44         {"name": "x1", "type": "range", "bounds": [-5.0, 10.0]},
    (...)
     49     },
     50 )

File C:\Program Files\Python310\lib\enum.py:437, in EnumMeta.__getattr__(cls, name)
    435     return cls._member_map_[name]
    436 except KeyError:
--> 437     raise AttributeError(name) from None

AttributeError: FULLYBAYESIAN
I am running Python v3.10.5.
I'll check it out
Looks like some of these were removed: https://github.com/facebook/Ax/releases/tag/0.5.0
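For context, the traceback bottoms out in Python's enum machinery (the enum.py frame above): looking up an Enum member name that no longer exists raises AttributeError carrying the member name. A minimal stand-in showing the mechanism, with a hypothetical registry enum (no Ax required):

```python
from enum import Enum


# Hypothetical stand-in for Ax's Models registry; FULLYBAYESIAN was
# removed from the real registry in 0.5.0, so looking it up fails.
class Models(Enum):
    SOBOL = "Sobol"
    SAASBO = "SAASBO"


try:
    _ = Models.FULLYBAYESIAN  # member no longer exists
except AttributeError as err:
    # Same failure mode as the traceback above.
    print(type(err).__name__)  # → AttributeError
```

So the script itself is fine; the registry entry it references simply went away between Ax versions.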
You can %pip install ax-platform==0.4.* for now.
Then we'll work on refactoring honegumi for 0.5. If you have a suggested fix for 0.5, let us know; otherwise, we'll work on converting all models to BOTORCH_MODULAR.
Models.SAASBO might still work. Looks like that's still part of Models.
@AniketChitre, the following should get you up and running in the meantime: https://colab.research.google.com/gist/sgbaird/e64d0c97056b3b0031cf96744d5e0634/saasbo.ipynb
%pip install ax-platform==0.4.3

import numpy as np
from ax.service.ax_client import AxClient, ObjectiveProperties
from ax.modelbridge.factory import Models
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy

obj1_name = "branin"


def branin(x1, x2):
    y = float(
        (x2 - 5.1 / (4 * np.pi**2) * x1**2 + 5.0 / np.pi * x1 - 6.0) ** 2
        + 10 * (1 - 1.0 / (8 * np.pi)) * np.cos(x1)
        + 10
    )
    return y


gs = GenerationStrategy(
    steps=[
        GenerationStep(
            model=Models.SOBOL,
            num_trials=4,  # how many Sobol trials to perform (rule of thumb: 2 * number of params)
            min_trials_observed=3,
            max_parallelism=5,
            model_kwargs={"seed": 999},
        ),
        GenerationStep(
            model=Models.SAASBO,
            num_trials=-1,
            max_parallelism=3,
        ),
    ]
)

ax_client = AxClient(generation_strategy=gs)

ax_client.create_experiment(
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-5.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [0.0, 10.0]},
    ],
    objectives={
        obj1_name: ObjectiveProperties(minimize=True),
    },
)

for i in range(6):
    parameterization, trial_index = ax_client.get_next_trial()
    # extract parameters
    x1 = parameterization["x1"]
    x2 = parameterization["x2"]

    results = branin(x1, x2)
    ax_client.complete_trial(trial_index=trial_index, raw_data=results)

best_parameters, metrics = ax_client.get_best_parameters()
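As a quick sanity check on the objective (standard library only, no Ax needed): the Branin function above has global minima of about 0.397887, one of them at (x1, x2) = (π, 2.275), which is a handy reference point for eyeballing the optimization results:

```python
import math


def branin(x1, x2):
    # Same objective as in the snippet above, using math instead of numpy.
    return float(
        (x2 - 5.1 / (4 * math.pi**2) * x1**2 + 5.0 / math.pi * x1 - 6.0) ** 2
        + 10 * (1 - 1.0 / (8 * math.pi)) * math.cos(x1)
        + 10
    )


# One of Branin's three global minima; all share the value ~0.397887.
print(round(branin(math.pi, 2.275), 6))  # → 0.397887
```

If the SAASBO loop is working, the best observed value should trend toward that number as trials accumulate.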
cc @AndrewFalkowski
Aside: package version should probably go into the Jinja templates. Likewise, #23 should help, too
@AniketChitre are you still blocked on this?
After some digging: fit_fully_bayesian_model_nuts is called in conjunction with FullyBayesianSingleTaskGP (GitHub).
The two kwargs of interest (num_samples and warmup_steps) seem to be associated with the fit function, but I don't think the BoTorch Modular or generation strategy docs cover this use case of injecting kwargs into the fit function (which isn't the typical fit function).
Aside: there's also SaasFullyBayesianSingleTaskGP. Not sure if this relates to the differences between frequentist vs. fully Bayesian. @AndrewFalkowski ?
Related changes:
- https://github.com/facebook/Ax/pull/2426/files
- https://github.com/facebook/Ax/pull/2425/commits
- https://github.com/saitcakmak/Ax/blob/bbb12bd1aa4173079a89735eefca69d45eb691be/ax/modelbridge/registry.py#L238-L250
@saitcakmak, is there an easy fix here to be able to control num_samples and warmup_steps (assuming ax-platform==0.4.3 for now)? If not, I may just leave out these kwargs for now and let them go to defaults (though typically we'd like to increase them since we're not as worried about overhead). For context, here's a Colab reproducer. If the answer is "this is really where you should use BOTORCH_MODULAR instead of the Models registry", we can roll with that too 🙂
Hi @sgbaird. You can control num_samples and warmup_steps of fit_fully_bayesian_model_nuts by passing these in as part of the mll_options dict of the corresponding ModelConfig. This setup is not ideal, since the mll_options mean different things for MAP & fully Bayesian models, but it's what we have right now.
If we take the existing registry entry
"SAASBO": ModelSetup(
bridge_class=TorchAdapter,
model_class=ModularBoTorchGenerator,
transforms=MBM_X_trans + Y_trans,
default_model_kwargs={
"surrogate_spec": SurrogateSpec(
botorch_model_class=SaasFullyBayesianSingleTaskGP
)
},
)
this would look like
"SAASBO": ModelSetup(
bridge_class=TorchAdapter,
model_class=ModularBoTorchGenerator,
transforms=MBM_X_trans + Y_trans,
default_model_kwargs={
"surrogate_spec": SurrogateSpec(
model_configs=[
ModelConfig(
botorch_model_class=SaasFullyBayesianSingleTaskGP,
mll_options={"num_samples": 256, "warmup_steps": 512, "thinning": 16}, # just the defaults
)
]
)
},
)
Thank you! I can give this a try
Also worth noting that you can always overwrite the surrogate_spec argument when constructing a GenerationNode/Step by passing it in as part of the model_kwargs. That's typically easier than adding a custom registry entry.
@AndrewFalkowski any thoughts on how to best proceed?
@saitcakmak
For the following:
gs = GenerationStrategy(
    steps=[
        GenerationStep(
            model=Models.SOBOL,
            num_trials=4,  # https://github.com/facebook/Ax/issues/922
            min_trials_observed=3,
            max_parallelism=5,
            model_kwargs={"seed": 999},
            model_gen_kwargs={},
        ),
        GenerationStep(
            model=Models.FULLYBAYESIAN,
            num_trials=-1,
            max_parallelism=3,
            model_kwargs={},
        ),
    ]
)
Would it be something like model_kwargs={"surrogate_spec": {"mll_options": {"num_samples": 256, "warmup_steps": 512, "thinning": 16}}}?
Hi @sgbaird! I just checked in this new tutorial that demonstrates these: https://ax.dev/docs/next/tutorials/modular_botorch/#the-modular-botorch-generator
In this case, you'd use something like this to avoid discarding the model class specified by Models.FULLYBAYESIAN (since it is specified using a surrogate_spec and you're providing an override):
surrogate_spec = SurrogateSpec(
    model_configs=[
        ModelConfig(
            botorch_model_class=SaasFullyBayesianSingleTaskGP,
            mll_options={"num_samples": 256, "warmup_steps": 512, "thinning": 16},
        )
    ],
)

model_kwargs = {
    "surrogate_spec": surrogate_spec,
}