
Multi-fidelity optimization with discrete fidelities #979

Closed
lxk42 opened this issue May 29, 2022 · 2 comments
Labels
wishlist Long-term wishlist feature requests

Comments

lxk42 commented May 29, 2022

Hello and first of all thank you for this great package!

I am trying to run "Multi-Fidelity Bayesian Optimization with Discrete Fidelities using KG", similar to the BoTorch tutorial of the same name. I found an example in issue #475, in which a continuous fidelity parameter is used, and tried to adjust it for a discrete fidelity. Simply changing the fidelity from a RangeParameter to a ChoiceParameter produces an error, since the fidelity parameter apparently cannot be "choice-encoded"; see the code example and traceback below.

Is the usage of discrete fidelities not supported, or am I missing some additional arguments?

Thanks a lot!

from ax.service.ax_client import AxClient
from botorch.test_functions.multi_fidelity import AugmentedHartmann
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
import torch

problem = AugmentedHartmann(negate=True)

def objective(parameters):
    # x7 is the fidelity
    x = torch.tensor([parameters.get(f"x{i+1}") for i in range(7)])
    return {'f': (problem(x).item(), 0.0)}

gs = GenerationStrategy(
    steps=[
        GenerationStep(
            model=Models.SOBOL,
            num_trials=16,
        ),
        GenerationStep(
            model=Models.GPKG,
            num_trials=-1,
            model_kwargs={'cost_intercept': 5},
            model_gen_kwargs={"num_fantasies": 128},
        ),
    ]
)

ax_client = AxClient(generation_strategy=gs)
ax_client.create_experiment(
    name="hartmann_mf_experiment",
    parameters=[
        {
            "name": "x1",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x2",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x3",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x4",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x5",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x6",
            "type": "range",
            "bounds": [0.0, 1.0],
        },
        {
            "name": "x7",
            "type": "choice",
            "values": [0.5, 0.75, 1.0],
            "is_fidelity": True,
            "is_ordered": True,
            "target_value": 1.0
        },
    ],
    objective_name="f",
)

for i in range(20):
    parameters, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(trial_index=trial_index, raw_data=objective(parameters))

Running this code produces the following error:

Traceback (most recent call last):

  File "/home/alex/code/ax/discrete_KG.py", line 84, in <module>
    parameters, trial_index = ax_client.get_next_trial()

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/utils/common/executils.py", line 147, in actual_wrapper
    return func(*args, **kwargs)

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/service/ax_client.py", line 466, in get_next_trial
    generator_run=self._gen_new_generator_run(), ttl_seconds=ttl_seconds

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/service/ax_client.py", line 1551, in _gen_new_generator_run
    return not_none(self.generation_strategy).gen(

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/generation_strategy.py", line 332, in gen
    return self._gen_multiple(

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/generation_strategy.py", line 455, in _gen_multiple
    self._fit_or_update_current_model(data=data)

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/generation_strategy.py", line 511, in _fit_or_update_current_model
    self._fit_current_model(data=self._get_data_for_fit(passed_in_data=data))

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/generation_strategy.py", line 657, in _fit_current_model
    self._curr.fit(experiment=self.experiment, data=data, **model_state_on_lgr)

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/generation_node.py", line 128, in fit
    model_spec.fit(  # Stores the fitted model as `model_spec._fitted_model`

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/model_spec.py", line 127, in fit
    self._fitted_model = self.model_enum(

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/registry.py", line 342, in __call__
    model_bridge = bridge_class(

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/base.py", line 168, in __init__
    obs_feats, obs_data, search_space = self._transform_data(

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/base.py", line 214, in _transform_data
    search_space = t_instance.transform_search_space(search_space)

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/transforms/base.py", line 90, in transform_search_space
    return self._transform_search_space(search_space=search_space)

  File "/home/alex/miniconda3/envs/science/lib/python3.9/site-packages/ax/modelbridge/transforms/choice_encode.py", line 161, in _transform_search_space
    raise ValueError(

ValueError: Cannot choice-encode fidelity parameter x7
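One possible workaround, until discrete fidelities are supported directly, is to follow the continuous-fidelity pattern from issue #475: declare x7 as a range parameter and snap the suggested value to the nearest allowed fidelity level inside the objective. This is only a sketch; the `snap_fidelity` helper and `FIDELITY_LEVELS` are hypothetical names, not part of Ax, and the commented-out fragments show how they would plug into the code above.

```python
# Workaround sketch: keep the fidelity continuous on the Ax side and
# snap it to the nearest allowed discrete level inside the objective.
# `snap_fidelity` and FIDELITY_LEVELS are hypothetical, not Ax APIs.

FIDELITY_LEVELS = (0.5, 0.75, 1.0)

def snap_fidelity(x, levels=FIDELITY_LEVELS):
    """Return the allowed fidelity level closest to the suggested value x."""
    return min(levels, key=lambda level: abs(level - x))

# In create_experiment, x7 would then be declared as a range parameter:
# {
#     "name": "x7",
#     "type": "range",
#     "bounds": [0.5, 1.0],
#     "is_fidelity": True,
#     "target_value": 1.0,
# }

# And the objective would snap x7 before evaluating the test function:
# def objective(parameters):
#     xs = [parameters.get(f"x{i+1}") for i in range(6)]
#     xs.append(snap_fidelity(parameters.get("x7")))
#     x = torch.tensor(xs)
#     return {"f": (problem(x).item(), 0.0)}
```

Note the caveat: the model still treats the fidelity as continuous, so KG may propose values between the allowed levels, and snapping introduces a mismatch between the fidelity the model believes it queried and the one actually evaluated.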

@Balandat (Contributor)

Hmm, looking through the code it seems that at this point Ax does indeed not support discrete fidelity parameters. I don't think there are any hard blockers to supporting this (after all, it works in BoTorch), but we'd probably have to take a closer look at the ModelBridge and Transform layers and make some changes there to support this.

cc @danielrjiang, @dme65

@lena-kashtelyan (Contributor)

At this time I think this is a wishlist item, so I'll merge it into our wishlist master-task!
