Multiobjective Multifidelity BO using the Service API #2514
Hi, thank you for opening this issue.
Hi @Abrikosoff. I will start by saying we don't actively use MF BO internally, so this functionality is not fully battle-tested. I have more detailed answers below, but after looking into this, I can't recommend that you keep using it. That said, if you are comfortable digging around with the debugger to make sure the correct arguments are constructed and passed in, you can get it to work.
Using some mocks, I see that the MFKG acquisition function is indeed being used under the hood. Whether it is utilizing all the arguments you provided is another question. You can use a mock like this to trigger an exception & use the debugger to inspect each argument to the acquisition function:
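For readers who want to reproduce that technique, here is a minimal, self-contained sketch of the mocking pattern using only the standard library. `FakeAcqf` is a hypothetical stand-in for `qMultiFidelityKnowledgeGradient`; in a real session you would instead `mock.patch` the BoTorch class path and then trigger candidate generation (e.g., via `ax_client.get_next_trial()`).

```python
# Hypothetical sketch: make construction of the acquisition function raise,
# then inspect the arguments that would have reached it.
from unittest import mock


class FakeAcqf:
    """Stand-in for the real acquisition function class."""

    def __init__(self, model=None, **kwargs):
        self.model = model


with mock.patch.object(
    FakeAcqf, "__init__", side_effect=RuntimeError("inspect args")
) as mocked_init:
    try:
        FakeAcqf(model="surrogate", num_fantasies=16)
    except RuntimeError:
        pass  # expected: the mock raised so we can stop and inspect

# call_args records everything passed to the (mocked) constructor
print(mocked_init.call_args.kwargs)  # → {'model': 'surrogate', 'num_fantasies': 16}
```

In practice you would set a breakpoint inside the `except` block (or run under `pdb`) and look at `mocked_init.call_args` to see exactly which kwargs reached the acquisition function.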
I haven't spent too much time investigating these args, but I do see these being passed into the model.
I am pretty sure these are not the arguments you are passing in though. I believe these are the defaults constructed by the input constructor: https://github.com/pytorch/botorch/blob/main/botorch/acquisition/input_constructors.py#L1244-L1254
I don't think this is necessarily the case. It looks like you were getting an error with Choice parameters, but there shouldn't be any issue with using integer-valued Range parameters. You can update the definition of
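For reference, here is a sketch of what an integer-valued fidelity `RangeParameter` could look like in the Service API's dict format; the parameter name, bounds, and target value are assumptions made up for illustration.

```python
# Hypothetical integer-valued fidelity parameter for ax_client.create_experiment;
# the name "x3", the bounds, and the target_value are illustrative assumptions.
fidelity_parameter = {
    "name": "x3",
    "type": "range",
    "bounds": [1, 3],
    "value_type": "int",  # integer Range parameter instead of a Choice parameter
    "is_fidelity": True,
    "target_value": 3,    # the fidelity that final recommendations target
}
print(fidelity_parameter["value_type"])
```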
Looks like we end up fitting a
While optimizing the acquisition function, if you wanted to fix
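For context, BoTorch's `optimize_acqf` accepts a `fixed_features` dict that maps a tensor column index to the value it should be pinned at during optimization. The helper below is a made-up, stdlib-only caricature of that idea, not BoTorch code:

```python
# fixed_features maps column index -> pinned value (here: pin column 2,
# e.g. a fidelity dimension, at its target value 1.0).
fixed_features = {2: 1.0}


def apply_fixed_features(x, fixed):
    """Return a copy of candidate x with the fixed coordinates overridden."""
    return [fixed.get(i, xi) for i, xi in enumerate(x)]


print(apply_fixed_features([0.3, 0.7, 0.5], fixed_features))  # → [0.3, 0.7, 1.0]
```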
Summary: I believe this was originally designed based on the legacy MF models, but it appears to be incompatible with what MBM evolved to be. Here are a few reasons for removing it:
- Some of the kwargs that are produced by `MultiFidelityAcquisition.compute_model_dependencies` (e.g., `cost_aware_utility`) are not accepted by the MFKG input constructor.
- The main kwargs that are needed by the MF acquisition functions (`cost_aware_utility`, `project`, `expand`) are readily constructed by the MF input constructors: https://github.com/pytorch/botorch/blob/main/botorch/acquisition/input_constructors.py#L1221
- We strongly discourage subclassing MBM components and rather encourage expanding the base components to support new use cases. In this case, the MF input constructors seem to have replaced what `MultiFidelityAcquisition` originally aimed to do, while being compatible with the design philosophy.

Discovered while investigating facebook#2514

Differential Revision: D58560934
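For readers unfamiliar with the `cost_aware_utility` argument mentioned above: the idea behind BoTorch's `InverseCostWeightedUtility` is to score a candidate by value gained per unit evaluation cost, so cheap low-fidelity evaluations can win even when they are individually less informative. A toy sketch with a made-up affine cost model:

```python
def evaluation_cost(fidelity, fixed_cost=5.0, weight=1.0):
    """Made-up affine cost model: cheap at low fidelity, expensive at high."""
    return fixed_cost + weight * fidelity


def inverse_cost_weighted_utility(improvement, fidelity):
    """Improvement per unit cost, the quantity MF acquisition functions trade off."""
    return improvement / evaluation_cost(fidelity)


# The same improvement is worth less per unit cost at the expensive fidelity:
print(inverse_cost_weighted_utility(2.0, fidelity=0.0))  # 2.0 / 5.0 = 0.4
print(inverse_cost_weighted_utility(2.0, fidelity=5.0))  # 2.0 / 10.0 = 0.2
```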
@saitcakmak Dear Sait, thanks a lot for the information! I have some follow-ups regarding the points you raised:
The actual reason I've been passing these in explicitly rather than relying on the defaults is the docstring for
Hence I was under the impression that if I leave these out the acqf automatically reverts to a non-MF one. Is this not the case? If it is, and since you found out that the MFKG was indeed being used, can I assume that these have been passed in (of course, it might be working because the defaults were being invoked and my actual params were NOT passed in, which raises the question of how to correctly pass these)?
Thank you very much for this! Really a palm-to-the-forehead moment for me :(
Actually, to my (very limited) knowledge, isn't this how MOBO is supposed to work? If you look at the BoTorch documentation for MOBO, especially where the model is initialized, you find:
(in our case we have
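The pattern that part of the BoTorch MOBO tutorial shows, fitting one independent surrogate per outcome and wrapping them in a `ModelListGP`, can be caricatured with plain Python. These are toy stand-in classes, not BoTorch code:

```python
class ToySurrogate:
    """Stand-in for a per-outcome GP; just memorizes the outcome mean."""

    def __init__(self, ys):
        self.mean = sum(ys) / len(ys)

    def predict(self, x):
        return self.mean  # a real GP would condition on x


class ToyModelList:
    """Stand-in for ModelListGP: one independent model per objective."""

    def __init__(self, *models):
        self.models = models

    def predict(self, x):
        return [m.predict(x) for m in self.models]


# two objectives -> two independent surrogates
model = ToyModelList(ToySurrogate([1.0, 3.0]), ToySurrogate([10.0, 20.0]))
print(model.predict(0.5))  # → [2.0, 15.0]
```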
Thanks for this! Edit: I have actually tried to use
and running the repro shown above, but I get thrown the error. Perhaps there is a quick fix for this? Edit 2: I found the tutorial for registration here, which I've redone in the form
And now I am getting the error
which makes sense, as I have not yet actually defined
@sgbaird If you are interested in updates
Summary: Pull Request resolved: #2520

I believe this was originally designed based on the legacy MF models, but it appears to be incompatible with what MBM evolved to be. Here are a few reasons for removing it:
- Some of the kwargs that are produced by `MultiFidelityAcquisition.compute_model_dependencies` (e.g., `cost_aware_utility`) are not accepted by the MFKG input constructor.
- The main kwargs that are needed by the MF acquisition functions (`cost_aware_utility`, `project`, `expand`) are readily constructed by the MF input constructors: https://github.com/pytorch/botorch/blob/main/botorch/acquisition/input_constructors.py#L1221
- We strongly discourage subclassing MBM components and rather encourage expanding the base components to support new use cases. In this case, the MF input constructors seem to have replaced what `MultiFidelityAcquisition` originally aimed to do, while being compatible with the design philosophy.

Discovered while investigating #2514

Reviewed By: Balandat

Differential Revision: D58560934

fbshipit-source-id: fc58675eff4ff81dc0a4a93084e01f8a4c8e0efc
The way Ax constructs BoTorch acquisition functions involves using acquisition function input constructors (e.g., this one) to convert the data available on the Ax experiment to the inputs expected by the acquisition function. The input constructors often define default behaviors. In the case of MFKG, this involves constructing the cost utility, expand, and project arguments. The search space includes a fidelity parameter with a target fidelity, so this is used to figure out what target fidelity to project to, and so on.
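That dispatch mechanism can be caricatured as a registry keyed by acquisition function, with each registered constructor filling in defaults the caller did not supply. The code below is a toy sketch of the pattern with made-up names, not Ax's or BoTorch's actual implementation:

```python
# Toy registry mimicking the input-constructor dispatch pattern.
ACQF_INPUT_CONSTRUCTORS = {}


def acqf_input_constructor(acqf_name):
    """Toy decorator registering an input constructor for an acqf name."""
    def decorator(fn):
        ACQF_INPUT_CONSTRUCTORS[acqf_name] = fn
        return fn
    return decorator


@acqf_input_constructor("qMultiFidelityKnowledgeGradient")
def construct_mfkg_inputs(model, target_fidelities, **kwargs):
    # Fill in defaults the caller did not supply (mirrors the role of the
    # real input constructor, which also builds project/expand/cost utility).
    return {
        "model": model,
        "target_fidelities": target_fidelities,
        "num_fantasies": kwargs.get("num_fantasies", 64),
    }


inputs = ACQF_INPUT_CONSTRUCTORS["qMultiFidelityKnowledgeGradient"](
    model="surrogate", target_fidelities={2: 1.0}
)
print(inputs["num_fantasies"])  # → 64 (default applied)
```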
Addressed in the other issue.
The job of the input constructor is to take the inputs passed in from Ax (this is done here) and convert them into the inputs required for the
@saitcakmak Thanks a million! I'm keeping this open for now, as I plan to come back and put up a working repro later, at which point I'll close it; in the meantime I might have some more questions :-)
@saitcakmak So I went and took a look at the code, which I am reproducing here:
My understanding is that a hypothetical
while implies that I need to define my own
This piece of code errors out for me:
saying that there is no registered input constructor for it. This will be executed when trying to generate candidates with it, so even though you can construct a GS (which doesn't involve any checks on model kwargs), it will not work for generating candidates. I'd like to help further here, but I will not be around for the next two weeks and I have other things to wrap up before I leave. Let's leave this issue open so we can follow up and fix any remaining gaps for MF support in MBM.
@saitcakmak @esantorella Hi guys, so I went ahead and tried to set up the input constructor like so:
and defined the GS as follows:
but now I am having this error: Edit: So I did the naivest possible thing and modified the BoTorch step in the GS above to the following:
which throws me an error I wasn't expecting: Edit 2: As suggested in the tutorial, I think what should go in here is an implementation of a
I looked into this and hacked my way around setting a
However, the next problem (and that one seems more serious) is that this will throw a
@sdaulton you know that part of the codebase best - can you shed some light on what the limitations here are, why this isn't supported, and what would be needed to support it?
We just haven't tested MF-HVKG with trace observations, so we would need to do some due diligence to make sure the shapes are correct. MF-HVKG will work fine with trace observations in principle, but practically we should make sure the implementation supports it. @Abrikosoff, in the meantime, you can get around this simply by setting
Dear Ax Team,
I am currently trying to run a MOMF use case with the Service API; I have been mostly consulting the BoTorch tutorials for this, and so far I have come up with the following repro, which seems to run but which I am not sure is working correctly (details below):
I have mainly two (or three) questions regarding this repro:

1. The BoTorch tutorial uses `qMultiFidelityHypervolumeKnowledgeGradient`, but here it seems that `qMultiFidelityKnowledgeGradient` works as well. Why is that?
2. It seems that the fidelity parameter (`x3`) is not being taken into account in the whole optimization process. What is actually the correct way to deal with this parameter?

Thanks in advance, and thank you as well for the replies on my previous questions!