
Validate that an MO acqf is used for MOO in MBM/Acquisition #2913

Closed
wants to merge 6 commits

Conversation

saitcakmak
Contributor

Summary:
This diff adds a validation that `botorch_acqf_class` is an MO acqf when `TorchOptConfig.is_moo` is True. This should eliminate bugs like #2519, which can happen since the downstream code will otherwise assume SOO.

Note that this only solves the MBM side of the bug. Legacy code will still have the buggy behavior.

Differential Revision: D64563992

mpolson64 and others added 6 commits October 17, 2024 08:44
Differential Revision: D64542424
Summary:
>         opt_config_metrics: A dictionary of metrics that are included in the optimization config.

This change makes the field consistent with its docstring and helps simplify some code.

Differential Revision: D64543635
…cquisition

Summary:
The previous logic relied on imperfect proxies for `is_moo`. This information is readily available on `TorchOptConfig`, so we can directly utilize it.

Also simplified the function signature and updated the error message for robust optimization.

Differential Revision: D64545104
Summary: This does not have any usage since the multi-fidelity acquisition class has been removed. This type of functionality is better served by the acquisition function input constructors, which superseded the previous multi-fidelity functionality. Removing the method helps simplify the (admittedly very complex) Acquisition constructor.

Differential Revision: D64556772
Summary: Legacy models have been deprecated for quite some time and are being cleaned up. If you're interested in using KG in Ax, you can pass `botorch_acqf_class=qKnowledgeGradient` as part of the `model_kwargs` to MBM (`Models.BoTorch`).

Differential Revision: D64561219
Summary:
This diff adds a validation that botorch_acqf_class is an MO acqf when `TorchOptConfig.is_moo is True`. This should eliminate bugs like facebook#2519, which can happen since the downstream code will otherwise assume SOO.

Note that this only solves the MBM side of the bug. Legacy code will still have the buggy behavior.

Differential Revision: D64563992
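The check described above can be sketched in plain Python. This is a hedged illustration of the validation idea, not the actual Ax/MBM implementation: the class names `AcquisitionFunction`, `MultiObjectiveAcquisitionFunction`, and the helper `validate_moo_acqf` are hypothetical stand-ins for the BoTorch class hierarchy and the check this PR adds.

```python
# Hypothetical stand-ins for the BoTorch acquisition class hierarchy.
class AcquisitionFunction:
    """Base class for (single-objective) acquisition functions."""


class MultiObjectiveAcquisitionFunction(AcquisitionFunction):
    """Base class for multi-objective acquisition functions."""


def validate_moo_acqf(botorch_acqf_class: type, is_moo: bool) -> None:
    """Raise if a MOO optimization config is paired with an acquisition
    class that is not multi-objective (mirrors the check this PR adds,
    gated on ``TorchOptConfig.is_moo``)."""
    if is_moo and not issubclass(
        botorch_acqf_class, MultiObjectiveAcquisitionFunction
    ):
        raise ValueError(
            f"{botorch_acqf_class.__name__} is not a multi-objective "
            "acquisition function, but the optimization config is MOO. "
            "Downstream code would otherwise silently assume SOO."
        )
```

With such a check in place, a single-objective acqf paired with `is_moo=True` fails loudly at model setup rather than producing the subtly wrong behavior reported in #2519.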
@facebook-github-bot facebook-github-bot added the CLA Signed Do not delete this pull request or issue due to inactivity. label Oct 17, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D64563992

@saitcakmak
Contributor Author

abandoned

@saitcakmak saitcakmak closed this Oct 18, 2024
@saitcakmak saitcakmak deleted the export-D64563992 branch October 18, 2024 17:58