
Validate that an MO acqf is used for MOO in MBM/Acquisition #2913

Closed
6 commits

Commits on Oct 17, 2024

  1. pyre upgrade

    Differential Revision: D64542424
    mpolson64 authored and facebook-github-bot committed Oct 17, 2024
    f293479
  2. Always set TorchOptConfig.opt_config_metrics in TorchModelBridge

    Summary:
    >         opt_config_metrics: A dictionary of metrics that are included in the optimization config.
    
    This change makes the field consistent with its docstring and helps simplify some code.
    
    Differential Revision: D64543635
    saitcakmak authored and facebook-github-bot committed Oct 17, 2024
    0e282c4
  3. Update selection of botorch_acqf_class in BoTorchModel._instantiate_acquisition
    
    Summary:
    The previous logic relied on imperfect proxies for `is_moo`. This information is readily available on `TorchOptConfig`, so we can directly utilize it.
    
    Also simplified the function signature and updated the error message for robust optimization.
    
    Differential Revision: D64545104
    saitcakmak authored and facebook-github-bot committed Oct 17, 2024
    31621ff
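The simplified selection that this commit describes, keying directly off `is_moo` instead of proxy inference, can be sketched as follows. This is a hypothetical standalone function, not the actual Ax implementation, and the returned class names are illustrative choices rather than Ax's confirmed defaults:

```python
def choose_botorch_acqf_class(is_moo: bool) -> str:
    """Pick a default acquisition function class name.

    With `is_moo` available directly on TorchOptConfig, the selection no
    longer needs to infer multi-objective status from imperfect proxies.
    """
    if is_moo:
        # Multi-objective: a hypervolume-based acquisition function.
        return "qLogNoisyExpectedHypervolumeImprovement"
    # Single-objective default.
    return "qLogNoisyExpectedImprovement"
```
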
  4. Clean up Acquisition.compute_model_dependencies

    Summary: This method has no remaining usage since the multi-fidelity acquisition class was removed. This type of functionality is better served by the acquisition function input constructors, which superseded the previous multi-fidelity functionality. Removing the method helps simplify the, admittedly very complex, Acquisition constructor.
    
    Differential Revision: D64556772
    saitcakmak authored and facebook-github-bot committed Oct 17, 2024
    56be5f5
  5. Delete legacy KnowledgeGradient model

    Summary: Legacy models have been deprecated for quite some time and are being cleaned up. If you're interested in using KG in Ax, you can pass `botorch_acqf_class=qKnowledgeGradient` as part of the `model_kwargs` to MBM (`Models.BoTorch`).
    
    Differential Revision: D64561219
    saitcakmak authored and facebook-github-bot committed Oct 17, 2024
    8392eec
  6. Validate that an MO acqf is used for MOO in MBM/Acquisition

    Summary:
    This diff adds a validation that `botorch_acqf_class` is an MO acqf when `TorchOptConfig.is_moo is True`. This should eliminate bugs like facebook#2519, which can occur because the downstream code will otherwise assume SOO.
    
    Note that this only fixes the MBM side of the bug. Legacy code will still have the buggy behavior.
    
    Differential Revision: D64563992
    saitcakmak authored and facebook-github-bot committed Oct 17, 2024
    9c194f5
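A minimal, self-contained sketch of the validation this diff describes. The stand-in classes and the function name below are illustrative assumptions, not the actual Ax/BoTorch code; the real check would test against BoTorch's multi-objective acquisition base classes:

```python
class AcquisitionFunction:
    """Stand-in for botorch.acquisition.AcquisitionFunction."""


class MultiObjectiveAcquisitionFunction(AcquisitionFunction):
    """Stand-in base class for multi-objective acquisition functions."""


def validate_botorch_acqf_class(botorch_acqf_class: type, is_moo: bool) -> None:
    # If the optimization config is multi-objective, require an MO acqf class;
    # otherwise downstream code would silently assume single-objective
    # optimization (the failure mode behind facebook#2519).
    if is_moo and not issubclass(
        botorch_acqf_class, MultiObjectiveAcquisitionFunction
    ):
        raise ValueError(
            f"botorch_acqf_class {botorch_acqf_class.__name__} is not a "
            "multi-objective acquisition function, but the optimization "
            "config is multi-objective."
        )
```

For example, passing a single-objective class with `is_moo=True` raises, while an MO class passes the check.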