
[Feature Request] optimize_acqf_homotopy doesn't handle constraints #2579

Closed
CompRhys opened this issue Oct 16, 2024 · 7 comments
Labels
enhancement New feature or request

Comments

@CompRhys
Contributor

🚀 Feature Request

Would you accept a PR to add constraint support to optimize_acqf_homotopy, e.g. inequality_constraints, equality_constraints, and nonlinear_inequality_constraints? Looking at the code, this seems more like an omission than a technical challenge to address.

Motivation

I want to compare gradient-free MOO vs. SEBO for a mixtures problem requiring sparsity. I believe other people are working on similar problems and would benefit from this functionality (facebook/Ax#2790).

Pitch

Describe the solution you'd like
The change appears as though it should be as simple as making all the arguments of optimize_acqf arguments of optimize_acqf_homotopy. This can be done either explicitly or via kwargs (whichever the maintainers prefer).

Are you willing to open a pull request? (See CONTRIBUTING)

I am happy to make the PR.

@CompRhys added the enhancement (New feature or request) label Oct 16, 2024
@saitcakmak
Contributor

Hi @CompRhys. Thanks for flagging this. A PR adding constraint support to optimize_acqf_homotopy would be very welcome!

@CompRhys
Contributor Author

My preference for implementing this would be to add optimize_acqf_loop_kwargs: dict[str, Any] | None = None and optimize_acqf_final_kwargs: dict[str, Any] | None = None as the args to the optimize_acqf_homotopy function, as this is cleanest and prevents divergence in the future; however, it would be a breaking change to the code. I think it's worth splitting out two kwarg dicts because whoever implemented this originally decided to split out options and final_options, suggesting that there are use cases where the final optimization should use different settings. Is this acceptable, or do we need to devise a way to make it non-breaking? A rough sketch of the proposed signature is below.
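A hypothetical sketch of what that signature could look like (the argument names, defaults, and stub body here are purely illustrative, not a final design):

    from typing import Any

    from torch import Tensor


    def optimize_acqf_homotopy(
        acq_function,
        bounds: Tensor,
        q: int,
        homotopy,
        # kwargs forwarded to the per-step (loop) optimize_acqf calls
        optimize_acqf_loop_kwargs: dict[str, Any] | None = None,
        # kwargs forwarded only to the final optimize_acqf call
        optimize_acqf_final_kwargs: dict[str, Any] | None = None,
    ) -> tuple[Tensor, Tensor]:
        ...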

@saitcakmak
Contributor

I'd like to avoid adding catch-all dict-valued arguments to the code. The options kwarg goes against this, but it is a common arg across the optimize_acqf variants that is used to modify low-level settings that are not worth exposing as top-level kwargs.

For constraints, would we expect them to be different between the homotopy steps and the final optimization? They only constrain the search space, so I would expect them to be the same for both. If that's the case, we can just mirror the signature of optimize_acqf, add the three args, and pass them down to both steps:

    inequality_constraints: list[tuple[Tensor, Tensor, float]] | None = None,
    equality_constraints: list[tuple[Tensor, Tensor, float]] | None = None,
    nonlinear_inequality_constraints: list[tuple[Callable, bool]] | None = None,
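For illustration, the linear constraints are (indices, coefficients, rhs) tuples interpreted as sum_i X[indices[i]] * coefficients[i] >= rhs (or == rhs for equality constraints), and the nonlinear constraints are (callable, is_intra_point) pairs where callable(x) >= 0 means feasible. The specific constraints below are made up purely for the sake of a concrete example:

    import torch

    # x_0 + x_1 + x_2 <= 1 (mixture-style constraint), written in >= form by
    # negating the coefficients and the right-hand side, plus x_0 >= 0.1.
    inequality_constraints = [
        (torch.tensor([0, 1, 2]), torch.tensor([-1.0, -1.0, -1.0]), -1.0),
        (torch.tensor([0]), torch.tensor([1.0]), 0.1),
    ]

    # x_1 + x_2 == 0.5
    equality_constraints = [
        (torch.tensor([1, 2]), torch.tensor([1.0, 1.0]), 0.5),
    ]

    # ||x||^2 <= 1, evaluated on each candidate point (intra-point constraint)
    nonlinear_inequality_constraints = [
        (lambda x: 1.0 - x.pow(2).sum(dim=-1), True),
    ]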

@CompRhys
Contributor Author

I think one problem with not doing it via catch-alls is that you end up with what's currently in the code, which makes it very confusing whether arguments to the loop and final optimizations are intentionally or accidentally missing. For example, currently in the code, if you passed fixed_features it would only be used in the loop optimize_acqf and not in the final optimize_acqf. I think that the catch-all here avoids the chance that this code needs repeated (and not clearly telegraphed) maintenance over time.
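To make the fixed_features point concrete (the index and value here are made up for illustration):

    # fixed_features maps a feature index to the value it should be pinned to
    # during optimization, e.g. fix dimension 2 of every candidate at 0.0.
    # As described above, this dict currently reaches the per-step (loop)
    # optimize_acqf calls but not the final optimize_acqf call.
    fixed_features = {2: 0.0}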

Accepting that it will lead to a greater maintenance burden over time, I am happy to just add in the extra arguments, but if we choose not to create the two catch-all dicts, the only way to be philosophically consistent in the design here would be to remove the final_options argument. The fact that this was differentiated in the code already makes me believe that the original author could in principle have thought that there were important cases where optimization should be different for the loop and final optimizations, and additional constraints could be an example of that.

In the draft PR I have implemented the catch-all dict method (https://github.com/pytorch/botorch/pull/2580/files). It still needs some extra tests to check that the constraints work out of the box, but to me the logic is fairly clean (although I added a bunch of warnings about incorrect use of arguments).

@saitcakmak
Contributor

For example, currently in the code, if you passed fixed_features it would only be used in the loop optimize_acqf and not in the final optimize_acqf.

That very much sounds like a bug. I think we should fix it rather than using it as inspiration for broader design.

Accepting that it will lead to a greater maintenance burden over time, I am happy to just add in the extra arguments, but if we choose not to create the two catch-all dicts, the only way to be philosophically consistent in the design here would be to remove the final_options argument. The fact that this was differentiated in the code already makes me believe that the original author could in principle have thought that there were important cases where optimization should be different for the loop and final optimizations, and additional constraints could be an example of that.

@dme65 can you provide context on why we have different options at different stages?

@dme65
Contributor

dme65 commented Oct 21, 2024

For example, currently in the code, if you passed fixed_features it would only be used in the loop optimize_acqf and not in the final optimize_acqf.

This is indeed a bug, thanks!

can you provide context on why we have different options at different stages?

The main reason for this is that we don't need to solve the problem to high accuracy for all homotopy steps but the last one (all we are trying to find is a starting point for the next subproblem that is close enough to the optimum). As an example, we could do something like options={"maxiter": 1000, "ftol": 1e-3}, final_options={"maxiter": 5000, "ftol": 1e-6}. This will speed up optimize_acqf_homotopy since we solve all but the last subproblem with a looser tolerance.
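As a minimal sketch of that split (the values are illustrative, not recommendations):

    # Looser stopping criteria for the intermediate homotopy steps, where we
    # only need a good-enough starting point for the next subproblem ...
    options = {"maxiter": 1000, "ftol": 1e-3}
    # ... and tighter criteria for the final optimization, which produces the
    # candidate that is actually returned.
    final_options = {"maxiter": 5000, "ftol": 1e-6}

    # These would then be passed alongside the usual acquisition function,
    # bounds, q, and homotopy arguments, e.g.
    # optimize_acqf_homotopy(..., options=options, final_options=final_options)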

@CompRhys
Contributor Author

As an example, we could do something like options={"maxiter": 1000, "ftol": 1e-3}, final_options={"maxiter": 5000, "ftol": 1e-6}. This will speed up optimize_acqf_homotopy since we solve all but the last subproblem with a looser tolerance.

This makes sense to me if options is where you would pass the relevant kwargs. If that is the case, I think the options docstring could benefit from being updated, as the current docstring made me believe that options only corresponded to the initialization rather than the optimizer kwargs, and hence that there was no need to split. In part, my original implementation with two different kwarg dicts was driven by the same idea that we might want to run the loop optimization with different settings.

        options: Options for candidate generation.

is very vague; elsewhere in the code there are optimizer_options arguments.

facebook-github-bot pushed a commit that referenced this issue Oct 22, 2024
Summary:
This PR seeks to address: #2579

`optimize_acqf_homotopy` is a fairly minimal wrapper around `optimize_acqf` but didn't have all the constraint functionality.

This PR copies over all of the arguments that we could in principle want to use into `optimize_acqf_homotopy`. For the time being, `final_options` has been kept. The apparent bug with fixed features not being passed to the final optimization has been fixed.

A simple dict rather than the `OptimizeAcqfInputs` dataclass is used to store the shared parameters.
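Illustratively, the shared-parameters pattern might look roughly like the following (the exact keys and structure in the merged code may differ):

    # Arguments common to the loop and final optimize_acqf calls are collected
    # in a single dict, so a new argument only needs to be added in one place
    # and cannot silently diverge between the two stages.
    shared_optimize_acqf_args = {
        "q": 1,
        "num_restarts": 10,
        "raw_samples": 512,
        "inequality_constraints": None,
        "equality_constraints": None,
        "nonlinear_inequality_constraints": None,
        "fixed_features": None,
    }
    # loop steps:  optimize_acqf(..., options=options, **shared_optimize_acqf_args)
    # final step:  optimize_acqf(..., options=final_options, **shared_optimize_acqf_args)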

## Related PRs

The original approach in #2580 made use of kwargs which was opposed due to being less explicit.

Pull Request resolved: #2588

Reviewed By: Balandat

Differential Revision: D64694021

Pulled By: saitcakmak

fbshipit-source-id: a10f00f0d069e411e6f12e9eafaec0eba454493d