
Add Inequality Constraints to SEBO #2938

Open
wants to merge 4 commits into main
Conversation


@CompRhys CompRhys commented Oct 22, 2024

Following pytorch/botorch#2588 being merged, there is no longer any reason to prevent people from using inequality constraints with SEBO. This PR removes the ValueError check and adjusts the tests accordingly.

Connected to #2790
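For context, BoTorch expresses a linear inequality constraint as an (indices, coefficients, rhs) triple meaning sum_i coefficients[i] * x[indices[i]] >= rhs. A minimal, dependency-free sketch of that convention (plain Python lists standing in for the torch tensors BoTorch actually expects; the `satisfies` helper is illustrative, not part of the Ax/BoTorch API):

```python
# BoTorch-style linear inequality constraints: (indices, coefficients, rhs),
# interpreted as sum_i coefficients[i] * x[indices[i]] >= rhs.
# Example: x0 + x1 >= 1.0 on a 3-dimensional search space.
inequality_constraints = [
    ([0, 1], [1.0, 1.0], 1.0),
]

def satisfies(x, constraints=inequality_constraints):
    """Check a candidate point against every linear inequality constraint."""
    return all(
        sum(c * x[i] for i, c in zip(idx, coeffs)) >= rhs
        for idx, coeffs, rhs in constraints
    )
```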

@facebook-github-bot facebook-github-bot added the CLA Signed Do not delete this pull request or issue due to inactivity. label Oct 22, 2024

@Balandat Balandat left a comment


Thanks - if you don't mind fixing that one small linter nit the rest lgtm!

(Review comment on ax/models/torch/tests/test_sebo.py; outdated, resolved.)

@Balandat Balandat left a comment


Actually there are a couple of other lint failures as well: https://github.com/facebook/Ax/actions/runs/11468956837/job/31951921724?pr=2938


codecov-commenter commented Oct 23, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 95.65%. Comparing base (a85c7ee) to head (a560e9f).
Report is 50 commits behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2938   +/-   ##
=======================================
  Coverage   95.64%   95.65%           
=======================================
  Files         486      486           
  Lines       48871    48879    +8     
=======================================
+ Hits        46745    46753    +8     
  Misses       2126     2126           


@CompRhys

Merged the changes from #2935 and then fixed the flake8 complaints. I went ahead and fixed everything flake8 complained about in the repo, rather than just the files I touched.

I'm so used to relying on pre-commit that I always forget to lint manually when there's no hook.

@facebook-github-bot

@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.


@Balandat Balandat left a comment


Actually, you will want to make sure that you take the inequality constraints into account in get_batch_initial_conditions() when you call it inside _optimize_with_homotopy().

@CompRhys

The naive solution would be to set batch_initial_conditions to None when inequality constraints are given, which would cause optimization to fall back to the default behavior of BoTorch's gen_batch_initial_conditions. Would that be sufficient to be an acceptable solution here? Whether this is okay depends on how important the additional sampling around the Pareto front is to performance (ping @dme65, as this was changed in the last few days).

Adding something more involved would require a bit more thought.
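The naive fallback described above could be sketched roughly as follows (`choose_initial_conditions` and its arguments are hypothetical, not the actual Ax API; the real logic lives inside the SEBO acquisition code):

```python
def choose_initial_conditions(inequality_constraints, pareto_points):
    # Hypothetical sketch of the naive fallback: when inequality
    # constraints are present, return None so that optimize_acqf falls
    # back to BoTorch's gen_batch_initial_conditions, which knows how to
    # respect linear inequality constraints.
    if inequality_constraints is not None:
        return None
    # Otherwise keep the existing SEBO behaviour: seed the acquisition
    # optimization from points on the current Pareto front.
    return pareto_points
```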

@souravdey94

Is SEBO available with parameter constraints in the latest version of Ax?

@CompRhys

No, this was closed because I did a force-push that broke the connection to this repo. I hope to get this sorted out this week.

@CompRhys CompRhys reopened this Nov 20, 2024
@CompRhys

> Actually, you will want to make sure that you take the inequality constraints into account in get_batch_initial_conditions() when you call it inside _optimize_with_homotopy().

@Balandat Finally getting back to this. To reduce the behaviour change and stay backward compatible for people using SEBO + L0, I added pytorch/botorch#2636 to BoTorch; we can then use the BoTorch method to sample with constraints and concatenate the points from the Pareto front onto that.

With the L1 setup, I believe that changing the behaviour to also make use of the Pareto points + noise should help, so I shifted the logic such that the same initial batch conditions are used for both the L1 and L0 choices.
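The approach described here could be sketched as follows. Simple rejection sampling stands in for BoTorch's constrained gen_batch_initial_conditions, constraints are simplified to (a, b) pairs meaning a @ x >= b, and the function name and signature are illustrative rather than the actual Ax API:

```python
import numpy as np

def constrained_initial_conditions(
    bounds, constraints, pareto_points, n_raw=16, noise_scale=0.05, seed=0
):
    """Sketch: constraint-satisfying raw samples + noisy Pareto points.

    Rejection sampling stands in here for BoTorch's constrained
    initializer; `constraints` is a list of (a, b) pairs meaning
    a @ x >= b. Illustrative only, not the Ax/BoTorch API.
    """
    rng = np.random.default_rng(seed)
    lower, upper = bounds
    # Draw uniform points in the box and keep those satisfying every
    # linear inequality constraint.
    samples = []
    while len(samples) < n_raw:
        x = rng.uniform(lower, upper)
        if all(a @ x >= b for a, b in constraints):
            samples.append(x)
    raw = np.stack(samples)
    # Perturb the Pareto-front points with Gaussian noise and clip them
    # back into the box bounds, so both the L0 and L1 homotopy paths
    # start from the same initial batch.
    noisy = np.clip(
        pareto_points
        + noise_scale * rng.standard_normal(pareto_points.shape),
        lower,
        upper,
    )
    return np.concatenate([raw, noisy], axis=0)
```

Note that the noisy Pareto points are only clipped to the box bounds, not re-projected onto the inequality constraints; handling that properly is part of why the real change is more involved.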
