Hi Ax Team,

I've been trying to implement a 'local' NChooseK constraint, where one can impose the constraint on any segment of the parameters (e.g., with parameters x1-x6, I want a 2-choose-1 constraint on x1, x2). The code below is a runnable repro:
```python
import torch
from torch.quasirandom import SobolEngine
from typing import List

from ax.modelbridge.generation_strategy import GenerationStrategy, GenerationStep
from ax.modelbridge.registry import Models
from ax.service.ax_client import AxClient, ObjectiveProperties
from botorch.test_functions.multi_objective import BraninCurrin, DTLZ2

dim_of_problem = 6
q = 1

# Load our sample 2-objective problem
branin_currin = BraninCurrin(negate=True).to(
    dtype=torch.double,
    device=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
)
test_dtlz2 = DTLZ2(dim=dim_of_problem, negate=True)


def narrow_gaussian(x, ell):
    return torch.exp(-0.5 * (x / ell) ** 2)


def create_ineq_constraint(var_idx: List[int], non_zero: int):
    def ineq_constraint_on_vars(x: torch.Tensor, ell: float = 1e-3):
        """Each callable is expected to take a `(num_restarts) x q x d`-dim
        tensor as input and return a `(num_restarts) x q`-dim tensor with the
        constraint values."""
        x_slice = x[..., var_idx[0] : var_idx[1] + 1]
        return narrow_gaussian(x_slice, ell).sum(dim=-1) - (x_slice.shape[-1] - non_zero)

    return ineq_constraint_on_vars


def setup_ineqs(var_idxs: List[List[int]], non_zeros: List[int]):
    """Set up the inequality constraints for the optimization problem."""
    return [
        create_ineq_constraint(var_idx, non_zero)
        for var_idx, non_zero in zip(var_idxs, non_zeros)
    ]


def setup_nonlinear_constraints_list(idx_list: List[int]):
    # Define the nonlinear constraints for the optional parameters.
    nonlinear_constraints_list = []
    for idx in idx_list:
        nonlinear_constraints_list.append([idx, idx + 1])
    # One non_zero entry per constrained pair (a bare [1] would be truncated
    # by zip to a single constraint).
    return setup_ineqs(nonlinear_constraints_list, [1] * len(idx_list))


def get_feasible_sobol_points_multisubspace(
    n,
    list_of_nck: List[List[int]],
    list_of_non_zeros: List[int],
    dim_of_problem=dim_of_problem,
):
    """Sobol sequence where we apply N-choose-K constraints to any subset of dimensions.

    Args:
        n: number of samples to draw
        list_of_nck: list of form [[start1, end1], [start2, end2], ...] which
            specifies the start and end indices (inclusive) of the variables
            to be constrained
        list_of_non_zeros: list of form [k1, k2, ...] which specifies the
            number of non-zero elements in each constrained subset
        dim_of_problem: dimension of the problem

    Returns:
        X: n x dim_of_problem tensor with applied constraints
    """
    # Generate Sobol sequence.
    X = SobolEngine(dimension=dim_of_problem, scramble=True).draw(n).to(torch.double)
    # Create a mask initialized with ones.
    mask = torch.ones(n, dim_of_problem)
    for (start, end), k in zip(list_of_nck, list_of_non_zeros):
        subset_size = end - start + 1
        # Generate random choices for each subset.
        choices = torch.zeros(n, subset_size)
        choices[:, :k] = 1
        # Shuffle the choices for each sample.
        for i in range(n):
            choices[i] = choices[i][torch.randperm(subset_size)]
        # Apply the choices to the mask.
        mask[:, start : end + 1] = choices
    # Apply mask to X.
    return X * mask


def get_batch_initial_conditions_multisubspace(
    num_restarts, raw_samples, q, list_of_nck, list_of_non_zeros
):
    X = get_feasible_sobol_points_multisubspace(
        n=raw_samples * q,
        list_of_nck=list_of_nck,
        list_of_non_zeros=list_of_non_zeros,
        dim_of_problem=dim_of_problem,
    )
    X = X.reshape(torch.Size((raw_samples, q, dim_of_problem)))
    num_samples = X.shape[0]  # Total number of samples in X
    random_indices = torch.randperm(num_samples)[:num_restarts]
    return X[random_indices]


# Example usage
list_of_nck = [[0, 2]]  # at most 1 non-zero among x1-x3
list_of_non_zeros = [1]

ineq_constraints = setup_ineqs(list_of_nck, list_of_non_zeros)
batch_initial_conditions = get_batch_initial_conditions_multisubspace(
    5, 100, 1, list_of_nck, list_of_non_zeros
)

generation_strategy = GenerationStrategy(
    steps=[
        GenerationStep(
            model=Models.SOBOL,
            num_trials=1,  # https://github.com/facebook/Ax/issues/922
            min_trials_observed=1,
            max_parallelism=6,
            model_kwargs={"seed": 9999},
            model_gen_kwargs={"model_gen_options": {}},
        ),
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,
            model_gen_kwargs={
                "model_gen_options": {
                    "optimizer_kwargs": {
                        "nonlinear_inequality_constraints": ineq_constraints,
                        "batch_initial_conditions": batch_initial_conditions,
                        "options": {"batch_limit": 1, "maxiter": 200},
                    },
                }
            },
        ),
    ]
)

ax_client = AxClient(generation_strategy=generation_strategy)
ax_client.create_experiment(
    name="hartmann_test_experiment",
    parameters=[
        {
            "name": "x1",
            "type": "range",
            "bounds": [0.0, 1.0],
            "value_type": "float",  # Optional, defaults to inference from type of "bounds".
            "log_scale": False,  # Optional, defaults to False.
        },
        {"name": "x2", "type": "range", "bounds": [0.0, 1.0]},
        {"name": "x3", "type": "range", "bounds": [0.0, 1.0]},
        {"name": "x4", "type": "range", "bounds": [0.0, 1.0]},
        {"name": "x5", "type": "range", "bounds": [0.0, 1.0]},
        # {
        #     "name": "x5",
        #     "type": "fixed",
        #     "value": 0.0,
        # },
        {"name": "x6", "type": "range", "bounds": [0.0, 1.0]},
    ],
    objectives={
        # `threshold` arguments are optional.
        "a": ObjectiveProperties(minimize=False, threshold=test_dtlz2.ref_point[0]),
        "b": ObjectiveProperties(minimize=False, threshold=test_dtlz2.ref_point[1]),
    },
    parameter_constraints=["x1 + x2 <= 2.0"],  # Optional.
    # outcome_constraints=["l2norm <= 1.25"],  # Optional.
)


def evaluate(parameters):
    evaluation = test_dtlz2(
        torch.tensor(
            [
                parameters.get("x1"),
                parameters.get("x2"),
                parameters.get("x3"),
                parameters.get("x4"),
                parameters.get("x5"),
                parameters.get("x6"),
            ]
        )
    )
    # Standard error is 0 here, since we are computing a synthetic function.
    # Set it to None if the noise level is unknown.
    return {"a": (evaluation[0].item(), 0.0), "b": (evaluation[1].item(), 0.0)}


for i in range(25):
    parameterization, trial_index = ax_client.get_next_trial()
    # Local evaluation here can be replaced with deployment to an external system.
    ax_client.complete_trial(trial_index=trial_index, raw_data=evaluate(parameterization))
```
Now I have noticed that if I have a fixed parameter in my search space (as commented out above), I get an error:

```
ValueError: batch_initial_conditions.shape[-1] must be 5. The shape is torch.Size([5, 1, 6]).
```

I looked under the hood, and it seems that the last dimension of the `bounds` tensor needs to match the last dimension of `batch_initial_conditions`, but a fixed parameter apparently does not contribute to `bounds`. Is this intended behavior, and if so, is there a workaround?
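To illustrate what I think is happening (a toy sketch, not Ax code): with x5 fixed, the transformed search space, and hence `bounds`, has 5 columns, while my hand-built initial conditions still cover all 6 raw dimensions:

```python
import torch

bounds = torch.zeros(2, 5)                      # x5 dropped from the model's search space
batch_initial_conditions = torch.rand(5, 1, 6)  # num_restarts x q x full dim
# This is (I believe) the comparison that triggers the ValueError:
print(bounds.shape[-1] == batch_initial_conditions.shape[-1])  # False
```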
Hi @Abrikosoff, fixed parameters are typically removed via the `RemoveFixed` transform, since they aren't needed for modeling or candidate generation. The easiest thing to do would be to add the fixed parameter back into the input inside the non-linear constraint callable when you define it (knowing that it won't be passed in).
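For example, something along these lines (untested sketch; the `fixed_idx`/`fixed_value` arguments are hypothetical and correspond to your commented-out fixed `x5`):

```python
import torch
from typing import List


def create_ineq_constraint_with_fixed(
    var_idx: List[int], non_zero: int, fixed_idx: int = 4, fixed_value: float = 0.0
):
    # Variant of `create_ineq_constraint` from the repro: the optimizer only
    # passes the 5 tunable dims (x5 removed by RemoveFixed), so we re-insert
    # the fixed column before slicing.
    def ineq_constraint_on_vars(x: torch.Tensor, ell: float = 1e-3):
        fixed_col = torch.full(x.shape[:-1] + (1,), fixed_value, dtype=x.dtype)
        x_full = torch.cat([x[..., :fixed_idx], fixed_col, x[..., fixed_idx:]], dim=-1)
        x_slice = x_full[..., var_idx[0] : var_idx[1] + 1]
        # `narrow_gaussian` as defined in the repro above.
        return narrow_gaussian(x_slice, ell).sum(dim=-1) - (x_slice.shape[-1] - non_zero)

    return ineq_constraint_on_vars
```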
Alternatively, you could remove the `RemoveFixed` transform so that the fixed parameter is included in the model's search space. The default transforms are `Cont_X_trans + Y_trans`, and you can override them by passing a `transforms` list via `model_kwargs` on the `GenerationStep`:
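Something like this (untested, assuming the current registry layout):

```python
from ax.modelbridge.generation_strategy import GenerationStep
from ax.modelbridge.registry import Cont_X_trans, Models, Y_trans
from ax.modelbridge.transforms.remove_fixed import RemoveFixed

# Default transforms minus RemoveFixed, so the fixed parameter stays in the
# model's search space (and hence contributes a column to `bounds`).
transforms = [t for t in Cont_X_trans if t is not RemoveFixed] + Y_trans

GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_kwargs={"transforms": transforms},
    # model_gen_kwargs as in your repro
)
```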