
Param mapping #182

Open · wants to merge 9 commits into base: master

Conversation

hellkite500 (Member)
This should allow a calibration parameter to set an alias key, and any params across any models sharing the same alias name will be treated as the same parameter for the purpose of permutation generation.

Note: #180 needs to merge first.
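For intuition, here is a minimal, hypothetical sketch of the grouping behavior described above. It is not the PR's actual implementation; the Parameter fields, helper name, and alias values are placeholders.

```python
from collections import defaultdict
from typing import Optional

from pydantic import BaseModel


class Parameter(BaseModel):
    # Placeholder model; the real ngen.cal Parameter differs.
    name: str
    min: float
    max: float
    init: float
    alias: Optional[str] = None


def group_by_alias(module_params: dict[str, list[Parameter]]) -> dict[str, list[tuple[str, Parameter]]]:
    """Group (module, parameter) pairs by alias, falling back to the parameter name."""
    groups: dict[str, list[tuple[str, Parameter]]] = defaultdict(list)
    for module, params in module_params.items():
        for p in params:
            groups[p.alias or p.name].append((module, p))
    return groups


# "a" in modules A and B shares the alias "smax", so it collapses to a single
# entry in the calibration parameter space; B's "b" remains its own entry.
params = {
    "A": [Parameter(name="a", min=0.0, max=1.0, init=0.5, alias="smax")],
    "B": [
        Parameter(name="a", min=0.0, max=1.0, init=0.5, alias="smax"),
        Parameter(name="b", min=0.0, max=2.0, init=1.0),
    ],
}
assert set(group_by_alias(params)) == {"smax", "b"}
```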

hellkite500 requested a review from aaraney on August 28, 2024 at 14:23.
aaraney (Member) commented Aug 28, 2024

This should allow a calibration parameter to set an alias key, and any params across any models sharing the same alias name will be treated as the same parameter for the purpose of permutation generation.

Just to clarify (I haven't looked at the code yet): the default behavior is that if two modules have the same calibration parameter name (no aliases involved), there is effectively a single parameter in the parameter space that is calibrated, so on the next iteration both modules will get the same value for that parameter?

Related, but somewhat of an aside: what happens (in the code at HEAD) if module/param combinations A.a and B.a are both defined with different bounds and default values? Which, if any, takes precedence?

```diff
-from pydantic import BaseModel, Field
-from typing import Sequence
+from pydantic import BaseModel, Field, root_validator
+from typing import Sequence, Mapping, Optional

 class Parameter(BaseModel, allow_population_by_field_name = True):
```

In looking at the tests, what are your thoughts on validating that:

  • init, min, and max are not nan (on the fence about this one)
  • init is in the bounds of min and max
  • min <= max?

These seem like sane invariants that we should uphold.
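A minimal sketch of what those checks could look like, assuming the pydantic v1 style root_validator used in this PR. The field names mirror the Parameter model in the diff; everything else is illustrative rather than the PR's actual code.

```python
import math
from typing import Optional

from pydantic import BaseModel, root_validator


class Parameter(BaseModel, allow_population_by_field_name=True):
    min: float
    max: float
    init: float
    alias: Optional[str] = None

    @root_validator(skip_on_failure=True)
    def _check_invariants(cls, values):
        lo, hi, init = values["min"], values["max"], values["init"]
        # Reject NaNs first so the ordering checks below are meaningful.
        if any(math.isnan(v) for v in (lo, hi, init)):
            raise ValueError("min, max, and init must not be NaN")
        if lo > hi:
            raise ValueError(f"min ({lo}) must be <= max ({hi})")
        if not (lo <= init <= hi):
            raise ValueError(f"init ({init}) must lie within [min, max] = [{lo}, {hi}]")
        return values
```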

aaraney (Member) commented Aug 28, 2024

As an aside, we should add a warning here if a Parameter mapping has been configured for a module that is not present (a sketch follows the snippet below).

```python
dfs.append(_params_as_df(params, m.params.model_name))

return params
```
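A minimal, hypothetical sketch of that warning; the actual configuration objects in ngen.cal differ, so the function and argument names here are placeholders.

```python
import warnings
from typing import Iterable, Mapping, Sequence


def warn_on_missing_modules(
    mapped_params: Mapping[str, Sequence[str]],  # alias -> module names it is configured for
    present_modules: Iterable[str],               # module names actually in the realization
) -> None:
    """Emit a warning for any parameter mapping that references a module that is not present."""
    present = set(present_modules)
    for alias, modules in mapped_params.items():
        missing = sorted(m for m in modules if m not in present)
        if missing:
            warnings.warn(
                f"Parameter mapping '{alias}' is configured for module(s) not present: {missing}",
                UserWarning,
            )
```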

```python
@pytest.fixture
def multi_model_shared_params2() -> Mapping[str, list[Parameter]]:
```

At first glance, this overlaps with multi_model_shared_params such that I don't think we need it. Can we just get rid of this fixture?

```diff
@@ -11,5 +11,13 @@ class Parameter(BaseModel, allow_population_by_field_name = True):
     min: float
     max: float
     init: float
+    alias: Optional[str]
+
+    @root_validator
```

Minor improvement; without it, a field validation failure would still run this root validator with an incomplete values dict, which could surface as a misleading error (e.g., a KeyError) instead of the real one.

Suggested change:

```diff
-    @root_validator
+    @root_validator(skip_on_failure=True)
```
