
Variationally Informed Parameterization #276

Merged · 20 commits from vip-reparameterization into main · Dec 15, 2023
Conversation

ferrine
Member

@ferrine ferrine commented Dec 2, 2023

This PR adds a tool to automatically reparameterize your model following https://arxiv.org/abs/1906.03028

[image: figure from the paper]
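For context, the VIP scheme from the linked paper interpolates between the centered (λ = 1) and non-centered (λ = 0) parameterizations of a location-scale variable. A minimal NumPy sketch of the forward transform (function and variable names are mine, not the PR's API):

```python
import numpy as np

def vip_forward(z_tilde, mu, sigma, lam):
    """Map the partially non-centered draw z_tilde ~ N(lam*mu, sigma**lam)
    back to the original scale: z ~ N(mu, sigma) for any lam in [0, 1]."""
    return mu + sigma ** (1.0 - lam) * (z_tilde - lam * mu)

rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0
for lam in (0.0, 0.5, 1.0):
    z_tilde = rng.normal(lam * mu, sigma**lam, size=200_000)
    z = vip_forward(z_tilde, mu, sigma, lam)
    # Regardless of lam, z recovers the original N(mu, sigma) distribution.
    assert abs(z.mean() - mu) < 0.05 and abs(z.std() - sigma) < 0.05
```

At λ = 1 the transform is the identity (fully centered); at λ = 0 it reduces to the familiar `mu + sigma * z_tilde` non-centered form.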

The intended API looks like this.

First, you need to convert your model to change its structure:

[image: code example converting the model]

Then you use your samplers

[image: sampling code example]

For stability, you might want to turn on clipping for lambdas

[image: lambda-clipping code example]
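The exact clipping API isn't shown here, but the idea is to keep each λ strictly inside (0, 1) so that `sigma**lam` and its gradients stay finite at the endpoints. A generic NumPy sketch (the function name and epsilon value are arbitrary choices of mine, not the PR's interface):

```python
import numpy as np

def clip_lambdas(lam, eps=0.01):
    """Keep interpolation weights strictly inside (0, 1) for numerical stability."""
    return np.clip(lam, eps, 1.0 - eps)

# Values outside [eps, 1 - eps] are pulled back to the boundary.
clipped = clip_lambdas(np.array([-0.2, 0.5, 1.3]))
```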

Member

@ricardoV94 ricardoV94 left a comment


I would put this in model/transforms/...

which is where this sort of stuff lives in the PyMC repo

@ferrine ferrine force-pushed the vip-reparameterization branch from fa8f763 to 9443b63 Compare December 2, 2023 17:55
@ferrine
Member Author

ferrine commented Dec 2, 2023

Anything else I should do in this PR?

@ricardoV94
Member

Content looks good. Perhaps add an entry in the docs (we have to start introducing headers and sections, though).

@ricardoV94
Member

Oh, and a copy-pastable example in the docstrings?

@ferrine
Member Author

ferrine commented Dec 2, 2023

Ah, sure, docs are needed

vip_rv = model_free_rv(
    vip_rv_,
    vip_rv_.clone(),
    transform,
Member Author


@ricardoV94 should I worry about this?

Member


Might be safer to raise if transform is not None, like you do with the size. I can only imagine people using OrderedTransform but you never know the things people try :D

)
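The guard the reviewer suggests could look something like this (a hypothetical sketch with a made-up helper name, not the merged code):

```python
def check_unsupported_transform(transform):
    # The VIP machinery only knows how to reparameterize plain normals;
    # bail out loudly instead of silently producing a wrong graph.
    if transform is not None:
        raise NotImplementedError(
            f"VIP reparameterization does not support transform {transform!r} yet"
        )
```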


@singledispatch
Member


Seems a bit overkill at this point?

Member Author


I'd extend it to StudentT, Gamma, LogitNormal, etc.
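For readers unfamiliar with the pattern under discussion: `functools.singledispatch` selects the reparameterization rule by distribution type, so new distributions can be registered later without touching the core code. A toy sketch with made-up class and function names:

```python
from functools import singledispatch

class Normal: ...
class StudentT: ...

@singledispatch
def vip_rule(dist):
    # Fallback for distributions with no registered reparameterization.
    raise NotImplementedError(f"No VIP rule registered for {type(dist).__name__}")

@vip_rule.register
def _(dist: Normal):
    return "location-scale rule for Normal"

# Extending to a new distribution is just one more registration:
@vip_rule.register
def _(dist: StudentT):
    return "location-scale rule for StudentT"
```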

@ferrine
Member Author

ferrine commented Dec 11, 2023

@ricardoV94 this is failing for some unknown reason

@ricardoV94
Member

ricardoV94 commented Dec 11, 2023

> @ricardoV94 this is failing for some unknown reason

I think a rewrite from the most recent PyTensor is leading to an error in the R2D2 prior tests. Either a rewrite bug or an issue with the R2D2. Will have to investigate.

@ricardoV94
Member

Ah, it seems to be just a warning, probably from NumPy when doing something with -inf... Have to check whether it's harmless and, if so, just ignore/catch it in the test.
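For reference, a `-inf` produced by e.g. `log(0)` triggers a `RuntimeWarning` in NumPy that can be silenced locally if it turns out to be harmless. A generic sketch of the ignore-locally approach (not the actual failing test):

```python
import numpy as np

# np.log(0.0) emits "divide by zero encountered in log" but returns -inf,
# which is often a perfectly valid log-probability for a zero-density point.
with np.errstate(divide="ignore"):
    logp = np.log(0.0)

assert logp == -np.inf
assert np.exp(logp) == 0.0  # downstream math with -inf stays well defined
```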

@ferrine
Member Author

ferrine commented Dec 12, 2023

@ricardoV94 what are the next steps?

@ricardoV94
Member

Need to check why the warning is being triggered and see if it can be safely ignored

@ferrine ferrine force-pushed the vip-reparameterization branch from b49a13a to 7a563c9 Compare December 15, 2023 19:28
@ferrine
Member Author

ferrine commented Dec 15, 2023

it should be all good now

@ferrine ferrine merged commit c00c368 into main Dec 15, 2023
6 checks passed
@ricardoV94 ricardoV94 added the enhancements New feature or request label Jan 3, 2024
@ricardoV94 ricardoV94 deleted the vip-reparameterization branch October 4, 2024 07:21