
Deep copy of distributions and parameters #30

Open
cranmer opened this issue Mar 4, 2016 · 1 comment

@cranmer
Member

cranmer commented Mar 4, 2016

I have some code like this (similar to n-d example)

```python
cc_parametrized_ratio = ClassifierRatio(CalibratedClassifierCV(
    base_estimator=clf,
    cv="prefit",  # keep the pre-trained classifier
    method="isotonic"))
cc_parametrized_ratio.fit(numerator=p0, denominator=p1, n_samples=10000)
```

My understanding is that the fit in the last line is basically running the calibration.
I would also guess that it is using the current value of the theano shared variables (parameters).

In the process of a likelihood scan when the parameters are changing, this would be changing both p0 and p1. Is it possible to keep p1 fixed? I.e., is there a way to take a snapshot of the distribution p1 that won't change as the theano variables change values?

@glouppe
Contributor

glouppe commented Mar 4, 2016

> My understanding is that the fit in the last line is basically running the calibration.

Yes

> I would also guess that it is using the current value of the theano shared variables (parameters).

Yes

> In the process of a likelihood scan when the parameters are changing, this would be changing both p0 and p1. Is it possible to keep p1 fixed?

They would change only if you explicitly change them. If you want to keep p1 fixed, then you should not change its parameters (and in particular, you should not define p1 using parameter and/or component objects shared across distinct distributions).
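A minimal sketch of that point. The `SharedValue` and `Gaussian` classes here are hypothetical plain-Python stand-ins for a theano shared variable and a carl distribution, not the real APIs:

```python
class SharedValue:
    """Hypothetical stand-in for a theano shared variable."""
    def __init__(self, value):
        self.value = value

    def set_value(self, value):
        self.value = value

    def get_value(self):
        return self.value


class Gaussian:
    """Hypothetical stand-in for a parameterized distribution."""
    def __init__(self, mu):
        self.mu = mu  # holds a SharedValue, not a plain number


# Sharing one parameter object: updating it moves BOTH distributions.
theta = SharedValue(0.0)
p0 = Gaussian(mu=theta)
p1 = Gaussian(mu=theta)
theta.set_value(1.0)
assert p1.mu.get_value() == 1.0  # p1 moved along with p0

# Giving p1 its own parameter object keeps it fixed during a scan.
theta0 = SharedValue(0.0)
p0 = Gaussian(mu=theta0)
p1 = Gaussian(mu=SharedValue(0.0))  # independent parameter object
theta0.set_value(1.0)               # the scan updates only p0's parameter
assert p0.mu.get_value() == 1.0
assert p1.mu.get_value() == 0.0     # p1 is unchanged
```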

Does that answer your question?

> I.e., is there a way to take a snapshot of the distribution p1 that won't change as the theano variables change values?

Notwithstanding, it might be helpful to define a clone method for making a deep copy of a distribution along with all its parameters, such that the parameters of the clone are actual distinct copies of the original parameters.
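A rough sketch of such a clone, assuming `copy.deepcopy` is acceptable for the parameter objects (real theano shared variables may need custom `__deepcopy__` handling, so this is illustrative only; the `Parameter` and `Distribution` classes are hypothetical):

```python
import copy


def clone(distribution):
    """Return a deep copy whose parameters are distinct objects,
    so later updates to the original leave the copy untouched."""
    return copy.deepcopy(distribution)


# Illustration with a hypothetical distribution holding a mutable parameter:
class Parameter:
    def __init__(self, value):
        self.value = value


class Distribution:
    def __init__(self, mu):
        self.mu = mu


p1 = Distribution(mu=Parameter(0.0))
snapshot = clone(p1)

p1.mu.value = 2.0                  # a likelihood scan moves the original
assert snapshot.mu.value == 0.0    # the snapshot is unaffected
assert snapshot.mu is not p1.mu    # parameters are actual distinct copies
```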

@glouppe glouppe changed the title clarification on CalibratedClassifierCV Deep copy of distributions and parameters Apr 20, 2016