Relaxing constant sigma assumption #7
You could make \sigma^2 learnable as well, rather than keeping it fixed.
Thanks for the reply! Do you mean e.g. outputting another quantity connected to this part of the code (NeuralProcesses/NP_architecture2.R, lines 59 to 66 at 5119ac0)?

That seems straightforward, but I wasn't sure how to justify it, since the decoder looked like it should only predict the target y's and we needed to obtain the variance elsewhere. But I guess we would be taking the samples of z as input, so that the randomness is accounted for.
It depends on what kind of noise model you want to assume. The most natural one would probably be the one that assumes constant noise. E.g., in the GP regression model, a typical choice for p(y | f, x) would be a Normal distribution with mean f(x) and variance \sigma^2, i.e. the latter would not depend on the input x. In this case, \sigma^2 would be a single variable (not parameterised by a network). If we are interested in scenarios where the noise level varies with x, then we could indeed consider parameterising \sigma^2 along the lines you described.
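For concreteness, here is a minimal sketch of the two options in R using the tensorflow package; `h` (standing in for the decoder's last hidden layer) and the other names are assumptions for illustration, not the repo's actual identifiers:

```r
library(tensorflow)

# Option 1: constant noise. sigma^2 is a single trainable variable shared
# across all inputs x -- the constant-sigma assumption, but learned rather
# than fixed. Parameterising via log_sigma keeps sigma positive.
log_sigma <- tf$Variable(0, dtype = tf$float32)
sigma     <- tf$exp(log_sigma)

# Option 2: input-dependent (heteroscedastic) noise. sigma becomes an extra
# output head of the decoder; 'h' is assumed to be the decoder's last hidden
# layer, so sigma_x varies with the input (and with the sampled z).
sigma_raw <- tf$layers$dense(h, units = 1L)
sigma_x   <- 0.1 + 0.9 * tf$nn$softplus(sigma_raw)  # positive, floored away from zero
```

In either case, the Gaussian log-likelihood term in the loss would then use this sigma in place of a hard-coded constant.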
If we wanted to make this bit (NeuralProcesses/NP_architecture2.R, lines 68 to 71 at 5119ac0) more general, what would be the correct way to do it? Would we try to estimate it from the n_draws draws of each of the y* predictions?
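For the Monte Carlo route mentioned above, a sketch in plain R; `y_star_draws` is a hypothetical matrix of decoder outputs, not a variable in the repo:

```r
# Sketch: estimate the predictive spread from the n_draws samples of y*,
# one column per draw of z. 'y_star_draws' is assumed to be an
# [n_targets x n_draws] matrix of decoder means.
pred_mean <- rowMeans(y_star_draws)
pred_var  <- apply(y_star_draws, 1, var)  # spread induced by z alone

# Caveat: this only measures the variability coming from z. Under the
# constant-noise model, the total predictive variance would also include
# the observation noise, e.g. total_var <- pred_var + sigma^2.
```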