
LPBNN_layers bug? #3

Open
jadie1 opened this issue Jun 25, 2023 · 1 comment

Comments


jadie1 commented Jun 25, 2023

Hello, thanks for providing your code.

I have a question: LPBNN_layers line 75 is:
embedded_mean, embedded_logvar=self.encoder_fcmean(embedded),self.encoder_fcmean(embedded)
Should this not be:
embedded_mean, embedded_logvar=self.encoder_fcmean(embedded),self.encoder_fcvar(embedded)
As written, it forces the mean and the log-variance to be identical. This bug is present in every layer defined in LPBNN_layers.
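To make the suggested fix concrete, here is a minimal PyTorch sketch of what the corrected encoder step would look like. The class name, layer sizes, and the reparameterization step are hypothetical; only the `encoder_fcmean`/`encoder_fcvar` call pattern comes from the issue:

```python
import torch
import torch.nn as nn

class LatentEncoder(nn.Module):
    """Minimal sketch of a VAE-style embedding encoder with separate
    heads for the mean and the log-variance (sizes are hypothetical)."""

    def __init__(self, embed_dim=32, latent_dim=8):
        super().__init__()
        self.encoder_fcmean = nn.Linear(embed_dim, latent_dim)
        self.encoder_fcvar = nn.Linear(embed_dim, latent_dim)

    def forward(self, embedded):
        # The fix: the log-variance comes from its own head,
        # not from encoder_fcmean a second time.
        embedded_mean = self.encoder_fcmean(embedded)
        embedded_logvar = self.encoder_fcvar(embedded)
        # Reparameterization trick: z = mean + std * eps
        std = torch.exp(0.5 * embedded_logvar)
        eps = torch.randn_like(std)
        z = embedded_mean + eps * std
        return z, embedded_mean, embedded_logvar
```

With the original code, `embedded_mean` and `embedded_logvar` would always be the same tensor values; with separate heads they are independent learnable projections.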

Additionally, I was wondering why VAE embedding is applied only for alpha and not for gamma. Is there a benefit to only defining alpha as Bayesian? Was this the case for the results reported in your paper?

@giannifranchi
Owner

Thanks for your comment. We will correct that mistake and retrain by the end of this week.

Best,

2 participants