
defining mu and rho for conv layers #76

Open
byungukP opened this issue Mar 21, 2024 · 1 comment

@byungukP

According to the source code (masif/source/masif_modules), the mus and sigmas for rho and theta are defined differently for convolutional layer 1 than for the later layers. Layers 1 and 2 initialize mu_rho/mu_theta from the initial polar coordinates and sigma_rho/sigma_theta from the sigma initializers, but in the layer-3 and layer-4 blocks the attributes, initializers, and variable names appear to be mixed up: for example, self.sigma_rho_l3 is initialized with mu_theta_initial under the name "mu_theta_l3", while self.mu_theta_l3 gets the sigma_rho initializer under the name "sigma_rho_l3". Is this a trivial mistake, or is there a reason the variables are defined this way?

        for i in range(self.n_feat):
            self.mu_rho.append(
                tf.Variable(mu_rho_initial, name="mu_rho_{}".format(i))
            )  # 1, n_gauss
            self.mu_theta.append(
                tf.Variable(mu_theta_initial, name="mu_theta_{}".format(i))
            )  # 1, n_gauss
            self.sigma_rho.append(
                tf.Variable(
                    np.ones_like(mu_rho_initial) * self.sigma_rho_init,
                    name="sigma_rho_{}".format(i),
                )
            )  # 1, n_gauss
            self.sigma_theta.append(
                tf.Variable(
                    (np.ones_like(mu_theta_initial) * self.sigma_theta_init),
                    name="sigma_theta_{}".format(i),
                )
            )  # 1, n_gauss
        if n_conv_layers > 1:
            self.mu_rho_l2 = tf.Variable(
                mu_rho_initial, name="mu_rho_{}".format("l2")
            )
            self.mu_theta_l2 = tf.Variable(
                mu_theta_initial, name="mu_theta_{}".format("l2")
            )
            self.sigma_rho_l2 = tf.Variable(
                np.ones_like(mu_rho_initial) * self.sigma_rho_init,
                name="sigma_rho_{}".format("l2"),
            )
            self.sigma_theta_l2 = tf.Variable(
                (np.ones_like(mu_theta_initial) * self.sigma_theta_init),
                name="sigma_theta_{}".format("l2"),
            )
        if n_conv_layers > 2:
            self.mu_rho_l3 = tf.Variable(
                mu_rho_initial, name="mu_rho_{}".format("l3")
            )
            self.sigma_rho_l3 = tf.Variable(
                mu_theta_initial, name="mu_theta_{}".format("l3")
            )
            self.mu_theta_l3 = tf.Variable(
                np.ones_like(mu_rho_initial) * self.sigma_rho_init,
                name="sigma_rho_{}".format("l3"),
            )
            self.sigma_theta_l3 = tf.Variable(
                (np.ones_like(mu_theta_initial) * self.sigma_theta_init),
                name="sigma_theta_{}".format("l3"),
            )
        if n_conv_layers > 3:
            self.mu_rho_l4 = tf.Variable(
                mu_rho_initial, name="mu_rho_{}".format("l4")
            )
            self.sigma_rho_l4 = tf.Variable(
                mu_theta_initial, name="mu_theta_{}".format("l4")
            )
            self.mu_theta_l4 = tf.Variable(
                np.ones_like(mu_rho_initial) * self.sigma_rho_init,
                name="sigma_rho_{}".format("l4"),
            )
            self.sigma_theta_l4 = tf.Variable(
                (np.ones_like(mu_theta_initial) * self.sigma_theta_init),
                name="sigma_theta_{}".format("l4"),
            )
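For comparison, if the layer-3 block were meant to follow the same pattern as the layer-2 block above, it would presumably pair each attribute with its matching initializer and name as sketched below. This is only a sketch with hypothetical initializer values, and the tf.Variable wrappers are replaced by plain NumPy arrays so it runs standalone:

```python
import numpy as np

# Hypothetical stand-ins for the initializers used in the snippet above.
mu_rho_initial = np.array([[0.5, 1.0]])      # shape (1, n_gauss)
mu_theta_initial = np.array([[0.0, np.pi]])  # shape (1, n_gauss)
sigma_rho_init = 0.1
sigma_theta_init = 0.2

# Consistent layer-3 definitions, mirroring the l2 pattern:
# each attribute gets the initializer (and would get the tf.Variable
# name) that matches its own role.
mu_rho_l3 = mu_rho_initial                                   # name "mu_rho_l3"
mu_theta_l3 = mu_theta_initial                               # name "mu_theta_l3"
sigma_rho_l3 = np.ones_like(mu_rho_initial) * sigma_rho_init       # "sigma_rho_l3"
sigma_theta_l3 = np.ones_like(mu_theta_initial) * sigma_theta_init  # "sigma_theta_l3"
```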

@BJWiley233

I'm trying to follow your question. Are you asking about the order of definition? i.e.

from:
mu rho
mu theta
sigma rho
sigma theta

to:
mu rho
sigma rho
mu theta
sigma theta
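Note that merely reordering the four assignments would define the same set of variables either way; the two orderings above are equivalent. A minimal sketch (with hypothetical initializer values, and dicts standing in for the tf.Variable attributes):

```python
import numpy as np

# Toy initializers (hypothetical values).
mu_rho_initial = np.array([[0.5, 1.0]])
mu_theta_initial = np.array([[0.0, 3.14]])
sigma_rho_init, sigma_theta_init = 0.1, 0.2

def define(order):
    """Build the four per-layer parameters in the given definition order."""
    vals = {
        "mu_rho": mu_rho_initial,
        "mu_theta": mu_theta_initial,
        "sigma_rho": np.ones_like(mu_rho_initial) * sigma_rho_init,
        "sigma_theta": np.ones_like(mu_theta_initial) * sigma_theta_init,
    }
    return {name: vals[name] for name in order}

a = define(["mu_rho", "mu_theta", "sigma_rho", "sigma_theta"])
b = define(["mu_rho", "sigma_rho", "mu_theta", "sigma_theta"])
# Either ordering yields identical name -> value pairings.
assert all(np.array_equal(a[k], b[k]) for k in a)
```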
