[Possible Bug?] Inconsistency of Feature Update in models.py #8

Open
Linnore opened this issue Jun 27, 2024 · 0 comments
Linnore commented Jun 27, 2024

I noticed that in the forward methods of models.py, the node and edge updates are written as:

# Inside forward methods of GINe, GATe, and PNA
# Pay attention to the `edge_attr` line
for i in range(self.num_gnn_layers):
    x = (x + F.relu(self.batch_norms[i](self.convs[i](x, edge_index, edge_attr)))) / 2
    if self.edge_updates:
        edge_attr = edge_attr + self.emlps[i](torch.cat([x[src], x[dst], edge_attr], dim=-1)) / 2

As I understand it, without the /2 the features update with a residual design. Using /2 to take an average is also fine, since the GNN here is not deep.
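For concreteness, here is a minimal sketch of the two node-update rules under discussion (random tensors stand in for the real features; `conv_out` is a hypothetical stand-in for the conv/batch-norm output, not a name from models.py):

import torch
import torch.nn.functional as F

x = torch.randn(5, 8)         # stand-in node features
conv_out = torch.randn(5, 8)  # stand-in for self.batch_norms[i](self.convs[i](x, edge_index, edge_attr))

residual = x + F.relu(conv_out)        # residual update: keeps x at full weight
averaged = (x + F.relu(conv_out)) / 2  # averaged update, as written in models.py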

However, it is confusing that x is updated with an average while edge_attr is not.
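Concretely, since / binds tighter than +, the /2 on the GINe/GATe/PNA edge line applies only to the MLP output, not to the sum, so the edge update adds half of the increment rather than averaging. A minimal sketch of the difference (`mlp_out` is a hypothetical stand-in for the output of self.emlps[i]):

import torch

edge_attr = torch.randn(10, 16)  # stand-in edge features
mlp_out = torch.randn(10, 16)    # stand-in for self.emlps[i](torch.cat([x[src], x[dst], edge_attr], dim=-1))

# GINe/GATe/PNA line: /2 applies only to the MLP term (residual with a halved increment)
half_increment = edge_attr + mlp_out / 2

# RGCN line: /2 applies to the whole sum (a true average of old and new edge features)
averaged = (edge_attr + mlp_out) / 2

assert not torch.allclose(half_increment, averaged)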

In contrast, the forward method of RGCN updates both x and edge_attr with /2:

# Inside forward method of RGCN
# Pay attention to the `edge_attr` line
for i in range(self.num_gnn_layers):
    x = (x + F.relu(self.batch_norms[i](self.convs[i](x, edge_index, edge_attr)))) / 2
    if self.edge_updates:
        edge_attr = (edge_attr + self.emlps[i](torch.cat([x[src], x[dst], edge_attr], dim=-1))) / 2

Is this an intentional design?
