
Why is the number of nodes used as the in_dim_node of the Embedding? #28

Win7ery opened this issue Jul 29, 2024 · 0 comments
Win7ery commented Jul 29, 2024

net_params['in_dim'] = torch.unique(dataset.train[0][0].ndata['feat'], dim=0).size(0)  # node_dim (feat is an integer) # code in main
......
in_dim_node = net_params['in_dim']  # node_dim (feat is an integer) # graphtransformer/nets/SBMs_node_classification/graph_transformer_net.py, line 19
......
self.embedding_h = nn.Embedding(in_dim_node, hidden_dim)  # node feat is an integer # graphtransformer/nets/SBMs_node_classification/graph_transformer_net.py, line 45

Why is the number of distinct node feature values used as the in_dim_node of the embedding? What do the node features in the SBM datasets represent? When I use new node features, I get an error because their values are greater than in_dim_node.

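A minimal sketch of what is going on, assuming the SBM setup in this repo: each node feature is a single integer (a community/cluster id), so `nn.Embedding` is used as a lookup table. Its first argument must be the number of distinct integer values, and every feature value must lie in `[0, in_dim_node - 1]`; a larger value is out of range for the table, which would explain the error described above. The tensors here are toy values, not the actual SBM data.

```python
import torch
import torch.nn as nn

# Toy integer node features, standing in for ndata['feat'] in the SBM datasets.
feat = torch.tensor([0, 1, 1, 2, 0])

# Mirrors the line in main: count the distinct feature values.
in_dim_node = torch.unique(feat, dim=0).size(0)  # 3 distinct values: {0, 1, 2}
hidden_dim = 8

# Lookup table: rows 0 .. in_dim_node-1, each a learned hidden_dim vector.
embedding_h = nn.Embedding(in_dim_node, hidden_dim)
h = embedding_h(feat)  # shape: (num_nodes, hidden_dim) = (5, 8)

# A feature value >= in_dim_node has no row in the table, so indexing fails.
try:
    embedding_h(torch.tensor([3]))
except IndexError:
    print("feature value 3 is out of range for an Embedding of size 3")
```

So `in_dim_node` is not the number of nodes; it is the size of the vocabulary of integer feature values. If you switch to features with a larger range, you must enlarge the embedding (or remap the features to a contiguous `0..K-1` range).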
