
bug #2

Closed
m3ditatioin opened this issue May 19, 2021 · 8 comments
Labels
invalid This doesn't seem right

Comments

@m3ditatioin

I have been trying to run the code for a while, but it always throws the error below. Can you tell me how to fix it?

alpha = softmax(alpha, edge_index_i,size_i)
RuntimeError: softmax() Expected a value of type 'Optional[Tensor]' for argument 'ptr' but instead found type 'int'.
Position: 2
Value: 864
Declaration: softmax(Tensor src, Tensor? index=None, Tensor? ptr=None, int? num_nodes=None, int dim=0) -> (Tensor)
Cast error details: Unable to cast Python instance to C++ type (compile in debug mode for details)

@d-ailin
Owner

d-ailin commented May 19, 2021

Hi, would you mind sharing a more detailed (or complete) traceback log? Thanks.

@iDestro

iDestro commented Jun 8, 2021

Thank you for open-sourcing the code. I have the same problem as m3ditatioin.

Traceback (most recent call last):
File "D:/迅雷下载/GDN/GDN/main.py", line 259, in <module>
main.run()
File "D:/迅雷下载/GDN/GDN/main.py", line 109, in run
self.train_log = train(self.model, model_save_path,
File "D:\迅雷下载\GDN\GDN\train.py", line 69, in train
out = model(x, edge_index).float().to(device)
File "C:\Users\Administrator.conda\envs\torch-geo\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
result = self.forward(*input, **kwargs)
File "D:\迅雷下载\GDN\GDN\models\GDN.py", line 163, in forward
gcn_out = self.gnn_layers[i](x, batch_gated_edge_index, node_num=node_num*batch_num, embedding=all_embeddings)
File "C:\Users\Administrator.conda\envs\torch-geo\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
result = self.forward(*input, **kwargs)
File "D:\迅雷下载\GDN\GDN\models\GDN.py", line 73, in forward
out, (new_edge_index, att_weight) = self.gnn(x, edge_index, embedding, return_attention_weights=True)
File "C:\Users\Administrator.conda\envs\torch-geo\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
result = self.forward(*input, **kwargs)
File "D:\迅雷下载\GDN\GDN\models\graph_layer.py", line 66, in forward
out = self.propagate(edge_index, x=x, embedding=embedding, edges=edge_index,
File "C:\Users\Administrator.conda\envs\torch-geo\lib\site-packages\torch_geometric\nn\conv\message_passing.py", line 237, in propagate
out = self.message(**msg_kwargs)
File "D:\迅雷下载\GDN\GDN\models\graph_layer.py", line 112, in message
alpha = softmax(alpha, edge_index_i, size_i)
RuntimeError: softmax() Expected a value of type 'Optional[Tensor]' for argument 'ptr' but instead found type 'int'.
Position: 2
Value: 3456
Declaration: softmax(Tensor src, Tensor? index, Tensor? ptr=None, int? num_nodes=None) -> (Tensor)
Cast error details: Unable to cast Python instance to C++ type (compile in debug mode for details)

Process finished with exit code 1

@d-ailin
Owner

d-ailin commented Jun 10, 2021

Hi, thanks for providing the log.

I think this issue is caused by a mismatched pytorch-geometric version. Our code uses version 1.5.0, and pytorch-geometric changed the softmax function's signature in 1.6.0.

Please try reinstalling pytorch-geometric 1.5.0 and rerunning. Thanks!
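For context on the version mismatch: in pytorch-geometric 1.5.x, torch_geometric.utils.softmax took (src, index, num_nodes=None), so the positional call softmax(alpha, edge_index_i, size_i) bound size_i to num_nodes. From 1.6.0, the third positional slot became ptr (an optional tensor), so the integer lands in ptr and fails the type check. A torch-free sketch of what the function computes, a softmax over values grouped by their target node, with illustrative names only (this is not the library's implementation):

```python
import math

def segment_softmax(src, index, num_nodes=None):
    """Toy stand-in for torch_geometric.utils.softmax: a softmax over the
    entries of `src` that share the same target node in `index`."""
    n = num_nodes if num_nodes is not None else max(index) + 1
    group_max = [float("-inf")] * n          # per-group max for numerical stability
    for v, i in zip(src, index):
        group_max[i] = max(group_max[i], v)
    exps = [math.exp(v - group_max[i]) for v, i in zip(src, index)]
    group_sum = [0.0] * n
    for e, i in zip(exps, index):
        group_sum[i] += e
    return [e / group_sum[i] for e, i in zip(exps, index)]

# Edges 0 and 1 point at node 0, edge 2 at node 1: each group sums to 1.
attn = segment_softmax([1.0, 2.0, 3.0], [0, 0, 1], num_nodes=2)
```

In the real library, index plays the role of edge_index_i (the target node of each edge), so the attention weights of all edges entering a node are normalized together.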

@d-ailin d-ailin closed this as completed Jun 22, 2021
@d-ailin d-ailin pinned this issue Aug 3, 2021
@d-ailin d-ailin added the invalid This doesn't seem right label Aug 3, 2021
@wjj5881005

Can you update the code to work with the latest version of pytorch-geometric?

@Sudo42b

Sudo42b commented Nov 2, 2021

@d-ailin @wjj5881005 I modified graph_layer.py as follows:

self.node_dim = 0
alpha = softmax(alpha, edge_index_i, num_nodes=size_i)
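The reason the keyword form works: since pytorch-geometric 1.6.0, the third positional parameter of softmax is ptr, so passing size_i positionally binds it to ptr, while num_nodes=size_i skips that slot. A minimal stand-in with mock signatures (not the real library code) showing the binding difference:

```python
# Mock signatures illustrating the argument binding (not the real library code).
def softmax_v15(src, index, num_nodes=None):                  # pytorch-geometric 1.5.x
    return num_nodes

def softmax_v16(src, index=None, ptr=None, num_nodes=None):   # 1.6.0 and later
    if isinstance(ptr, int):                                  # stand-in for the Tensor type check
        raise TypeError("softmax() expected Optional[Tensor] for 'ptr' but found 'int'")
    return num_nodes

size_i = 3456
assert softmax_v15(None, None, size_i) == 3456            # 3rd positional arg is num_nodes
assert softmax_v16(None, None, num_nodes=size_i) == 3456  # keyword skips the ptr slot
try:
    softmax_v16(None, None, size_i)                       # size_i lands in ptr -> error
except TypeError as err:
    print(err)
```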

@meihuameii

@iDestro Hello! Have you solved this problem?

@houchenyu

houchenyu commented Nov 18, 2021

@d-ailin @wjj5881005 modified

in graph_layer.py

self.node_dim=0
alpha = softmax(alpha, edge_index_i, num_nodes=size_i)

Thanks for your help.
But I encountered a strange problem: if I omit self.node_dim = 0, it reports an error at return x_j * alpha.view(-1, self.heads, 1); if I add it, the code works. Why is this line so important?
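A likely explanation (hedged, to be checked against the pytorch-geometric changelog): MessagePassing's default node_dim changed from 0 to -2 between these versions. node_dim tells propagate which axis of the feature tensor indexes nodes when it gathers per-edge features x_j; for a 3-D tensor of shape [num_nodes, heads, channels], axis -2 is the heads axis, so gathering along the wrong axis yields a misshaped or invalid x_j, which then surfaces at x_j * alpha.view(-1, self.heads, 1). A pure-Python illustration with made-up shapes (not taken from the repo):

```python
# Toy "tensor" of shape [num_nodes=3, heads=2, channels=1]; x[n][h][0] == n.
num_nodes, heads = 3, 2
x = [[[float(n)] for _ in range(heads)] for n in range(num_nodes)]
edge_index_j = [0, 2, 1]          # source node of each edge

def gather(t, axis, idx):
    """Select slices of a nested-list tensor along axis 0 or 1."""
    if axis == 0:
        return [t[i] for i in idx]
    return [[row[i] for i in idx] for row in t]

x_j = gather(x, 0, edge_index_j)  # node_dim=0: one feature slice per edge
assert [v[0][0] for v in x_j] == [0.0, 2.0, 1.0]

try:
    gather(x, 1, edge_index_j)    # the -2 axis here is heads (size 2), not nodes
except IndexError:
    print("gathering along the wrong axis fails or misaligns shapes")
```

Setting self.node_dim = 0 in the layer restores the axis the repo's tensors were written for.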

@itanfeng

itanfeng commented Jul 4, 2023

very good!


8 participants