Inconsistency between number of parameters #3

Open
razvanc92 opened this issue May 4, 2020 · 3 comments

Comments

@razvanc92

I ran your implementation with a slightly different configuration (a different dataset), and with num_nodes: 87 and enc_input_dim: 1 your implementation has 223169 parameters, while the original implementation with the same configuration has 371392. Do you have any idea what the reason could be?

@xlwang233
Owner

Hi, razvanc92
The reason for this inconsistency is that I only implemented the "filter_type=laplacian" case, while the original implementation handles all three cases: "laplacian", "random_walk", and "dual_random_walk". In the "laplacian" case the length of self._support is 1, whereas in the "dual_random_walk" case it is 2.
Thank you for pointing this issue out. It actually took me a while to track down the cause, and I'll make an update to address it.
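
For context, here is a minimal sketch of how the support list could be built for each filter type. The helper names are hypothetical and only approximate the original DCRNN preprocessing (the "laplacian" branch uses a plain normalised Laplacian rather than the scaled Laplacian from the paper); the point is just that the length of the list differs.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import laplacian

def random_walk_matrix(adj):
    # D^-1 W: row-normalised transition matrix of the (weighted) graph.
    deg = np.asarray(adj.sum(axis=1)).flatten()
    deg_inv = np.where(deg > 0, 1.0 / deg, 0.0)
    return sp.diags(deg_inv) @ sp.csr_matrix(adj)

def build_supports(adj, filter_type):
    # Hypothetical helper: only the *length* of the returned list matters here.
    if filter_type == "laplacian":
        return [laplacian(sp.csr_matrix(adj), normed=True)]   # 1 support
    if filter_type == "random_walk":
        return [random_walk_matrix(adj).T]                     # 1 support
    if filter_type == "dual_random_walk":
        return [random_walk_matrix(adj).T,
                random_walk_matrix(adj.T).T]                   # 2 supports
    raise ValueError(f"unknown filter_type: {filter_type}")

adj = np.random.rand(87, 87)  # e.g. num_nodes = 87 as in the report above
print(len(build_supports(adj, "laplacian")))         # 1
print(len(build_supports(adj, "dual_random_walk")))  # 2
```

Since the graph-convolution weight matrices grow with the number of supports, the dual-random-walk configuration ends up with considerably more parameters, which is consistent with the gap reported above.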

@xlwang233
Owner

xlwang233 commented May 6, 2020

Hi, @razvanc92
I have modified the code. Now, when testing with the METR-LA dataset, the model produces exactly the same number of parameters as the original implementation does.
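
For anyone who wants to verify this locally, here is a quick way to count trainable parameters in a PyTorch model (the `nn.Linear` below is only a placeholder; substitute the instantiated DCRNN model):

```python
import torch
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    # Total number of trainable parameters.
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Placeholder model; substitute the instantiated DCRNN here.
model = nn.Linear(10, 5)
print(count_parameters(model))  # 55 for this toy example
```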

@razvanc92
Author

@xlwang233 Thank you, I'll take a look asap. There are also a few other things that could be improved or fixed. At the moment the code cannot run on CPU because of some hard-coded .cuda() calls; these could be replaced with .to(device). Also, in train.py the data loader dataset is hard-coded. A sketch of the device-agnostic pattern is below.
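
A minimal sketch of the device-agnostic pattern being suggested (the small `nn.Linear` model and tensor are only placeholders):

```python
import torch
import torch.nn as nn

# Pick the GPU when available, fall back to the CPU otherwise,
# instead of calling .cuda() unconditionally.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(4, 2).to(device)      # was: model.cuda()
x = torch.randn(8, 4, device=device)    # was: x.cuda()
y = model(x)
```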
