
I don't understand the Attention block size #3

Open
DBpackage opened this issue Nov 23, 2023 · 0 comments

Comments

@DBpackage

Hi, I appreciate your nice work!

I'm confused about the model shapes.

[Image: captured from your paper]

[Image: captured from your model.py]

Since you said that f is the number of filters in the last 1D-CNN layer, f here should be self.conv * 4.

Then I think the Wa size should be f x f, not 2f x f.
Am I misunderstanding it, or is it just a typo?
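To make the shape question concrete, here is a minimal PyTorch sketch. All tensor names and sizes below are my own illustration, not taken from model.py: a bilinear score tanh(D @ Wa @ P^T) between two f-dimensional feature sets only type-checks when Wa is f x f, whereas a 2f x f Wa only makes sense if the two feature sets are concatenated along the feature axis first.

```python
import torch

# Hypothetical shapes; f = self.conv * 4 is the channel count of the
# last 1D-CNN layer, as described above (e.g. self.conv = 40 -> f = 160).
batch, n_drug, n_prot, f = 2, 50, 100, 160

drug_feats = torch.randn(batch, n_drug, f)  # output of one CNN branch
prot_feats = torch.randn(batch, n_prot, f)  # output of the other branch

# Bilinear attention: scores = tanh(D @ Wa @ P^T).
# For this product to be defined, Wa must be (f, f):
Wa = torch.randn(f, f)
scores = torch.tanh(drug_feats @ Wa @ prot_feats.transpose(1, 2))
print(scores.shape)  # torch.Size([2, 50, 100])

# A (2f, f) Wa only type-checks if the two branches are concatenated
# per pair first, i.e. [d_i ; p_j] has dimension 2f:
Wa_cat = torch.randn(2 * f, f)
pairs = torch.cat(
    [drug_feats.unsqueeze(2).expand(-1, -1, n_prot, -1),
     prot_feats.unsqueeze(1).expand(-1, n_drug, -1, -1)],
    dim=-1)                          # (batch, n_drug, n_prot, 2f)
hidden = torch.tanh(pairs @ Wa_cat)  # (batch, n_drug, n_prot, f)
print(hidden.shape)  # torch.Size([2, 50, 100, 160])
```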

Sincerely,
