About the attention function #23

Open
TitleZ99 opened this issue Dec 9, 2022 · 1 comment

Comments

TitleZ99 commented Dec 9, 2022

Thanks for this wonderful work. I have a question about the code shown in the attached picture. When sr_ratio > 1, you apply the conv first, add the result to the original v, and then compute the attention function. But when sr_ratio = 1, you compute the attention function first and then add v' (the output of the conv) afterwards. I am wondering why.
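
To make the question concrete, here is a condensed sketch of the two orderings as I understand them (PyTorch-style; all names such as `SRAttentionSketch`, `sr`, and `local_conv` are my own placeholders, not the exact identifiers from this repo):

```python
import torch
import torch.nn as nn

class SRAttentionSketch(nn.Module):
    """Illustrative spatial-reduction attention with a local-enhancement conv.
    Simplified (no dropout / bias options) to show only the ordering in question."""

    def __init__(self, dim, num_heads=8, sr_ratio=1):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.sr_ratio = sr_ratio
        self.q = nn.Linear(dim, dim)
        self.kv = nn.Linear(dim, dim * 2)
        self.proj = nn.Linear(dim, dim)
        if sr_ratio > 1:
            # strided conv that spatially reduces the key/value tokens
            self.sr = nn.Conv2d(dim, dim, kernel_size=sr_ratio, stride=sr_ratio)
            self.norm = nn.LayerNorm(dim)
        # depth-wise conv used for local enhancement of v
        self.local_conv = nn.Conv2d(dim, dim, kernel_size=3, padding=1, groups=dim)

    def forward(self, x, H, W):
        B, N, C = x.shape  # N == H * W
        q = self.q(x).reshape(B, N, self.num_heads, self.head_dim).transpose(1, 2)

        if self.sr_ratio > 1:
            # reduce the spatial size of k/v first
            x_ = x.transpose(1, 2).reshape(B, C, H, W)
            x_ = self.norm(self.sr(x_).reshape(B, C, -1).transpose(1, 2))
            kv = self.kv(x_).reshape(B, -1, 2, self.num_heads, self.head_dim).permute(2, 0, 3, 1, 4)
            k, v = kv[0], kv[1]
            # ordering A: the local conv is added to v BEFORE the attention product
            h, w = H // self.sr_ratio, W // self.sr_ratio
            v_img = v.transpose(1, 2).reshape(B, -1, C).transpose(1, 2).reshape(B, C, h, w)
            v = v + self.local_conv(v_img).reshape(B, self.num_heads, self.head_dim, -1).transpose(-1, -2)
            attn = (q @ k.transpose(-2, -1)) * self.scale
            out = (attn.softmax(dim=-1) @ v).transpose(1, 2).reshape(B, N, C)
        else:
            kv = self.kv(x).reshape(B, -1, 2, self.num_heads, self.head_dim).permute(2, 0, 3, 1, 4)
            k, v = kv[0], kv[1]
            attn = (q @ k.transpose(-2, -1)) * self.scale
            out = (attn.softmax(dim=-1) @ v).transpose(1, 2).reshape(B, N, C)
            # ordering B: the local conv of v is added AFTER the attention output
            v_img = v.transpose(1, 2).reshape(B, N, C).transpose(1, 2).reshape(B, C, H, W)
            out = out + self.local_conv(v_img).reshape(B, C, N).transpose(1, 2)
        return self.proj(out)
```

As far as I can tell the two orderings are not equivalent: when the convolved v is added before attention, the local term is re-weighted by the attention map, whereas adding it after attention acts as a per-token residual that bypasses the attention weights entirely. Is this difference intentional?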

Sent from PPHub


TitleZ99 commented Dec 9, 2022

[screenshot attached: the attention function code referred to in the question above]
