
Cross Attention Computation in LinearSelfAttention() #81

Open
goutamyg opened this issue Jul 31, 2023 · 0 comments
Comments

goutamyg commented Jul 31, 2023

Hi,

I have a question regarding the computation of cross-attention in https://github.com/apple/ml-cvnets/blob/main/cvnets/layers/linear_attention.py#L163

Here the Query and Key are generated from the input x_prev, while the Value is generated from the input x. However, in the usual convention, the Query is generated from one input and the other input is used to generate both the Key and the Value, as illustrated for example here: https://vaclavkosar.com/images/cross-attention-in-transformer-architecture.png
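To make sure I am reading the code correctly, here is a rough PyTorch sketch of what the cross-attention path appears to compute. The q_proj/k_proj/v_proj modules below are made-up stand-ins purely for illustration; as far as I can tell, the actual layer splits the weights of a single fused qkv_proj across the two inputs instead.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def linear_cross_attention(x, x_prev, q_proj, k_proj, v_proj):
    # Query (1 channel) and Key (d channels) come from x_prev ...
    query = q_proj(x_prev)                      # [B, 1, P, N]
    key = k_proj(x_prev)                        # [B, d, P, N]
    # ... while Value (d channels) comes from x.
    value = v_proj(x)                           # [B, d, P, N]

    # Linear attention: softmax over the token axis of the single-channel
    # query gives per-token scores; these re-weight the key into one global
    # context vector, which then modulates the value.
    context_scores = F.softmax(query, dim=-1)                          # [B, 1, P, N]
    context_vector = (key * context_scores).sum(dim=-1, keepdim=True)  # [B, d, P, 1]
    return F.relu(value) * context_vector.expand_as(value)             # [B, d, P, N]

# Toy usage with d = 32 channels, P pixels per patch, N patches
d, P, N = 32, 4, 64
q_proj = nn.Conv2d(d, 1, kernel_size=1)
k_proj = nn.Conv2d(d, d, kernel_size=1)
v_proj = nn.Conv2d(d, d, kernel_size=1)
x, x_prev = torch.randn(2, d, P, N), torch.randn(2, d, P, N)
out = linear_cross_attention(x, x_prev, q_proj, k_proj, v_proj)  # [2, d, P, N]
```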

Can you please help me understand the idea behind your implementation of cross-attention?
