How to visualize the attention? #29
Just propagate the attention (see `BMT/model/multihead_attention.py`, line 19 at commit d45ad8f) to the output of the modules and apply an aggregation function of your liking.
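To make the suggestion concrete, here is a minimal sketch (not the actual BMT code) of scaled dot-product attention that returns its weight matrix alongside the output, followed by one possible aggregation — averaging the per-head attention maps. All names and shapes here are illustrative assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_weights(q, k, v):
    # Scaled dot-product attention that also returns the weight matrix,
    # mirroring the idea of propagating the attention variable out of
    # the module instead of discarding it.
    d_k = q.shape[-1]
    weights = softmax(q @ k.swapaxes(-2, -1) / np.sqrt(d_k))
    return weights @ v, weights

rng = np.random.default_rng(0)
# Toy tensors standing in for one batch element: (heads, seq_len, d_k)
q = rng.normal(size=(4, 6, 8))
k = rng.normal(size=(4, 6, 8))
v = rng.normal(size=(4, 6, 8))

out, attn = attention_with_weights(q, k, v)  # attn: (heads, seq, seq)

# Example aggregation: average the per-head maps into one (seq, seq) map,
# which can then be visualized, e.g. as a heatmap.
agg = attn.mean(axis=0)
```

Each row of `attn` sums to 1, so the aggregated map can be read as how much each query position attends to each key position.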
Thank you. I'll try that.
Sorry, I didn't quite catch your meaning. Could you explain it in more detail? In which file should this line of code be added?
I meant that you will need to add this variable to the output of each module.
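In other words, every module that wraps the attention layer has to return the attention variable alongside its normal output, so it reaches the top level. A minimal sketch of that pattern (the class names here are hypothetical, not the actual BMT classes):

```python
class AttentionLayer:
    """Stand-in for an attention module that exposes its weights."""

    def __call__(self, x):
        # Placeholder 2x2 attention matrix; a real layer would compute it
        attn = [[0.5, 0.5], [0.5, 0.5]]
        out = x
        return out, attn  # return the attention alongside the output


class EncoderBlock:
    """Stand-in for a wrapping module that forwards the attention up."""

    def __init__(self):
        self.attn_layer = AttentionLayer()

    def __call__(self, x):
        out, attn = self.attn_layer(x)
        # ... other sub-layers would go here ...
        return out, attn  # propagate the attention one level up


block = EncoderBlock()
out, attn = block([1.0, 2.0])
```

Repeating this change at every nesting level is what "add this variable to the output of each module" amounts to.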
Thank you. I will continue to study this issue.
Do you have a way to visualize the attention mechanism? I want to know what exactly the attention mechanism focuses on.