
Regarding the implementation of the attention computation #1

Open
xbingsun opened this issue Nov 10, 2022 · 0 comments

Comments

@xbingsun

Hello, in the paper, AMNet computes the attention weights between a node and the filter vectors with the following formula:
$$w_k^i = q^T \cdot \tanh(W^Z {z_k^i}^T + W^X x_i)$$
However, the code implements it according to the following formula instead:
$$w_k^i = \tanh(W^Z {z_k^i}^T)^T \cdot \tanh(W^X x_i)$$
Since these two formulations are not equivalent, could you explain why this detail does not match the paper?
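To make the discrepancy concrete, here is a minimal PyTorch sketch of the two scoring variants. All tensor names and dimensions (`d_z`, `d_x`, `d_a`) are hypothetical, not taken from the AMNet code, and the transposes from the formulas are implicit since the vectors are 1-D:

```python
import torch

# Hypothetical dimensions: d_z for the filter output z_k^i,
# d_x for the raw feature x_i, d_a for the attention space.
d_z, d_x, d_a = 16, 16, 8

W_Z = torch.randn(d_a, d_z)  # projects the filtered signal z_k^i
W_X = torch.randn(d_a, d_x)  # projects the raw feature x_i
q = torch.randn(d_a)         # shared attention query vector

z_ki = torch.randn(d_z)      # output of filter k for node i
x_i = torch.randn(d_x)       # raw feature of node i

# Variant 1 (paper): additive attention,
# w_k^i = q^T tanh(W^Z z_k^i + W^X x_i)
w_paper = q @ torch.tanh(W_Z @ z_ki + W_X @ x_i)

# Variant 2 (code): multiplicative form,
# w_k^i = tanh(W^Z z_k^i)^T tanh(W^X x_i)
w_code = torch.tanh(W_Z @ z_ki) @ torch.tanh(W_X @ x_i)

print(w_paper.item(), w_code.item())  # generally different values
```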
