How should I understand the `E_` term at line 127 of relative_transformer.py? I read the paper, and the attention score computation there doesn't seem to include this term.
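(For reference, assuming this is the TENER-style relative attention that relative_transformer.py implements, the score in the paper is, with Q_t and K_j the query and key, R_{t-j} the relative position embedding, and u, v learned biases:

```latex
A^{rel}_{t,j} = Q_t^{\top} K_j + Q_t^{\top} R_{t-j} + u^{\top} K_j + v^{\top} R_{t-j}
```

Note there is no term pairing K_j with R_{t-j}, which is what `E_` computes.)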
Right, this term isn't in the paper. We found empirically that it makes training more stable, so we added it in the newer version of the code. It can be understood as needing to know the relative position between the current key and query in order to decide the bias for that key.
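Here is a minimal sketch of how that extra term enters the score. The names (`q`, `k`, `pos_embed`, `r_r_bias`, `r_w_bias`) mirror relative_transformer.py, but the shapes and the standalone setup are illustrative assumptions, not the exact code:

```python
# Minimal sketch of the relative attention score with the extra E_ term.
# Assumes a TENER-style setup; shapes here are illustrative.
import torch

batch, n_head, seq_len, d_head = 2, 4, 5, 8
n_pos = 2 * seq_len - 1  # one embedding per relative offset -(L-1)..(L-1)

q = torch.randn(batch, n_head, seq_len, d_head)   # queries
k = torch.randn(batch, n_head, seq_len, d_head)   # keys
pos_embed = torch.randn(n_pos, d_head)            # R_{t-j}
r_r_bias = torch.randn(n_head, d_head)            # u in the paper
r_w_bias = torch.randn(n_head, d_head)            # v in the paper

# Terms that do appear in the paper:
#   AC = (Q + u) K^T   and   BD = (Q + v) R^T
AC = torch.einsum('bnqd,bnkd->bnqk', q + r_r_bias[:, None], k)
BD = torch.einsum('bnqd,ld->bnql', q + r_w_bias[:, None], pos_embed)

# The term the question asks about: E_ = K R^T, i.e. a bias for each key
# that depends on that key's relative distance to the query.
E_ = torch.einsum('bnkd,ld->bnkl', k, pos_embed)

# In the actual code, BD and E_ are re-indexed (the _shift /
# _transpose_shift helpers) so the relative-offset axis lines up with
# absolute key positions; conceptually the score is then
#   attn = AC + shift(BD) + transpose_shift(E_)
```

So `AC` and `BD` are the paper's terms, while `E_` gives each key a learned, relative-position-dependent bias, matching the explanation above.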
So that's what it is! I puzzled over it for a long time; luckily I came to GitHub and found this issue > <