Hello, I am also taking part in this competition. Studying your approach has been very instructive, and I would like to ask you a question #3

pbz123: I don't quite understand the part where you replaced max pooling with attention. Could you explain the specific operations in that part?
m13021933043: Hi. I also based the RCNN part on others' work. The original version does max pooling over the second (sequence) dimension, but in my experiments that didn't work well. Instead, I used a dot product to obtain a probability distribution and then multiplied the vectors by that distribution to get the attention result. This gave some improvement, though not a large one.
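In code, the idea is roughly the following. This is a minimal PyTorch sketch of dot-product attention replacing max pooling over the sequence dimension; the function name, tensor shapes, and the source of the query vector are illustrative assumptions, not taken from the actual repository:

```python
import torch
import torch.nn.functional as F

def dot_product_attention_pool(h, query):
    """Pool per-token vectors with dot-product attention instead of max pooling.

    h:     (batch, seq_len, hidden) per-token feature vectors
    query: (batch, hidden)          a query vector (e.g. learned, or pooled from h)
    """
    # Dot product of each token vector with the query -> (batch, seq_len)
    scores = torch.bmm(h, query.unsqueeze(2)).squeeze(2)
    # Softmax turns the scores into a probability distribution over positions
    probs = F.softmax(scores, dim=1)
    # Weighted sum of the token vectors is the attention result -> (batch, hidden)
    return torch.bmm(probs.unsqueeze(1), h).squeeze(1)

# For comparison, the max pooling this replaces would be: h.max(dim=1).values
```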
pbz123: Thanks for your answer. So is the matrix fed into attention just the same sentence's outputs from the RCNN and the LSTM concatenated directly, or is there some special method for the concatenation?
m13021933043: My concatenation works like this: BERT produces a matrix A, A is passed through a bidirectional LSTM to get B, and the concatenation of A and B is fed into attention.
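A rough PyTorch sketch of that pipeline might look like the following. The module names, hidden sizes, and the learned query vector are assumptions for illustration, and the final attention step reuses the dot-product pooling from the earlier sketch:

```python
import torch
import torch.nn as nn

class BertLstmConcat(nn.Module):
    """Sketch of the described pipeline: BERT output A -> BiLSTM output B,
    then dot-product attention over the concatenation [A; B]."""

    def __init__(self, bert, hidden=768, lstm_hidden=384):
        super().__init__()
        self.bert = bert  # e.g. a transformers.BertModel (assumption)
        self.lstm = nn.LSTM(hidden, lstm_hidden, batch_first=True,
                            bidirectional=True)
        # Learned query vector for attention over [A; B] (assumption)
        self.query = nn.Parameter(torch.randn(hidden + 2 * lstm_hidden))

    def forward(self, input_ids, attention_mask):
        # A: (batch, seq_len, hidden) token-level BERT features
        a = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        # B: (batch, seq_len, 2 * lstm_hidden) BiLSTM over A
        b, _ = self.lstm(a)
        # Concatenate along the feature dimension -> (batch, seq_len, hidden + 2*lstm_hidden)
        ab = torch.cat([a, b], dim=-1)
        # Dot-product attention pooling, as in the previous sketch
        scores = ab @ self.query                      # (batch, seq_len)
        probs = torch.softmax(scores, dim=1)
        return (probs.unsqueeze(-1) * ab).sum(dim=1)  # (batch, hidden + 2*lstm_hidden)
```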