Hi, I reimplemented a Seq2Edit model myself using a Transformer architecture; it currently supports only single-round correction. During training I found that the accuracy on non-KEEP labels is quite low. I checked the inputs, labels, and d_tags, and they all match the original code. I suspect the problem lies in the training procedure: the original code's training loop is heavily wrapped by AllenNLP, and I worry that differences in training settings are degrading performance. (On my own dataset, the original code reaches 80%+, but my reimplementation only reaches around 30%.)
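One thing worth double-checking in a reimplementation is how the non-KEEP accuracy itself is computed: if padding positions or KEEP positions leak into the metric, the number can be badly skewed in either direction. Below is a minimal sketch of a masked non-KEEP accuracy in plain PyTorch. The function name, the assumption that `$KEEP` has label id 0, and the use of `-100` as the ignored-label value are all hypothetical choices for illustration, not taken from the original code:

```python
import torch

def nonkeep_accuracy(logits, labels, attention_mask, keep_id=0, ignore_id=-100):
    """Accuracy over non-KEEP token positions only.

    logits:         (batch, seq_len, num_labels) raw scores
    labels:         (batch, seq_len) gold label ids; ignore_id marks
                    positions excluded from the loss (padding/subwords)
    attention_mask: (batch, seq_len) 1 for real tokens, 0 for padding
    keep_id:        assumed id of the $KEEP label (hypothetical: 0)
    """
    preds = logits.argmax(dim=-1)
    # Count only real, non-ignored, non-KEEP positions.
    valid = attention_mask.bool() & (labels != ignore_id) & (labels != keep_id)
    correct = (preds == labels) & valid
    return correct.sum().item() / max(valid.sum().item(), 1)
```

If this masked metric already agrees with what you measure, the gap is more likely in the training setup itself; common suspects when leaving AllenNLP behind are the learning-rate schedule, the "cold steps" phase that freezes the encoder early in training, and whether the loss is averaged only over non-ignored positions.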
You can refer to this repo: https://github.com/cofe-ai/fast-gector, which is written in native PyTorch and DeepSpeed.