Coverage Mechanism and Coverage Loss #180
There are no plans to add these features, but contributions are welcome. It is presently a bit complicated to customize the RNN decoder as we use the high-level API.
@wanghm92 In case you are not aware, OpenNMT-py does support a training option called "coverage_attn", which I have used to solve a problem somewhat similar to yours. My use case is learning a strict token-by-token mapping from the source sequence to the target sequence, which does not allow any unwanted repetition or additional/missing tokens during translation. This is hard to enforce with OpenNMT-tf, but so far OpenNMT-py seems to work well for my purposes.
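For context, a coverage-aware attention step accumulates the attention weights from previous decoding steps into a coverage vector and feeds that vector back into the attention energies, discouraging the model from re-attending to source positions it has already covered (Tu et al., 2016). Below is a minimal sketch, assuming single-example tensors and hypothetical parameter names (`W_h`, `W_s`, `w_c`, `v`); it is illustrative only and not OpenNMT-py's actual implementation:

```python
import torch

def coverage_attention_step(dec_state, enc_outputs, coverage, W_h, W_s, w_c, v):
    """One additive attention step with a coverage term.

    enc_outputs: (src_len, hidden), dec_state: (hidden,), coverage: (src_len,)
    W_h, W_s: (hidden, attn_dim), w_c: (attn_dim,), v: (attn_dim,)
    """
    # The coverage vector (sum of past attention weights) enters the attention
    # energy, so already-covered source positions get adjusted scores.
    energy = torch.tanh(enc_outputs @ W_h + dec_state @ W_s
                        + coverage.unsqueeze(1) * w_c)   # (src_len, attn_dim)
    attn = torch.softmax(energy @ v, dim=0)              # (src_len,)
    coverage = coverage + attn                           # accumulate for the next step
    return attn, coverage
```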
@guillaumekln @kaihuchen Thanks a lot for the replies!
@wanghm92
@kaihuchen I see. I'm not sure if the developers forgot to delete the 'not supported' note or if it is still under development. I would appreciate a clarification from the developers @guillaumekln if possible.
For any query about OpenNMT-py, please open an issue in the dedicated repository. Thanks.
I see this discussion happened three years ago. Are there any plans to work on these features at the moment? |
There is no plan to work on this at the moment, but I would accept a PR adding these features. |
May I ask if there is any plan to add the coverage attention mechanism (https://arxiv.org/pdf/1601.04811.pdf) and coverage loss (https://arxiv.org/pdf/1704.04368.pdf) to the decoder, as these could potentially help alleviate the repetition problem in generation?
Or, any hints on a quick implementation? Thanks!
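For what it's worth, the coverage loss from the second paper (See et al., 2017) is straightforward to compute once the per-step attention distributions are available: at each decoding step it penalizes the overlap between the current attention and the coverage accumulated so far, covloss_t = sum_i min(a_t[i], c_t[i]). A minimal sketch in PyTorch follows; the function name, the per-example shapes, and the averaging over target length are assumptions, not OpenNMT code:

```python
import torch

def coverage_loss(attn_history, lambda_cov=1.0):
    """attn_history: (tgt_len, src_len) attention distributions for one example."""
    coverage = torch.zeros(attn_history.size(1))
    loss = torch.tensor(0.0)
    for attn in attn_history:                       # attn: (src_len,)
        # Penalize re-attending to source positions that are already covered.
        loss = loss + torch.minimum(attn, coverage).sum()
        coverage = coverage + attn                  # update coverage with this step
    return lambda_cov * loss / attn_history.size(0)
```

In training, this term would be added to the standard cross-entropy loss with a small weight (lambda_cov), as described in the paper.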