IndexError: too many indices for tensor of dimension 1 #37
Comments
Even if there's only one sentence, batch_size would be 1 and the function handles this properly. Do you have the correct data format?
Thanks for your reply. I followed the instructions to preprocess my own dataset. I am sorry, maybe I didn't describe the problem clearly.
Thanks for the explanation. I understand the issue now. If you are working on a dataset that only has 1-sentence summary outputs, this repository might not be the best choice for you. Our work's main improvement is on long (multi-sentence) summary generation. My plan for this repo is to only support the CNN/DM dataset, so I won't change the code for customized datasets. You are free to make a fork and write new code for your customization. Sorry for the inconvenience. (I think an ad-hoc fix is to pad an extra dummy sentence and mask it out before loss computation.) However, this could also be a bug if we get very unlucky: say there are some examples in CNN/DM that have a 1-sentence summary, and we happen to sample a mini-batch in which all of them are 1-sentence; that would result in the same error. I will need to check on this. Unfortunately, due to my busy schedule, this is a lower priority. In the meantime I will keep this issue open for discussion.
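For reference, here is a minimal sketch of the dummy-sentence workaround suggested above. This is not the repository's actual code; the tensor shapes, `PAD_ID`, and function names are assumptions made only to illustrate the "pad an extra dummy sentence and mask it out before loss computation" idea:

```python
import torch
import torch.nn.functional as F

# Assumed shapes (not the repo's real ones):
#   logits:  (batch, num_sents, seq_len, vocab)
#   targets: (batch, num_sents, seq_len)
PAD_ID = -100  # matches F.cross_entropy's default ignore_index

def pad_dummy_sentence(logits, targets):
    """If the batch has only one target sentence, append a dummy sentence
    whose targets are all PAD_ID so it contributes nothing to the loss."""
    if logits.size(1) == 1:
        dummy_logits = torch.zeros_like(logits)           # (batch, 1, seq_len, vocab)
        dummy_targets = torch.full_like(targets, PAD_ID)  # (batch, 1, seq_len)
        logits = torch.cat([logits, dummy_logits], dim=1)
        targets = torch.cat([targets, dummy_targets], dim=1)
    return logits, targets

def masked_loss(logits, targets):
    # Flatten everything; ignore_index drops the dummy positions from the loss.
    vocab = logits.size(-1)
    return F.cross_entropy(logits.reshape(-1, vocab),
                           targets.reshape(-1),
                           ignore_index=PAD_ID)
```

The extra sentence keeps any code that indexes along the sentence dimension from seeing a degenerate single-sentence case, while `ignore_index` ensures the padding has no effect on the gradient.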
Thanks for the explanation and advice! I'll give it a try.
Hey @nefujiangping, I am currently running into the same issue. Have you thought of any solutions for this? |
Hello @nickluijtgaarden, sorry, I haven't fixed this yet (and I am not using this code at the moment). You can try the solution mentioned above.
Hey guys, there is an error at batcher.py#L119 when the target summary has only one sentence.
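For context, the exception itself is just PyTorch refusing a two-index lookup on a 1-D tensor. A minimal reproduction, independent of the repo's code:

```python
import torch

t = torch.zeros(5)   # a tensor of dimension 1
print(t[0])          # a single index works fine
try:
    t[0, 1]          # two indices on a 1-D tensor
except IndexError as e:
    print(e)         # "too many indices for tensor of dimension 1"
```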