
Question about making predictions with a trained model #56

Open
zhengguanyu opened this issue Feb 22, 2022 · 2 comments

Comments

@zhengguanyu

Hello, I'd like to ask you a question.

@zhengguanyu
Author

After setting status to test, is there a required format for the test_file?
I deleted everything related to gold_result in the evaluate function, and that works on a test set that contains the true labels. The format is as follows:

测 O
测 B-XXX
测 E-XXX

In other words, with a small tweak to the original evaluate function, prediction also works.
But for a test set I made on my own, the result comes out empty.
I set test_file to the following format:



But the output pred_results is empty.
I then mimicked the training-set format instead:

测 O
测 O
测 O

The output pred_results is still empty.

Could it be that the code simply doesn't provide this functionality?
If it does, please point out what I'm missing. Thanks.
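In case it is useful, here is a minimal sketch (not the repo's own code) of one way to prepare an unlabeled test_file: write each character with a placeholder O label in the same two-column format as the labeled data, so the existing reader accepts the file and the predicted tags simply replace the placeholders. The output path is an assumption; adjust it to wherever your config expects test_file.

# Minimal sketch: convert raw sentences into the one-character-per-line
# "char label" format with dummy O labels, so an unlabeled file passes the
# same reader as the labeled test set. The path is a placeholder.
def write_unlabeled_test_file(sentences, path="data/unlabeled_test.char"):
    with open(path, "w", encoding="utf-8") as f:
        for sent in sentences:
            for ch in sent:
                f.write(ch + " O\n")   # placeholder label, ignored at prediction time
            f.write("\n")              # blank line separates sentences

write_unlabeled_test_file(["今天天气不错", "测试一下模型"])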

@Llin1785361283

@zhengguanyu Hello, when I load the model for testing I always get the following error. Have you run into anything similar?
RuntimeError: Error(s) in loading state_dict for GazLSTM:
size mismatch for NERmodel.lstm.weight_ih_l0: copying a param of torch.Size([1200, 250]) from checkpoint, where the shape is torch.Size([800, 250]) in current model.
size mismatch for NERmodel.lstm.weight_hh_l0: copying a param of torch.Size([1200, 300]) from checkpoint, where the shape is torch.Size([800, 200]) in current model.
size mismatch for NERmodel.lstm.bias_ih_l0: copying a param of torch.Size([1200]) from checkpoint, where the shape is torch.Size([800]) in current model.
size mismatch for NERmodel.lstm.bias_hh_l0: copying a param of torch.Size([1200]) from checkpoint, where the shape is torch.Size([800]) in current model.
size mismatch for NERmodel.lstm.weight_ih_l0_reverse: copying a param of torch.Size([1200, 250]) from checkpoint, where the shape is torch.Size([800, 250]) in current model.
size mismatch for NERmodel.lstm.weight_hh_l0_reverse: copying a param of torch.Size([1200, 300]) from checkpoint, where the shape is torch.Size([800, 200]) in current model.
size mismatch for NERmodel.lstm.bias_ih_l0_reverse: copying a param of torch.Size([1200]) from checkpoint, where the shape is torch.Size([800]) in current model.
size mismatch for NERmodel.lstm.bias_hh_l0_reverse: copying a param of torch.Size([1200]) from checkpoint, where the shape is torch.Size([800]) in current model.
size mismatch for hidden2tag.weight: copying a param of torch.Size([7, 600]) from checkpoint, where the shape is torch.Size([7, 400]) in current model.
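That mismatch usually means the checkpoint was trained with a larger LSTM hidden size than the model the current config builds: for an nn.LSTM, weight_ih_l0 has 4 * hidden_size rows, so 1200 in the checkpoint corresponds to hidden size 300 while 800 in the current model corresponds to 200 (and hidden2tag's 600 vs 400 input features match the bidirectional concatenation). A hedged way to confirm this is to inspect the saved state_dict directly; the checkpoint filename below is an assumption:

import torch

# Load the saved state_dict on CPU and print the LSTM / output-layer shapes.
# "saved_model.model" is a placeholder; use the path you pass at test time.
state = torch.load("saved_model.model", map_location="cpu")
for name, tensor in state.items():
    if "lstm" in name or "hidden2tag" in name:
        print(name, tuple(tensor.shape))

# weight_ih_l0 has 4 * hidden_size rows, so this recovers the training-time value.
rows = state["NERmodel.lstm.weight_ih_l0"].shape[0]
print("hidden size used at training time:", rows // 4)   # 1200 // 4 == 300

Once the hidden size in your test-time config matches the value the checkpoint was trained with, load_state_dict should succeed.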
