I've run into cases where the chain model decodes correctly while I'm still talking, but once I stop, the final result changes for the worse (e.g., some earlier words get deleted). How can I improve this? Any advice is appreciated.
Are you using the "big-lm-const-arpa" property? The lattice generated with the smaller LM is rescored with the "big" LM when decoding finishes, and that sometimes changes the result noticeably.
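For context, "big-lm-const-arpa" is a property of the GStreamer online decoder element, typically set in the worker config when running something like kaldi-gstreamer-server. A minimal sketch of what that section might look like (the surrounding keys and all paths here are illustrative placeholders, not taken from this thread):

```yaml
# Hypothetical excerpt of a worker config for an online Kaldi decoder.
# All paths are placeholders.
decoder:
    model: models/final.mdl
    fst: models/HCLG.fst          # decoding graph built with the small LM
    word-syms: models/words.txt
    # Final-pass rescoring: when this is set, the lattice produced with
    # the small LM is rescored with this ConstArpaLm after the utterance
    # ends, which can change the final hypothesis relative to the
    # partial results (including dropping words). Comment it out to keep
    # the final result consistent with the partials.
    big-lm-const-arpa: models/G.carpa
```

Note the trade-off: big-LM rescoring usually improves final accuracy, but partial results only ever come from the small LM, so some divergence at utterance end is expected whenever it is enabled.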