Hey guys,
Thanks for the repo! When I am evaluating the model, `input_sentence_i` does not seem to be influenced by `input_sentence_i-1`. Is it possible to run the model in inference mode so that it retains memory of the previous sentences entered?
Thanks
Hi, the vanilla Transformer (which this repository implements) does not incorporate information across data points, so the closest thing you can do with this model is to concatenate multiple input sentences and treat them as a single input. Recent work by Kossen et al. applies self-attention across data points; you might want to have a look!
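For what it's worth, here is a minimal sketch of that concatenation workaround. It assumes a hypothetical `translate(model, text)` inference helper; this repo's actual entry point, separator handling, and context size may differ:

```python
from collections import deque

# Keep the last few sentences; maxlen=3 is an arbitrary assumption, bounded
# in practice by the model's maximum sequence length.
context = deque(maxlen=3)

def infer_with_context(model, translate, sentence):
    """Fold previous sentences into the current input so the model can
    attend to them; the model itself stays stateless across calls."""
    context.append(sentence)
    # Joining with " " is an assumption; a dedicated separator token may
    # work better if the vocabulary provides one.
    combined = " ".join(context)
    return translate(model, combined)
```

Since the model has no state between calls, any "memory" has to be packed into the input itself like this, and older sentences eventually fall out of the window once the sequence-length budget is spent.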