Word2vec learns two matrices: call them an embedding matrix and a context matrix (although this terminology might not fit every method). The embedding matrix is the "model" returned by this package; it can be extracted with model <- train_word2vec(...) or read.vectors(...). The context matrix is (I think) just thrown away after training, which is too bad, because it can be useful in some situations.
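For anyone curious what the two matrices look like, here's a minimal sketch of skip-gram with negative sampling in NumPy (Python, since it's easiest to illustrate self-contained; the vocabulary, dimensions, and training pairs are toy assumptions, not the package's actual implementation). It shows that training updates two matrices, `W` (embedding) and `C` (context), and that implementations typically hand back only `W`:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 10, 4  # toy sizes for illustration

W = rng.normal(scale=0.1, size=(vocab_size, dim))  # embedding (input) matrix
C = rng.normal(scale=0.1, size=(vocab_size, dim))  # context (output) matrix

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, negatives, lr=0.05):
    """One SGD update for a (center, context) pair plus negative samples.

    Pushes W[center] toward C[context] (label 1) and away from the
    context rows of the negative samples (label 0).
    """
    for word, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        score = sigmoid(W[center] @ C[word])
        grad = lr * (label - score)
        w_old = W[center].copy()
        W[center] += grad * C[word]   # update embedding row
        C[word] += grad * w_old       # update context row

# a few fake training pairs over the toy vocabulary
for _ in range(200):
    center = int(rng.integers(vocab_size))
    context = (center + 1) % vocab_size
    negatives = rng.integers(vocab_size, size=2)
    sgns_step(center, context, negatives)

embedding = W        # what packages usually return as "the model"
context_vectors = C  # usually discarded after training
```

Both matrices have the same shape (vocabulary size × dimensions), which is part of why the context matrix can be useful, e.g. for scoring word/context pairs.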
Curious if you really want both or just the embedding matrix. I'd be willing to leave an issue open for the context vectors if there's a use for them.
Hey, great work here. Sorry if this is obtuse, but is there a way to extract the neural network weight matrices after training?
Thanks!