
FileNotFoundError: [Errno 2] No such file or directory: 'RUNS/2022-08-02@23:02:24-user-NF5468M6-ZINC/config/vocab.pkl' #3

Open · renly0313 opened this issue Aug 2, 2022 · 3 comments


@renly0313

Hi,
there is a bug when I run `python manage.py train --dataset <DATASET_NAME>`:
Training and clustering embeddings...

```
Traceback (most recent call last):
  File "/home/zhyl_lbw/fragment-based-dgm-master/learner/dataset.py", line 77, in get_vocab
    self.vocab = Vocab.load(self.config)
  File "/home/zhyl_lbw/fragment-based-dgm-master/learner/skipgram.py", line 22, in load
    return load_pickle(path)
  File "/home/zhyl_lbw/fragment-based-dgm-master/utils/filesystem.py", line 8, in load_pickle
    return pkl.load(open(path, "rb"))
FileNotFoundError: [Errno 2] No such file or directory: 'RUNS/2022-08-02@23:02:24-user-NF5468M6-ZINC/config/vocab.pkl'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/zhyl_lbw/fragment-based-dgm-master/manage.py", line 63, in <module>
    train_model(config)
  File "/home/zhyl_lbw/fragment-based-dgm-master/manage.py", line 18, in train_model
    vocab = dataset.get_vocab()
  File "/home/zhyl_lbw/fragment-based-dgm-master/learner/dataset.py", line 79, in get_vocab
    self.vocab = Vocab(self.config, self.data)
  File "/home/zhyl_lbw/fragment-based-dgm-master/learner/skipgram.py", line 34, in __init__
    train_embeddings(config, data)
  File "/home/zhyl_lbw/fragment-based-dgm-master/learner/skipgram.py", line 140, in train_embeddings
    sg=1)
TypeError: __init__() got an unexpected keyword argument 'size'
```

This seems to happen because 'vocab.pkl' was never created. Could you help me fix it? Thank you!

@marcopodda (Owner)

Hi, and thanks for bringing the issue up.
It's probably because you are using a newer version of the gensim library than the one this code was written against. I have updated the code; pull the repo again and let me know if it works now.
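For context, gensim 4.x renamed several `Word2Vec` constructor arguments (for instance `size` became `vector_size`), which is what produces the `TypeError` above. A minimal sketch of a version guard — the helper name and the dimensions are illustrative, not part of this repo, and the version string is passed in explicitly so the sketch runs even without gensim installed (in practice you would read `gensim.__version__`):

```python
def w2v_kwargs(gensim_version: str, dim: int, n_epochs: int) -> dict:
    """Build Word2Vec keyword arguments matching the gensim major version."""
    major = int(gensim_version.split(".")[0])
    if major >= 4:
        # gensim >= 4.0 argument names
        return {"vector_size": dim, "epochs": n_epochs, "sg": 1}
    # gensim 3.x argument names (the ones that trigger the TypeError on 4.x)
    return {"size": dim, "iter": n_epochs, "sg": 1}

print(w2v_kwargs("4.2.0", 100, 5))
```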

@venkatadithya9

Hi, I have the same issue. I have gensim 4.2.0 installed. Here is the error message encountered while training:
Training and clustering embeddings...

```
Traceback (most recent call last):
  File "/home/Desktop/DDDD/fragment-based-dgm/learner/dataset.py", line 77, in get_vocab
    self.vocab = Vocab.load(self.config)
  File "/home/Desktop/DDDD/fragment-based-dgm/learner/skipgram.py", line 22, in load
    return load_pickle(path)
  File "/home/Desktop/DDDD/fragment-based-dgm/utils/filesystem.py", line 8, in load_pickle
    return pkl.load(open(path, "rb"))
FileNotFoundError: [Errno 2] No such file or directory: 'RUNS/2022-11-07@11:17:21-user-PCBA/config/vocab.pkl'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/Desktop/DDDD/fragment-based-dgm/manage.py", line 59, in <module>
    train_model(config)
  File "/home/Desktop/DDDD/fragment-based-dgm/manage.py", line 18, in train_model
    vocab = dataset.get_vocab()
  File "/home/Desktop/DDDD/fragment-based-dgm/learner/dataset.py", line 79, in get_vocab
    self.vocab = Vocab(self.config, self.data)
  File "/home/Desktop/DDDD/fragment-based-dgm/learner/skipgram.py", line 34, in __init__
    train_embeddings(config, data)
  File "/home/Desktop/DDDD/fragment-based-dgm/learner/skipgram.py", line 132, in train_embeddings
    w2v = Word2Vec(
TypeError: Word2Vec.__init__() got an unexpected keyword argument 'iter'
```

@marcopodda (Owner)

It seems that several Word2Vec arguments have been renamed since I wrote this code: 'iter' is now called 'epochs'. It should work again; pull the repo and try once more. Thanks for bringing this up!
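For anyone hitting this on an older checkout: the renames seen in this thread are `size` → `vector_size` and `iter` → `epochs`. A small compatibility sketch (the helper name is hypothetical, not part of the repo) that translates old-style keyword arguments before passing them to `Word2Vec` on gensim 4.x:

```python
# gensim 3.x -> 4.x Word2Vec keyword renames encountered in this thread.
RENAMED = {"size": "vector_size", "iter": "epochs"}

def to_gensim4(kwargs: dict) -> dict:
    """Translate old-style Word2Vec kwargs to their gensim 4.x names."""
    return {RENAMED.get(key, key): value for key, value in kwargs.items()}

old_style = {"size": 100, "iter": 5, "min_count": 1, "sg": 1}
print(to_gensim4(old_style))
# -> {'vector_size': 100, 'epochs': 5, 'min_count': 1, 'sg': 1}
```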
