
Training on big files (25+ MB) gets killed #13

Open
Yyyyaaaannnnoooo opened this issue Apr 26, 2019 · 1 comment

Comments

@Yyyyaaaannnnoooo

I'm training the LSTM on some 80 MB files with the following hyperparameters:

python train.py --data_dir=./data --rnn_size 2048 --num_layers 2 --seq_length 256 --batch_size 128 --output_keep_prob 0.25

but after a few minutes the job gets killed.
Is the file too big?
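
(The file size itself is probably not the issue; with rnn_size 2048, seq_length 256, and batch_size 128, the model and batches can exceed available RAM, and the OS will then kill the process. As a rough sketch, reusing only the flags from the command above, a lower-memory run might look like the following; the smaller values are illustrative, not tuned:

# illustrative lower-memory settings, same flags as the original command
python train.py --data_dir=./data --rnn_size 512 --num_layers 2 --seq_length 64 --batch_size 32 --output_keep_prob 0.25

If that run completes, the original settings were simply too large for the machine, and the sizes can be increased gradually from there.)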

@lucas-fine

I tried this while watching top, and after about a minute my computer froze with the processor at almost 100%. My guess is that the OS killed the job because it was too much for the machine to handle. Try a lighter command or use a more powerful computer.
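
If the job is being killed by the Linux out-of-memory killer (an assumption; the thread shows no logs), the kernel log usually records it. A quick check:

# look for an OOM-killer entry in the kernel log (Linux)
dmesg | grep -i 'killed process'
# or, on systemd-based systems:
journalctl -k | grep -i 'out of memory'

A matching line would confirm the process ran out of RAM rather than being stopped by anything in train.py itself.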
