Discussed in #1
Originally posted by haifaksh October 13, 2022
Hello,
How can I run fairseq-train on the GPU of a Mac M1 Max?
I tried the code below in a Jupyter notebook, but when I check Activity Monitor it is only using the CPU, not the GPU:
torch.device = "mps"

!fairseq-train /Users/ha/data/preprocessed --max-epoch 7 \
    --encoder-normalize-before --decoder-normalize-before \
    --arch transformer --layernorm-embedding \
    --task translation_multi_simple_epoch \
    --sampling-method "temperature" \
    --sampling-temperature 1.5 \
    --encoder-langtok "src" \
    --decoder-langtok \
    --lang-pairs "$LANGPAIRS" \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.2 \
    --optimizer adam --adam-eps 1e-06 --adam-betas '(0.9, 0.98)' \
    --lr-scheduler inverse_sqrt --lr 3e-04 --warmup-updates 2500 --max-update 40000 \
    --dropout 0.3 --attention-dropout 0.1 --weight-decay 0.0 \
    --max-tokens 1024 --max-tokens-valid 1024 --update-freq 2 \
    --save-interval 1 --save-interval-updates 5000 --keep-interval-updates 10 --no-epoch-checkpoints \
    --seed 222 --log-format simple --log-interval 50 --ddp-backend=legacy_ddp
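Two observations that may explain the behaviour (these are general PyTorch/notebook facts, not something stated in the original post): assigning a string to torch.device only overwrites the torch.device class and does not select a GPU, and a command launched with ! in a notebook runs in a separate process, so nothing set in the notebook's Python session can affect fairseq-train. In plain PyTorch, the Apple-silicon GPU is used through the "mps" backend by creating a device object and moving the model and tensors onto it. A minimal sketch, independent of fairseq:

    import torch

    # Check that this PyTorch build exposes the MPS backend and that the M1 GPU is visible.
    if torch.backends.mps.is_available():
        device = torch.device("mps")   # the usual way to select the Apple GPU
    else:
        device = torch.device("cpu")   # fall back to CPU if MPS is unavailable

    # Moving the module and its inputs to the device is what actually puts work on the GPU.
    model = torch.nn.Linear(16, 4).to(device)
    x = torch.randn(8, 16, device=device)
    y = model(x)                       # runs on the MPS device when available
    print(y.device)

Whether fairseq-train itself can target MPS depends on the fairseq version installed; if its device handling only covers CUDA and CPU, the training command above will stay on the CPU regardless of what is set in the notebook.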