# opus1m-2021-05-15.zip

* dataset: opus1m
* model: transformer-align
* source language(s): afb apc ara arq arz
* target language(s): deu
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download: opus1m-2021-05-15.zip
* test set translations: opus1m-2021-05-15.test.txt
* test set scores: opus1m-2021-05-15.eval.txt

## Benchmarks

| testset | BLEU | chr-F | #sent | #words | BP |
|---------|------|-------|-------|--------|-------|
| Tatoeba-test.afb-deu | 38.1 | 0.605 | 5 | 32 | 1.000 |
| Tatoeba-test.apc-deu | 38.0 | 0.600 | 2 | 11 | 1.000 |
| Tatoeba-test.ara-deu | 37.2 | 0.563 | 1209 | 8370 | 0.967 |
| Tatoeba-test.arq-deu | 35.0 | 0.688 | 1 | 6 | 0.819 |
| Tatoeba-test.arz-deu | 29.7 | 0.468 | 2 | 13 | 1.000 |
| Tatoeba-test.heb-deu | 0.0 | 0.104 | 3090 | 25098 | 1.000 |
| Tatoeba-test.kab-deu | 0.1 | 0.063 | 334 | 2107 | 1.000 |
| Tatoeba-test.mlt-deu | 0.5 | 0.060 | 65 | 269 | 1.000 |
| Tatoeba-test.multi-deu | 37.4 | 0.566 | 1208 | 8360 | 0.968 |
| Tatoeba-test.tmr-deu | 1.7 | 0.103 | 2 | 10 | 1.000 |
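
## Usage sketch

The release above is a Marian-NMT checkpoint (transformer-align, SentencePiece-segmented input). As a minimal sketch of how such a model can be run, the example below uses the Hugging Face `transformers` Marian classes, which apply the SentencePiece model internally. The model ID `Helsinki-NLP/opus-mt-ar-de` is an assumption standing in for whichever converted checkpoint corresponds to this zip; substitute the correct ID for this release.

```python
# Minimal sketch, not the official usage for this exact release.
# The model ID below is an assumed placeholder for a converted
# Arabic->German OPUS-MT checkpoint; swap in the one matching
# opus1m-2021-05-15.zip if a converted version is published.
from transformers import MarianMTModel, MarianTokenizer

model_id = "Helsinki-NLP/opus-mt-ar-de"  # assumed ID, ara->deu direction
tokenizer = MarianTokenizer.from_pretrained(model_id)  # applies SentencePiece segmentation
model = MarianMTModel.from_pretrained(model_id)

src = ["مرحبا بالعالم"]  # Arabic source sentence ("Hello, world")
batch = tokenizer(src, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```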