This study compares the translation outputs of an English-into-Arabic text produced by three machine translators: Google Translate, Microsoft Bing, and Ginger. To carry out this evaluation of the machine translation (MT) outputs, an English text and its Arabic counterpart were selected from the UN records. The English source text was segmented into 84 semantic chunks. Using the Arabic counterpart as the model text, each chunk was rated as "correct" or "incorrect" at two levels of translation quality: fidelity and intelligibility. For the quantitative description of the evaluation, the numbers of fidelity and intelligibility errors and their percentages were calculated.

The results of this evaluation revealed that none of the three translated versions of the source text was perfectly translated. Although Microsoft Bing's translation was rated the best, Google's translation was found to be the least accurate owing to the high percentage of fidelity and intelligibility errors detected in its output. The quality of Ginger's translation was slightly less accurate than Microsoft Bing's, but remarkably better than Google's. These findings imply that such MT applications can be used for English-into-Arabic translation to get the broad gist of a source text, but a deep and thorough post-editing process appears essential for a full and accurate understanding of an English-into-Arabic MT output.
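The chunk-level scoring described above can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual instrument: the function name, the boolean rating scheme, and the sample ratings are all assumptions introduced here for clarity.

```python
def error_percentages(ratings):
    """Compute fidelity and intelligibility error rates over rated chunks.

    ratings: list of (fidelity_ok, intelligibility_ok) booleans,
    one pair per semantic chunk (the study used 84 chunks).
    """
    n = len(ratings)
    fidelity_errors = sum(1 for fid, _ in ratings if not fid)
    intelligibility_errors = sum(1 for _, intel in ratings if not intel)
    return {
        "fidelity_error_pct": 100 * fidelity_errors / n,
        "intelligibility_error_pct": 100 * intelligibility_errors / n,
    }

# Illustrative ratings for 4 hypothetical chunks (made-up data):
sample = [(True, True), (False, True), (True, False), (False, False)]
print(error_percentages(sample))
# → {'fidelity_error_pct': 50.0, 'intelligibility_error_pct': 50.0}
```

Applying such a function to each MT system's 84 chunk ratings would yield the per-system error percentages used to rank the three outputs.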