Translating Translationese: A Two-Step Approach to Unsupervised Machine Translation. Pourdamghani, N., Aldarrab, N., Ghazvininejad, M., Knight, K., & May, J. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3057–3062, Florence, Italy, July 2019. Association for Computational Linguistics. Abstract: Given a rough, word-by-word gloss of a source language sentence, target language natives can uncover the latent, fully-fluent rendering of the translation. In this work we explore this intuition by breaking translation into a two step process: generating a rough gloss by means of a dictionary and then `translating' the resulting pseudo-translation, or `Translationese' into a fully fluent translation. We build our Translationese decoder once from a mish-mash of parallel data that has the target language in common and then can build dictionaries on demand using unsupervised techniques, resulting in rapidly generated unsupervised neural MT systems for many source languages. We apply this process to 14 test languages, obtaining better or comparable translation results on high-resource languages than previously published unsupervised MT studies, and obtaining good quality results for low-resource languages that have never been used in an unsupervised MT scenario.
@inproceedings{pourdamghani-etal-2019-translating,
title = "Translating Translationese: A Two-Step Approach to Unsupervised Machine Translation",
author = "Pourdamghani, Nima and
Aldarrab, Nada and
Ghazvininejad, Marjan and
Knight, Kevin and
May, Jonathan",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1293",
doi = "10.18653/v1/P19-1293",
pages = "3057--3062",
abstract = "Given a rough, word-by-word gloss of a source language sentence, target language natives can uncover the latent, fully-fluent rendering of the translation. In this work we explore this intuition by breaking translation into a two step process: generating a rough gloss by means of a dictionary and then {`}translating{'} the resulting pseudo-translation, or {`}Translationese{'} into a fully fluent translation. We build our Translationese decoder once from a mish-mash of parallel data that has the target language in common and then can build dictionaries on demand using unsupervised techniques, resulting in rapidly generated unsupervised neural MT systems for many source languages. We apply this process to 14 test languages, obtaining better or comparable translation results on high-resource languages than previously published unsupervised MT studies, and obtaining good quality results for low-resource languages that have never been used in an unsupervised MT scenario.",
}
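The first step the abstract describes, producing a rough word-by-word "Translationese" gloss via a bilingual dictionary, can be sketched as below. This is a minimal illustration, not the paper's implementation: the function name and the toy dictionary are hypothetical, and the paper induces its dictionaries with unsupervised techniques rather than hand-writing them.

```python
def gloss(sentence, dictionary):
    """Replace each source token with a target-language translation
    when the dictionary has one; otherwise keep the token unchanged,
    yielding a rough word-by-word pseudo-translation ("Translationese")."""
    return " ".join(dictionary.get(tok, tok) for tok in sentence.split())

# Toy Spanish-to-English dictionary (illustrative only; the paper
# builds such dictionaries on demand with unsupervised methods).
es_en = {"el": "the", "gato": "gray", "come": "eats", "pescado": "fish"}
es_en["gato"] = "cat"

print(gloss("el gato come pescado", es_en))  # the cat eats fish
```

In the paper's second step, a single pre-trained Translationese decoder would then rewrite this rough gloss into a fully fluent target-language sentence.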