Translating Translationese: A Two-Step Approach to Unsupervised Machine Translation. Pourdamghani, N., Aldarrab, N., Ghazvininejad, M., Knight, K., & May, J. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3057–3062, Florence, Italy, July 2019. Association for Computational Linguistics.
@inproceedings{pourdamghani-etal-2019-translating,
    title = "Translating Translationese: A Two-Step Approach to Unsupervised Machine Translation",
    author = "Pourdamghani, Nima  and
      Aldarrab, Nada  and
      Ghazvininejad, Marjan  and
      Knight, Kevin  and
      May, Jonathan",
    booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
    month = jul,
    year = "2019",
    address = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/P19-1293",
    doi = "10.18653/v1/P19-1293",
    pages = "3057--3062",
    abstract = "Given a rough, word-by-word gloss of a source language sentence, target language natives can uncover the latent, fully-fluent rendering of the translation. In this work we explore this intuition by breaking translation into a two step process: generating a rough gloss by means of a dictionary and then {`}translating{'} the resulting pseudo-translation, or {`}Translationese{'} into a fully fluent translation. We build our Translationese decoder once from a mish-mash of parallel data that has the target language in common and then can build dictionaries on demand using unsupervised techniques, resulting in rapidly generated unsupervised neural MT systems for many source languages. We apply this process to 14 test languages, obtaining better or comparable translation results on high-resource languages than previously published unsupervised MT studies, and obtaining good quality results for low-resource languages that have never been used in an unsupervised MT scenario.",
}