X-METRA-ADA: Cross-lingual Meta-Transfer learning Adaptation to Natural Language Understanding and Question Answering. M'hamdi, M., Kim, D. S., Dernoncourt, F., Bui, T., Ren, X., & May, J. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 3617–3632, Online, June 2021. Association for Computational Linguistics. Paper: https://www.aclweb.org/anthology/2021.naacl-main.283

Abstract: Multilingual models, such as M-BERT and XLM-R, have gained increasing popularity due to their zero-shot cross-lingual transfer learning capabilities. However, their generalization ability is still inconsistent across typologically diverse languages and across different benchmarks. Recently, meta-learning has garnered attention as a promising technique for enhancing transfer learning in low-resource scenarios, particularly for cross-lingual transfer in Natural Language Understanding (NLU). In this work, we propose X-METRA-ADA, a cross-lingual MEta-TRAnsfer learning ADAptation approach for NLU. Our approach adapts MAML, an optimization-based meta-learning approach, to learn to adapt to new languages. We extensively evaluate our framework on two challenging cross-lingual NLU tasks: multilingual task-oriented dialog and typologically diverse question answering. We show that our approach outperforms naive fine-tuning, reaching competitive performance on both tasks for most languages. Our analysis reveals that X-METRA-ADA can leverage limited data for faster adaptation.
@inproceedings{mhamdi-etal-2021-x,
title = "{X}-{METRA}-{ADA}: Cross-lingual Meta-Transfer learning Adaptation to Natural Language Understanding and Question Answering",
author = "M{'}hamdi, Meryem and
Kim, Doo Soon and
Dernoncourt, Franck and
Bui, Trung and
Ren, Xiang and
May, Jonathan",
booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.naacl-main.283",
pages = "3617--3632",
abstract = "Multilingual models, such as M-BERT and XLM-R, have gained increasing popularity, due to their zero-shot cross-lingual transfer learning capabilities. However, their generalization ability is still inconsistent for typologically diverse languages and across different benchmarks. Recently, meta-learning has garnered attention as a promising technique for enhancing transfer learning under low-resource scenarios: particularly for cross-lingual transfer in Natural Language Understanding (NLU). In this work, we propose X-METRA-ADA, a cross-lingual MEta-TRAnsfer learning ADAptation approach for NLU. Our approach adapts MAML, an optimization-based meta-learning approach, to learn to adapt to new languages. We extensively evaluate our framework on two challenging cross-lingual NLU tasks: multilingual task-oriented dialog and typologically diverse question answering. We show that our approach outperforms naive fine-tuning, reaching competitive performance on both tasks for most languages. Our analysis reveals that X-METRA-ADA can leverage limited data for faster adaptation.",
}
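For context on the MAML-style training the abstract refers to: each episode is split into a support set (used for inner-loop adaptation) and a query set (used for the meta-update); in the cross-lingual setting, both would be drawn from the target language. Below is a minimal, hypothetical PyTorch sketch of the first-order variant (FOMAML) of that loop. It is not the authors' code, the first-order approximation is swapped in for brevity in place of full second-order MAML, and all names (fomaml_step, the episode layout) are illustrative assumptions.

import copy
import torch

def fomaml_step(model, loss_fn, episodes, inner_lr=1e-3, meta_lr=1e-4, inner_steps=3):
    """One meta-update over a batch of episodes, first-order MAML style.

    Each episode is a (support_x, support_y, query_x, query_y) tuple; in the
    cross-lingual setting sketched in the abstract, support and query data
    would be sampled from the target language. All names here are
    illustrative, not the authors' API.
    """
    meta_grads = [torch.zeros_like(p) for p in model.parameters()]
    for support_x, support_y, query_x, query_y in episodes:
        learner = copy.deepcopy(model)                   # task-specific copy of the meta-parameters
        opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        for _ in range(inner_steps):                     # inner loop: adapt on the support set
            opt.zero_grad()
            loss_fn(learner(support_x), support_y).backward()
            opt.step()
        query_loss = loss_fn(learner(query_x), query_y)  # evaluate the adapted learner
        grads = torch.autograd.grad(query_loss, list(learner.parameters()))
        for mg, g in zip(meta_grads, grads):             # first-order approximation: treat the
            mg.add_(g / len(episodes))                   # adapted model's gradients as meta-gradients
    with torch.no_grad():                                # outer loop: update the meta-parameters
        for p, mg in zip(model.parameters(), meta_grads):
            p.sub_(meta_lr * mg)

Full MAML would differentiate through the inner-loop updates (second-order gradients); the first-order variant above is a common, cheaper approximation and is used here only to keep the sketch short.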