Cross-lingual Continual Learning. M'hamdi, M., Ren, X., & May, J. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3908–3943, Toronto, Canada, July 2023. Association for Computational Linguistics.

Abstract: The longstanding goal of multi-lingual learning has been to develop a universal cross-lingual model that can withstand the changes in multi-lingual data distributions. There has been a large amount of work to adapt such multi-lingual models to unseen target languages. However, the majority of work in this direction focuses on the standard one-hop transfer learning pipeline from source to target languages, whereas in realistic scenarios, new languages can be incorporated at any time in a sequential manner. In this paper, we present a principled Cross-lingual Continual Learning (CCL) evaluation paradigm, where we analyze different categories of approaches used to continually adapt to emerging data from different languages. We provide insights into what makes multilingual sequential learning particularly challenging. To surmount such challenges, we benchmark a representative set of cross-lingual continual learning algorithms and analyze their knowledge preservation, accumulation, and generalization capabilities compared to baselines on carefully curated datastreams. The implications of this analysis include a recipe for how to measure and balance different cross-lingual continual learning desiderata, which go beyond conventional transfer learning.
@inproceedings{mhamdi-etal-2023-cross,
    title = "Cross-lingual Continual Learning",
    author = "M{'}hamdi, Meryem and
      Ren, Xiang and
      May, Jonathan",
    booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.acl-long.217",
    doi = "10.18653/v1/2023.acl-long.217",
    pages = "3908--3943",
    abstract = "The longstanding goal of multi-lingual learning has been to develop a universal cross-lingual model that can withstand the changes in multi-lingual data distributions. There has been a large amount of work to adapt such multi-lingual models to unseen target languages. However, the majority of work in this direction focuses on the standard one-hop transfer learning pipeline from source to target languages, whereas in realistic scenarios, new languages can be incorporated at any time in a sequential manner. In this paper, we present a principled Cross-lingual Continual Learning (CCL) evaluation paradigm, where we analyze different categories of approaches used to continually adapt to emerging data from different languages. We provide insights into what makes multilingual sequential learning particularly challenging. To surmount such challenges, we benchmark a representative set of cross-lingual continual learning algorithms and analyze their knowledge preservation, accumulation, and generalization capabilities compared to baselines on carefully curated datastreams. The implications of this analysis include a recipe for how to measure and balance different cross-lingual continual learning desiderata, which go beyond conventional transfer learning.",
}