Climate-Invariant Machine Learning. Beucler, T., Pritchard, M., Yuval, J., Gupta, A., Peng, L., Rasp, S., Ahmed, F., O'Gorman, P. A., Neelin, J. D., Lutsko, N. J., & Gentine, P. December, 2021. arXiv:2112.08440 [physics]
Abstract: Data-driven algorithms, in particular neural networks, can emulate the effects of unresolved processes in coarse-resolution climate models when trained on high-resolution simulation data; however, they often make large generalization errors when evaluated in conditions they were not trained on. Here, we propose to physically rescale the inputs and outputs of machine learning algorithms to help them generalize to unseen climates. Applied to offline parameterizations of subgrid-scale thermodynamics in three distinct climate models, we show that rescaled or "climate-invariant" neural networks make accurate predictions in test climates that are 4K and 8K warmer than their training climates. Additionally, "climate-invariant" neural nets facilitate generalization between Aquaplanet and Earth-like simulations. Through visualization and attribution methods, we show that compared to standard machine learning models, "climate-invariant" algorithms learn more local and robust relations between storm-scale convection, radiation, and their synoptic thermodynamic environment. Overall, these results suggest that explicitly incorporating physical knowledge into data-driven models of Earth system processes can improve their consistency and ability to generalize across climate regimes.
@misc{beucler_climate-invariant_2021,
title = {Climate-{Invariant} {Machine} {Learning}},
url = {http://arxiv.org/abs/2112.08440},
abstract = {Data-driven algorithms, in particular neural networks, can emulate the effects of unresolved processes in coarse-resolution climate models when trained on high-resolution simulation data; however, they often make large generalization errors when evaluated in conditions they were not trained on. Here, we propose to physically rescale the inputs and outputs of machine learning algorithms to help them generalize to unseen climates. Applied to offline parameterizations of subgrid-scale thermodynamics in three distinct climate models, we show that rescaled or "climate-invariant" neural networks make accurate predictions in test climates that are 4K and 8K warmer than their training climates. Additionally, "climate-invariant" neural nets facilitate generalization between Aquaplanet and Earth-like simulations. Through visualization and attribution methods, we show that compared to standard machine learning models, "climate-invariant" algorithms learn more local and robust relations between storm-scale convection, radiation, and their synoptic thermodynamic environment. Overall, these results suggest that explicitly incorporating physical knowledge into data-driven models of Earth system processes can improve their consistency and ability to generalize across climate regimes.},
urldate = {2023-04-18},
publisher = {arXiv},
author = {Beucler, Tom and Pritchard, Michael and Yuval, Janni and Gupta, Ankitesh and Peng, Liran and Rasp, Stephan and Ahmed, Fiaz and O'Gorman, Paul A. and Neelin, J. David and Lutsko, Nicholas J. and Gentine, Pierre},
month = dec,
year = {2021},
note = {arXiv:2112.08440 [physics]},
keywords = {Computer Science - Machine Learning, Physics - Atmospheric and Oceanic Physics, Physics - Computational Physics},
}