TY - GEN
T1 - A Novel Deep Learning Approach to the Statistical Downscaling of Temperatures for Monitoring Climate Change
AU - Gerges, Firas
AU - Boufadel, Michel C.
AU - Bou-Zeid, Elie
AU - Nassif, Hani
AU - Wang, Jason T.L.
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/1/15
Y1 - 2022/1/15
N2 - General Circulation Models (GCMs) allow for the simulation of several climate variables through the year 2100. GCM simulations, however, are too coarse to monitor climate change at the local or regional scale. Hence, one needs to perform spatial downscaling of these simulations, for which statistical downscaling is often used. Statistical downscaling utilizes the large-scale GCM outputs to forecast a local-scale field (e.g., temperature). In this paper, we develop a new deep learning approach, named AIG-TRANSFORMER, which employs a novel attention-based input grouping (AIG) neural network followed by a transformer, for the statistical downscaling of the weekly averages of maximum (Tmax) and minimum (Tmin) temperatures using GCM-simulated climatic fields (climate variables). We formulate the downscaling problem as a multivariate time series forecasting task, with multiple GCM-simulated climatic fields as input features. We employ an attention mechanism within the AIG network to give selective importance to the input features while reducing the size of the input fed to the transformer. To test AIG-TRANSFORMER, we perform statistical downscaling over the Hackensack-Passaic Watershed in northeast New Jersey. We compare our new deep learning approach against several existing machine learning methods, including random forests, support vector regression, and long short-term memory networks. Experimental results show that AIG-TRANSFORMER outperforms the existing methods for downscaling both the maximum and minimum temperatures, with a Nash-Sutcliffe Efficiency coefficient of 0.84 for Tmax and 0.85 for Tmin. We further apply AIG-TRANSFORMER to produce long-term projections over the 20-year period from 2030 to 2049, and report the annual means for maximum and minimum temperatures.
AB - General Circulation Models (GCMs) allow for the simulation of several climate variables through the year 2100. GCM simulations, however, are too coarse to monitor climate change at the local or regional scale. Hence, one needs to perform spatial downscaling of these simulations, for which statistical downscaling is often used. Statistical downscaling utilizes the large-scale GCM outputs to forecast a local-scale field (e.g., temperature). In this paper, we develop a new deep learning approach, named AIG-TRANSFORMER, which employs a novel attention-based input grouping (AIG) neural network followed by a transformer, for the statistical downscaling of the weekly averages of maximum (Tmax) and minimum (Tmin) temperatures using GCM-simulated climatic fields (climate variables). We formulate the downscaling problem as a multivariate time series forecasting task, with multiple GCM-simulated climatic fields as input features. We employ an attention mechanism within the AIG network to give selective importance to the input features while reducing the size of the input fed to the transformer. To test AIG-TRANSFORMER, we perform statistical downscaling over the Hackensack-Passaic Watershed in northeast New Jersey. We compare our new deep learning approach against several existing machine learning methods, including random forests, support vector regression, and long short-term memory networks. Experimental results show that AIG-TRANSFORMER outperforms the existing methods for downscaling both the maximum and minimum temperatures, with a Nash-Sutcliffe Efficiency coefficient of 0.84 for Tmax and 0.85 for Tmin. We further apply AIG-TRANSFORMER to produce long-term projections over the 20-year period from 2030 to 2049, and report the annual means for maximum and minimum temperatures.
KW - Climate Change
KW - Deep Learning
KW - Statistical Downscaling
KW - Transformer
UR - http://www.scopus.com/inward/record.url?scp=85128721236&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85128721236&partnerID=8YFLogxK
U2 - 10.1145/3523150.3523151
DO - 10.1145/3523150.3523151
M3 - Conference contribution
AN - SCOPUS:85128721236
T3 - ACM International Conference Proceeding Series
SP - 1
EP - 7
BT - ICMLSC 2022 - Proceedings of the 2022 6th International Conference on Machine Learning and Soft Computing
PB - Association for Computing Machinery
T2 - 6th International Conference on Machine Learning and Soft Computing, ICMLSC 2022
Y2 - 15 January 2022 through 17 January 2022
ER -