Applications of Statistical Approaches to the Study of Climate Using Large Datasets
Abstract: The increasing volume of climate data from observations, analysis products, and climate model output presents the climate science community with unprecedented data analysis challenges and opportunities. The challenge becomes greater when targeting extreme events, because standard data reduction techniques such as multimodel ensemble averaging reduce the magnitude of extremes. In this talk, we will focus on evaluating regional climate model output with various statistical techniques that account for spatial and temporal features. The validation data come from gridded data sets based on observations and from reanalysis products. This research is joint work with many scientists from Argonne and the University of Chicago.
First, we developed spatio-temporal correlation analyses to address a basic question: whether high-resolution climate modeling adds value over coarse-resolution simulation. We also developed a new algorithm to track the intensity, frequency, and duration of rainstorms, which we used to examine future changes in rainstorms and the added value of high-resolution, convection-permitting simulation.
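As a minimal sketch of the kind of spatio-temporal comparison described above (all array sizes and the synthetic data are placeholders, not the actual model output), one can compute a per-grid-cell temporal correlation between a simulated field and a gridded reference:

```python
# Hypothetical sketch: temporal correlation at each grid cell between a
# simulated field and a gridded reference data set, as one ingredient of
# an added-value comparison between resolutions.
import numpy as np

rng = np.random.default_rng(2)
t, ny, nx = 120, 8, 8  # placeholder: monthly time steps on a small grid
obs = rng.normal(size=(t, ny, nx))                    # stand-in reference
sim = obs + rng.normal(scale=0.5, size=(t, ny, nx))   # correlated stand-in model

def temporal_corr(a, b):
    """Pearson correlation over the time axis at every grid cell."""
    a_ = a - a.mean(axis=0)
    b_ = b - b.mean(axis=0)
    return (a_ * b_).sum(axis=0) / np.sqrt(
        (a_ ** 2).sum(axis=0) * (b_ ** 2).sum(axis=0)
    )

corr_map = temporal_corr(sim, obs)  # shape (ny, nx), one value per cell
```

Repeating this for both the coarse- and high-resolution runs against the same reference gives maps whose difference indicates where the higher resolution adds value.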
Second, we developed a simple GEV model and a robust approach to estimating the uncertainty of the GEV parameters. We also explored the effect of block length in a block-maxima approach.
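The block-maxima GEV workflow can be sketched as follows; this is an illustration with synthetic data and a plain bootstrap for parameter uncertainty (the talk's actual robust method is not specified here), assuming SciPy's `genextreme` distribution:

```python
# Hypothetical sketch: block-maxima GEV fit with a bootstrap uncertainty
# estimate, on synthetic precipitation-like data.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Placeholder 30-year daily series (not real model output).
daily = rng.gamma(shape=2.0, scale=5.0, size=30 * 365)

def block_maxima(x, block_len):
    """Split a series into non-overlapping blocks and take each block's max."""
    n = len(x) // block_len
    return x[: n * block_len].reshape(n, block_len).max(axis=1)

maxima = block_maxima(daily, block_len=365)      # annual maxima
shape, loc, scale = genextreme.fit(maxima)       # maximum-likelihood GEV fit

# 100-year return level implied by the fitted parameters.
rl100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)

# Simple bootstrap over the block maxima to gauge parameter uncertainty.
boot_loc = []
for _ in range(100):
    sample = rng.choice(maxima, size=len(maxima), replace=True)
    boot_loc.append(genextreme.fit(sample)[1])
ci = np.percentile(boot_loc, [2.5, 97.5])        # CI for the location parameter
```

Rerunning this with different `block_len` values (e.g., seasonal versus annual blocks) is one way to explore the block-length sensitivity mentioned above.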
Third, we examined the internal variability of regional climate modeling and assessed the robustness of results across ensemble members using a resampling technique.
Finally, we initiated a new project that uses machine learning techniques to emulate parameterizations in climate models, which will give us opportunities to replace very time-consuming modules with machine learning algorithms. We are also developing hydrological modeling at very high resolution to examine climate change impacts on streamflow and related water resources over the entire continental United States.
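The idea of replacing an expensive module with a learned surrogate can be illustrated in miniature; here a polynomial regression stands in for a machine-learning emulator, and `expensive_param` is a made-up placeholder for a costly physics routine:

```python
# Hypothetical sketch: fitting a cheap surrogate to an expensive
# parameterization's input-output behavior (regression as a stand-in
# for a machine-learning emulator).
import numpy as np

rng = np.random.default_rng(3)

def expensive_param(x):
    """Placeholder for a time-consuming physics module."""
    return np.sin(x) + 0.1 * x ** 2

# Sample the module over its input range to build training data.
x_train = rng.uniform(-3, 3, size=500)
y_train = expensive_param(x_train)

# Fit a polynomial surrogate; in practice this would be a neural network
# or another ML model trained on many more inputs.
emulator = np.poly1d(np.polyfit(x_train, y_train, deg=9))

# Check the surrogate's worst-case error on held-out inputs.
x_test = np.linspace(-3, 3, 50)
err = float(np.max(np.abs(emulator(x_test) - expensive_param(x_test))))
```

Once the surrogate is accurate enough over the relevant input range, the model can call it in place of the original module at a fraction of the cost.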