SD-GCM Tool

  1. Rainfed wheat (Triticum aestivum L.) yield prediction using economical, meteorological, and drought indicators through pooled panel data and statistical downscaling
  2. Comparing the Performance of Dynamical and Statistical Downscaling on Historical Run Precipitation Data over a Semi-Arid Region
  3. Prediction of climate variables by comparing the k-nearest neighbor method and MIROC5 outputs in an arid environment
  4. Prediction of effective climate change indicators using statistical downscaling approach and impact assessment on pearl millet (Pennisetum glaucum L.) yield through Genetic Algorithm in Punjab, Pakistan


You can access sample input files for download here: DropBox


What is SD GCM software?

SD GCM (Statistical Downscaling of General Circulation Models) is a software package designed for researchers and users who need an efficient tool for applying statistical downscaling methods. It is particularly effective for downscaling CMIP5 models under Representative Concentration Pathway (RCP) scenarios.

Within the SD GCM tool, users have access to multiple statistical downscaling (SD) methods. Specifically, there are three prominent statistical downscaling models available:

  • The Delta method
  • The Quantile Mapping (QM) method (Panofsky and Brier, 1968)
  • The Empirical Quantile Mapping (EQM) method (Boe et al., 2007)

These methods enable users to perform accurate downscaling of climate data, facilitating in-depth analysis and exploration of various climate scenarios. The SD GCM tool stands as a valuable resource for climate researchers and users seeking precise statistical downscaling capabilities.

We have recently introduced a new icon to the tool, allowing you to incorporate discrete wavelet analysis into the existing methods. To better understand how this feature works, please refer to the flowchart below:

Wavelet Downscaling

With this new addition, you can enhance the capabilities of the tool and further explore the possibilities of statistical downscaling with discrete wavelet analysis. We hope it brings even more value to your research and analysis endeavors.
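To make the idea concrete, the Python sketch below is a rough illustration of the "decompose, correct, reconstruct" workflow, not the tool's internal implementation: it decomposes observed and GCM series with a discrete wavelet transform (PyWavelets), applies a simple variance-scaling correction to each wavelet component, and reconstructs the corrected series. The choice of wavelet and of the per-component correction are assumptions made here for illustration.

```python
# Rough sketch of wavelet-assisted correction (illustration only, not SD GCM code).
import numpy as np
import pywt  # PyWavelets


def wavelet_scale_correction(obs, gcm_hist, gcm_future, wavelet="db4", level=2):
    """Decompose each series, scale the future GCM wavelet components so their
    variability matches the observed components, then reconstruct."""
    c_obs = pywt.wavedec(np.asarray(obs, float), wavelet, level=level)
    c_hist = pywt.wavedec(np.asarray(gcm_hist, float), wavelet, level=level)
    c_fut = pywt.wavedec(np.asarray(gcm_future, float), wavelet, level=level)

    corrected = []
    for co, ch, cf in zip(c_obs, c_hist, c_fut):
        # Variance-scaling: match each component's spread to the observed one.
        scale = np.std(co) / np.std(ch) if np.std(ch) > 0 else 1.0
        corrected.append(cf * scale)

    return pywt.waverec(corrected, wavelet)
```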

To perform downscaling, three datasets are required: observation (station) data, historical GCM data, and RCP/SSP scenario data. If your focus is on evaluating GCMs (General Circulation Models), however, you will only need the observation and historical data.

Within this comprehensive package, users have the flexibility to work with a wide range of datasets. You can effortlessly read and utilize any CMIP5/CMIP6 model under various RCP/SSP scenarios or any Gridded Dataset in NetCDF file format. Additionally, the tool allows you to input your own gridded data in Excel format, providing even greater versatility in your research and analysis.
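For readers who want to inspect such NetCDF files outside the tool, the short Python sketch below (not part of SD GCM) shows how a CMIP6 precipitation file might be opened with xarray and the grid cell nearest a station extracted; the file name, variable name, and station coordinates are placeholders.

```python
# Standalone sketch: reading a CMIP5/CMIP6 NetCDF file and extracting the grid
# cell nearest a station. File name, variable name and coordinates are placeholders.
import xarray as xr

ds = xr.open_dataset("pr_Amon_CanESM5_historical_r1i1p1f1_gn_185001-201412.nc")  # hypothetical file
station_lat, station_lon = 31.5, 74.3  # example station coordinates

# CMIP longitudes are often stored as 0-360; convert if the station uses -180..180.
lon = station_lon % 360

series = ds["pr"].sel(lat=station_lat, lon=lon, method="nearest")

# Convert precipitation flux (kg m-2 s-1) to mm/day.
pr_mm_day = series * 86400
print(pr_mm_day.to_series().head())
```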

The SD GCM software offers a unique option for evaluating model data, which is available in the unregistered version as well. During this evaluation phase, users can assess the performance of CMIP5/CMIP6 models by comparing them with observation data over a shared time period.

For conducting the evaluation, the observation data should be provided in an Excel format file, while the historical data of the models must be in NetCDF format. The evaluation process employs six efficiency criteria to gauge the model's accuracy:

  • Pearson Correlation
  • Nash-Sutcliffe efficiency
  • Spearman Correlation
  • RMSE (Root Mean Squared Error)
  • d (index of agreement)
  • MAE (Mean Absolute Error)

By applying these criteria, researchers can effectively analyze and quantify a model's performance against observed data, gaining valuable insight into its capabilities and limitations. The evaluation phase helps users make informed decisions and interpretations, supporting robust research in climate and weather analysis.
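For reference, the sketch below shows how these six criteria can be computed in Python from paired observed and simulated series; it is a standalone illustration, not code taken from SD GCM.

```python
# Standalone sketch of the six evaluation criteria listed above.
import numpy as np
from scipy import stats


def evaluate(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    pearson = stats.pearsonr(obs, sim)[0]
    spearman = stats.spearmanr(obs, sim)[0]
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    mae = np.mean(np.abs(sim - obs))
    nse = 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    # Willmott's index of agreement (d)
    d = 1 - np.sum((sim - obs) ** 2) / np.sum(
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2
    )
    return {"Pearson": pearson, "Spearman": spearman, "RMSE": rmse,
            "MAE": mae, "NSE": nse, "d": d}
```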

SD-GCM V2.0 introduces enhanced capabilities, providing seamless compatibility with monthly and daily (and, with some limitations, hourly) CMIP5 and CMIP6 data, as well as CORDEX data. In this version we have made significant efforts to ensure that the tool can efficiently handle any NetCDF file. Whether you are working with climate data from the CMIP5, CMIP6, or CORDEX projects, you can confidently use SD-GCM V2.0 to perform statistical downscaling with ease. This improved version gives researchers and users access to a wide range of NetCDF files, facilitating more comprehensive and accurate climate analysis and impact assessments.

Here are some of the key abilities and features offered by the tool:

  1. Ease of Use: The tool is designed with a user-friendly interface, making it accessible to researchers and users with varying levels of expertise.
  2. Statistical Downscaling: The tool enables statistical downscaling of GCM data, allowing users to refine and localize climate projections for specific regions or locations. A downscaling project can be run for a whole list of stations in one go (for example, 200 stations), although extraction speed depends on your CPU and RAM.
  3. Monthly and Daily Data Support: It can efficiently process both monthly and daily CMIP5, CMIP6, and CORDEX data, catering to various temporal resolution needs.
  4. Flexible Input Formats: Historical and RCP (or SSP) scenario data can be provided in either Excel or NetCDF format.
  5. Wide NetCDF File Compatibility: The tool is designed to work with a wide range of NetCDF files, providing flexibility in data sources and formats.
  6. Wavelet Analysis Integration: The latest version allows users to incorporate wavelet analysis into the existing downscaling methods, expanding analysis possibilities.
  7. Plotting: You can draw box plots, CDFs, and PDFs based on distributions such as Normal, Gamma, and others (see the sketch after this list).
  8. Extreme-Value Counting: You can count values in the output data for extreme-value analysis, for example the number of wet days or the days with Tmax below a given value (see the sketch after this list).
  9. Licensing: With the unregistered version of SD GCM you can evaluate the methods (Delta, QM, and EQM), but you cannot save the result data. Full use of SD GCM requires purchasing a license key.
  10. You should load the GCM data continuously.
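As an illustration of items 7 and 8, the standalone Python sketch below counts threshold-based events (wet days and days with Tmax below a value) and fits a Gamma distribution for a CDF plot. The thresholds and synthetic data are assumptions made here for illustration, not defaults of the tool.

```python
# Illustrative sketch of items 7 and 8 above (done outside the tool).
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

precip = np.random.gamma(shape=0.8, scale=5.0, size=3650)   # synthetic daily precipitation (mm)
tmax = np.random.normal(loc=30.0, scale=8.0, size=3650)     # synthetic daily Tmax (deg C)

wet_days = int(np.sum(precip >= 1.0))    # wet days: precipitation >= 1 mm
cool_days = int(np.sum(tmax < 20.0))     # days with Tmax below a chosen value
print(f"wet days: {wet_days}, days with Tmax < 20: {cool_days}")

# Fit a Gamma distribution to wet-day precipitation and plot its CDF.
wet = precip[precip >= 1.0]
shape, loc, scale = stats.gamma.fit(wet, floc=0)
x = np.linspace(0, wet.max(), 200)
plt.plot(x, stats.gamma.cdf(x, shape, loc=loc, scale=scale), label="fitted Gamma CDF")
plt.xlabel("precipitation (mm)")
plt.ylabel("CDF")
plt.legend()
plt.show()
```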

Bias Correction Tool for CMIP6 Data | Example on CanESM5 Model




Statistical downscaling methods employed within the SD GCM Tool

The SD GCM tool offers three distinct methods of statistical downscaling, each with its unique approach and benefits:

Delta statistical downscaling method (Dessu and Melesse, 2013):

Delta Method: The Delta method is one of the downscaling techniques available in the SD GCM tool. It involves calculating the difference (delta) between the GCM model data and observed data over a specific historical period. This delta is then applied to the GCM projections to adjust and refine the climate data, providing more accurate and localized information.

As presented in Eq. 1 and Eq. 2, the precipitation and temperature of the GCM data are downscaled as follows:

P_{SD,Delta} = P_{GCMrcp} \times ( \overline{P}_{Obs} / \overline{P}_{GCMhist} )        (Eq. 1)

T_{SD,Delta} = T_{GCMrcp} + ( \overline{T}_{Obs} - \overline{T}_{GCMhist} )        (Eq. 2)

where P_{SD,Delta} and T_{SD,Delta} are the downscaled precipitation and temperature data, respectively; \overline{P}_{Obs} is the average observed precipitation and \overline{P}_{GCMhist} is the mean of the GCM historical simulation of precipitation. The subscript GCMrcp denotes the GCM's RCP output over the future period, and the subscript Obs denotes the observed values. In Eq. 2 the subscripts are the same as in Eq. 1, applied to temperature.
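The Python sketch below illustrates Eq. 1 and Eq. 2 as described above; it is a minimal illustration assuming simple mean-based corrections, not the tool's source code.

```python
# Minimal sketch of the Delta corrections of Eq. 1 and Eq. 2 (illustration only).
import numpy as np


def delta_precip(p_gcm_rcp, p_obs_hist, p_gcm_hist):
    """Eq. 1: scale future GCM precipitation by the ratio of the observed
    historical mean to the simulated historical mean."""
    return np.asarray(p_gcm_rcp, float) * (np.mean(p_obs_hist) / np.mean(p_gcm_hist))


def delta_temp(t_gcm_rcp, t_obs_hist, t_gcm_hist):
    """Eq. 2: shift future GCM temperature by the observed-minus-simulated
    historical mean difference."""
    return np.asarray(t_gcm_rcp, float) + (np.mean(t_obs_hist) - np.mean(t_gcm_hist))
```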


Statistical Downscaling and Bias Correction with SD-GCM V2.0



Quantile Mapping (QM) statistical downscaling method

QM (Quantile Mapping) Method: The Quantile Mapping method, another option in the SD GCM tool, works by mapping the cumulative distribution functions (CDFs) of the GCM model data and observed data. By matching the quantiles of the two datasets, the method adjusts the GCM projections to better align with the observed data's statistical characteristics. This process leads to improved downscaled climate information.

As Panofsky and Brier (1968) showed, QM is a statistical downscaling method that has been used in many fields of study. In QM, the modelled probability distribution is mapped onto the observed probability distribution. This is computed as Eq. 3 for precipitation data; SD GCM uses Eq. 3-1 for evaluation and Eq. 3-2 for future downscaling.

P_{SD,QM} = CDF_{Obs}^{-1} ( CDF_{GCMhist} ( P_{GCMhist} ) )        (Eq. 3-1)

P_{SD,QM} = CDF_{Obs}^{-1} ( CDF_{GCMhist} ( P_{GCMrcp} ) )        (Eq. 3-2)

In Eq. 3, CDF is the cumulative distribution function of the observed and GCM data over the period under consideration, and CDF^{-1} denotes its inverse.
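The Python sketch below illustrates the mapping in Eq. 3. The use of Gamma distributions fitted to precipitation is an assumption made here for illustration; the tool's exact distributional choices may differ.

```python
# Rough sketch of quantile mapping (Eq. 3) with fitted Gamma CDFs (illustration only).
import numpy as np
from scipy import stats


def quantile_mapping(p_obs_hist, p_gcm_hist, p_gcm_target):
    """Map target GCM values through CDF_GCMhist, then invert with CDF_Obs.
    Pass p_gcm_hist as the target for evaluation (Eq. 3-1) or the RCP output
    for future downscaling (Eq. 3-2). Assumes strictly positive values
    (e.g. wet-day precipitation amounts)."""
    sh_o, loc_o, sc_o = stats.gamma.fit(p_obs_hist, floc=0)
    sh_g, loc_g, sc_g = stats.gamma.fit(p_gcm_hist, floc=0)
    probs = stats.gamma.cdf(p_gcm_target, sh_g, loc=loc_g, scale=sc_g)
    return stats.gamma.ppf(probs, sh_o, loc=loc_o, scale=sc_o)
```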

Empirical Quantile Mapping (EQM) statistical downscaling method

EQM (Empirical Quantile Mapping) Method: The Empirical Quantile Mapping method is a variant of the traditional QM method and is also available in the SD GCM tool. Similar to QM, it involves matching the CDFs of GCM and observed data. However, EQM introduces additional corrections to the data, making it more robust and accurate, particularly in extreme weather events.

Wetterhall et al. (2012) published a comprehensive paper on statistical downscaling methods such as EQM. EQM employs the empirical cumulative distribution function (ECDF), as in Eq. 4, and all the terms are as defined for Eq. 1. SD GCM uses Eq. 4-1 for evaluation and Eq. 4-2 for future downscaling.

P_{SD,EQM} = ECDF_{Obs}^{-1} ( ECDF_{GCMhist} ( P_{GCMhist} ) )        (Eq. 4-1)

P_{SD,EQM} = ECDF_{Obs}^{-1} ( ECDF_{GCMhist} ( P_{GCMrcp} ) )        (Eq. 4-2)
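The following Python sketch illustrates the empirical mapping in Eq. 4 using sorted quantiles and interpolation; it is a simplified illustration, not the tool's exact implementation.

```python
# Compact sketch of empirical quantile mapping (Eq. 4), illustration only.
import numpy as np


def empirical_quantile_mapping(obs_hist, gcm_hist, gcm_target):
    """Estimate each target value's empirical quantile under the historical
    GCM distribution, then read the same quantile from the observed ECDF."""
    q = np.linspace(0.01, 0.99, 99)
    gcm_q = np.quantile(gcm_hist, q)
    obs_q = np.quantile(obs_hist, q)
    # ECDF_GCMhist(x): interpolate x onto its quantile level, then invert
    # with the observed quantiles (ECDF_Obs^-1).
    levels = np.interp(gcm_target, gcm_q, q)
    return np.interp(levels, q, obs_q)
```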

What is the downscaling concept?

The raw outputs of General Circulation Model (GCM) simulations may not be sufficient for accurately assessing impacts in hydrological, agricultural, and other studies. This limitation arises from the overly coarse spatial resolution of GCM outputs, typically around 250 km. To overcome this challenge, scientists employ various methods, with downscaling being a key solution. Downscaling serves as a bridge between coarse- and fine-scale climatic data, enabling more precise analysis and projections.

Downscaling can be carried out on both the spatial and temporal aspects of climate projections. Spatial downscaling focuses on deriving finer-resolution spatial climate information from the coarser-resolution GCM output; for example, it allows GCM output on 500-kilometre grid cells to be converted to a 20-kilometre resolution, or even to a specific location.

By using spatial downscaling techniques, researchers can obtain more localized and detailed climate data, which is vital for enhancing the accuracy and relevance of various studies related to climate impact assessments, resource management, and planning in diverse fields.

There are broadly two types of downscaling: Dynamical Downscaling (DD) and Statistical Downscaling (SD) (Christensen et al., 2007). DD nests a fine-scale climate model within a coarse-scale model and produces spatially complete fields of climate variables; in practice, a regional climate model (RCM) simulates the target region at finer scales within boundary conditions supplied by the larger GCM grid. DD is very computationally intensive, which limits its use in impact studies and makes multi-decade simulations essentially impossible. DD models are complex, demand computational resources on a similar level to GCM simulations, and their implementation is prone to error.

SD (Statistical Downscaling) methods do not require significant computational resources: they rely on relatively simple statistical relationships (such as regression analyses) and can run on an ordinary computer in very little time, so their ease of implementation makes them an attractive choice for many users. A large number of SD techniques have been developed, based on determining statistical relationships between large-scale synoptic predictors and local observations from ground stations. SD can produce site-specific climate projections, which DD cannot provide because it is computationally limited to a spatial resolution of about 20–50 kilometres. One advantage of SD techniques is that they are less computationally intensive and can therefore be used to downscale many GCM (or RCM) climate projections. Furthermore, compared with DD methods, SD is relatively easy to use and provides station-scale climate information from GCM-scale output.

In general, statistical methods can be divided into three categories: regression (transfer-function) methods (e.g. Kang et al., 2007), stochastic weather generators (Richardson, 1981), and weather-pattern schemes. Among the many statistical downscaling methods, one of the most popular is Bias Correction (BC), which has been applied extensively for impact assessment and climate change studies around the world (Wood et al., 2002; Payne et al., 2004). A good review of different BC approaches can be found in Themeßl et al. (2012).

The license for this tool is valid for one year of use and can be renewed for the following year at 20% of the purchase price.

Subscribe to our channel on YouTube: