Global climate models (GCMs) are fundamental tools for weather forecasting and climate prediction at different time scales, from intraseasonal prediction to climate change projections. Their design allows GCMs to simulate the global climate adequately, but they are not able to skillfully simulate local and regional climates. Consequently, downscaling and bias correction methods are increasingly applied to generate useful local and regional climate information from coarse GCM output.
Empirical-statistical downscaling (ESD) methods generate climate information at the local scale, or at a finer resolution than that achieved by GCMs, by means of empirical or statistical relationships between large-scale atmospheric variables and the local observed climate. As a counterpart approach, dynamical downscaling is based on regional climate models that simulate regional climate processes at a higher spatial resolution, using GCM fields as initial and boundary conditions.
Various ESD methods can be classified according to different criteria, depending on their approach, implementation, and application. In general terms, ESD methods can be categorized into subgroups that include transfer functions or regression models (either linear or nonlinear), weather generators, and weather typing methods and analogs. Although these methods can be grouped into different categories, they can also be combined to generate more sophisticated downscaling methods. In the last group, weather typing and analogs, the methods relate the occurrence of particular weather classes to local and regional weather conditions. In particular, the analog method is based on finding atmospheric states in the historical record that are similar to the atmospheric state on a given target day. Then, the corresponding historical local weather conditions are used to estimate local weather conditions on the target day.
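To make the procedure concrete, the following is a minimal sketch of analog downscaling in Python. The function names, the single-nearest-analog choice, the Euclidean similarity measure, and the synthetic data are all illustrative assumptions; practical implementations often average over several close analogs or weight grid points by relevance.

```python
import numpy as np

def analog_downscaling(target_field, historical_fields, historical_local):
    """Estimate local weather on a target day with the analog method.

    target_field      : large-scale predictor pattern for the target day
                        (1-D array, e.g., a flattened sea-level pressure field)
    historical_fields : (n_days, n_gridpoints) array of historical patterns
    historical_local  : (n_days,) array of the local predictand observed on
                        each historical day (e.g., station precipitation)
    """
    # Similarity measured here by Euclidean distance (an illustrative choice)
    distances = np.linalg.norm(historical_fields - target_field, axis=1)
    best = np.argmin(distances)          # the closest historical analog
    return historical_local[best]        # its local weather is the estimate

# Hypothetical data: 1,000 historical days and a 500-point predictor field
rng = np.random.default_rng(0)
fields = rng.standard_normal((1000, 500))
local_precip = rng.gamma(2.0, 1.5, size=1000)
print(analog_downscaling(fields[42], fields, local_precip))
```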
The analog method is a relatively simple technique that has been extensively used as a benchmark method in statistical downscaling applications. Easy to construct and applicable to any predictand variable, it has been shown to perform as well as other, more sophisticated methods. These attributes have inspired its application in diverse studies around the world that explore its ability to simulate different characteristics of regional climates.
Article
C.J.C. Reason
Southern Africa extends from the equator to about 34°S and is essentially a narrow, peninsular land mass bordered to its south, west, and east by oceans. Its termination in the mid-ocean subtropics has important consequences for regional climate, since it allows the strongest western boundary current in the world ocean (warm Agulhas Current) to be in close proximity to an intense eastern boundary upwelling current (cold Benguela Current). Unlike other western boundary currents, the Agulhas retroflects south of the land mass and flows back into the South Indian Ocean, thereby leading to a large area of anomalously warm water south of South Africa which may influence storm development over the southern part of the land mass. Two other unique regional ocean features imprint on the climate of southern Africa—the Angola-Benguela Frontal Zone (ABFZ) and the Seychelles-Chagos thermocline ridge (SCTR). The former is important for the development of Benguela Niños and flood events over southwestern Africa, while the SCTR influences Madden-Julian Oscillation and tropical cyclone activity in the western Indian Ocean. In addition to South Atlantic and South Indian Ocean influences, there are climatic implications of the neighboring Southern Ocean.
Along with Benguela Niños, the southern African climate is strongly impacted by ENSO and, to a lesser extent, by the Southern Annular Mode (SAM) and sea-surface temperature (SST) dipole events in the Indian and South Atlantic Oceans. The regional land–sea distribution leads to a highly variable climate on a range of scales that is still not well understood, due to its complexity and its sensitivity to a number of different drivers. Strong and variable gradients in surface characteristics exist not only in the neighboring oceans but also in several aspects of the land mass, and these all influence the regional climate and its interactions with climate modes of variability.
Much of the interior of southern Africa consists of a plateau 1 to 1.5 km high, bordered by a narrow coastal belt that is particularly mountainous in South Africa, resulting in sharp topographic gradients. This topography influences the track and development of many weather systems, leading to marked gradients in rainfall and vegetation across southern Africa.
The presence of the large island of Madagascar, itself a region of strong topographic and rainfall gradients, has consequences for the climate of the mainland by reducing the impact of the moist trade winds on the Mozambique coast and the likelihood of tropical cyclone landfall there. It is also likely that at least some of the relative aridity of the Limpopo region in northern South Africa/southern Zimbabwe results from the location of Madagascar in the southwestern Indian Ocean.
While leading to challenges in understanding its climate variability and change, the complex geography of southern Africa offers a very useful test bed for improving the global models used in many institutions for climate prediction. Thus, research into the relative shortcomings of the models in the southern African region may lead not only to better understanding of southern African climate but also to enhanced capability to predict climate globally.
Article
Frauke Feser
Storms are characterized by high wind speeds; often large precipitation amounts in the form of rain, freezing rain, or snow; and thunder and lightning (for thunderstorms). Many different types exist, ranging from tropical cyclones and large storms of the midlatitudes to small polar lows, Medicanes, thunderstorms, or tornadoes. They can lead to extreme weather events like storm surges, flooding, high snow quantities, or bush fires. Storms often pose a threat to human lives and property, agriculture, forestry, wildlife, ships, and offshore and onshore industries. Thus, it is vital to gain knowledge about changes in storm frequency and intensity. Future storm predictions are important, and they depend to a great extent on the evaluation of changes in wind statistics of the past.
To obtain reliable statistics, long and homogeneous time series spanning at least several decades are needed. However, wind measurements are frequently influenced by changes in the synoptic station, its location or surroundings, instruments, and measurement practices. These factors degrade the homogeneity of wind records. Storm indexes derived from measurements of sea-level pressure are less prone to such changes, as pressure does not exhibit as much small-scale spatial variability as wind speed. Long-term historical pressure measurements exist that enable us to deduce changes in storminess over more than the last 140 years. But storm records are not compiled from measurement data alone; they may also be inferred from climate model data.
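As an illustration, one simple family of pressure-based proxies counts large 24-hour pressure tendencies at a single station. The sketch below is a hedged example: the 16 hPa threshold, the 3-hourly sampling, and the random-walk test data are assumptions for illustration, not a standard definition.

```python
import numpy as np

def storminess_index(pressure_hpa, obs_per_day=8, threshold_hpa=16.0):
    """Count exceedances of a 24-hour pressure-tendency threshold.

    pressure_hpa  : station sea-level pressure (hPa), sampled `obs_per_day`
                    times per day (3-hourly here; an assumption)
    threshold_hpa : tendency magnitude taken as storm-indicative (illustrative)
    """
    # 24-hour pressure differences at every observation time
    tendency = pressure_hpa[obs_per_day:] - pressure_hpa[:-obs_per_day]
    # The index: how often the absolute tendency exceeds the threshold
    return int(np.sum(np.abs(tendency) >= threshold_hpa))

# Hypothetical year of 3-hourly pressure observations (invented data)
rng = np.random.default_rng(0)
pressure = 1013.0 + np.cumsum(rng.normal(0.0, 2.0, size=365 * 8))
print(storminess_index(pressure))
```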
The first numerical weather forecasts were performed in the 1950s. These served as a basis for the development of atmospheric circulation models, which were the first generation of climate models, or general-circulation models. Soon afterward, model data were analyzed for storm events, and cyclone-tracking algorithms were programmed. Climate models have nowadays reached high resolution and reliability and can be run not just for the past but also for future emission scenarios, yielding estimates of possible future storm activity.
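A toy version of such a tracking algorithm is sketched below: it detects local minima in a sea-level pressure field and links minima in consecutive time steps by nearest-neighbor matching. Real trackers are considerably more elaborate (great-circle distances, steering-flow constraints, vorticity criteria), and every threshold and data value here is illustrative.

```python
import numpy as np

def find_lows(slp, threshold=1000.0):
    """Grid indices (i, j) of local SLP minima below `threshold` hPa."""
    lows = []
    ni, nj = slp.shape
    for i in range(1, ni - 1):
        for j in range(1, nj - 1):
            window = slp[i - 1:i + 2, j - 1:j + 2]
            if slp[i, j] == window.min() and slp[i, j] < threshold:
                lows.append((i, j))
    return lows

def link_lows(prev_lows, curr_lows, max_step=5.0):
    """Pair lows in consecutive time steps by nearest-neighbor matching."""
    pairs = []
    for p in prev_lows:
        if not curr_lows:
            break
        dists = [np.hypot(p[0] - c[0], p[1] - c[1]) for c in curr_lows]
        k = int(np.argmin(dists))
        if dists[k] <= max_step:  # plausible displacement per step (grid units)
            pairs.append((p, curr_lows[k]))
    return pairs

# Hypothetical 2-D SLP fields (hPa) at two consecutive time steps
rng = np.random.default_rng(0)
slp_t0 = 1012.0 + rng.normal(0.0, 4.0, size=(40, 40))
slp_t1 = 1012.0 + rng.normal(0.0, 4.0, size=(40, 40))
print(link_lows(find_lows(slp_t0), find_lows(slp_t1)))
```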
Article
Rasmus Benestad
What are the local consequences of a global climate change? This question is important for the proper handling of risks associated with weather and climate. It also tacitly assumes that there is a systematic link between conditions on the global scale and local effects. This dependence of the local climate on the global picture is the backbone of downscaling; however, it is perhaps easiest to explain the concept of downscaling in climate research by starting with why it is necessary.
Global climate models are our best tools for computing future temperature, wind, and precipitation (and other climatological variables), but their limitations prevent them from calculating local details for these quantities. It is simply not adequate to interpolate from model results. However, the models are able to predict large-scale features, such as circulation patterns, the El Niño Southern Oscillation (ENSO), and the global mean temperature. Local temperature and precipitation are nevertheless related to conditions over a larger surrounding region as well as to local geographical features, and the same holds, in general, for other weather elements.
Downscaling makes use of systematic dependencies between local conditions and large-scale ambient phenomena in addition to including information about the effect of the local geography on the local climate. The application of downscaling can involve several different approaches. This article will discuss various downscaling strategies and methods and will elaborate on their rationale, assumptions, strengths, and weaknesses.
One important issue is the presence of spontaneous natural year-to-year variations that are not necessarily directly related to the global state, but are internally generated and superimposed on the long-term climate change. These variations typically involve phenomena such as ENSO, the North Atlantic Oscillation (NAO), and the Southeast Asian monsoon, which are nonlinear and non-deterministic.
We cannot predict the exact evolution of non-deterministic natural variations beyond a short time horizon. It is nevertheless possible to estimate probabilities for their future state, based for instance on projections with models run many times with slightly different setups, and thereby to gain some information about the likelihood of future outcomes.
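A hypothetical example of this kind of probability estimate: given an ensemble of runs started from slightly different initial states, the probability of an outcome can be approximated by the fraction of members in which it occurs. All numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical ensemble: winter-mean temperature anomalies (K) from 50 runs
# of the same model started from slightly different initial states
rng = np.random.default_rng(1)
ensemble = rng.normal(loc=1.2, scale=0.8, size=50)

# Probability of exceeding a +2 K anomaly, estimated as the fraction of
# ensemble members above that threshold
p_exceed = float(np.mean(ensemble > 2.0))
print(f"P(anomaly > 2 K) ~ {p_exceed:.2f}")
```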
When downscaling and predicting regional and local climate, it is important to use many global climate model simulations. It is equally important to apply proper validation to ensure that the models give skillful predictions.
For some downscaling approaches, such as regional climate models, there is usually a need for bias adjustment due to model imperfections. This means the downscaling does not get the right answer for the right reason. One explanation for the presence of biases in the results may be the use of different parameterization schemes in the driving global model and the nested regional model.
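As an illustration of what a simple bias adjustment can look like, the sketch below rescales a model run so that its historical period matches the observed mean and standard deviation, then applies the same mapping to the future run. This is a hypothetical minimal scheme chosen for brevity, not a recommended method, and all data are invented.

```python
import numpy as np

def mean_variance_adjust(model_hist, model_fut, obs_hist):
    """Rescale model output so its historical period matches the observed
    mean and standard deviation; apply the same mapping to the future run.
    (A hypothetical minimal scheme for illustration only.)
    """
    standardized = (model_fut - model_hist.mean()) / model_hist.std()
    return standardized * obs_hist.std() + obs_hist.mean()

# Hypothetical daily temperatures (K): a biased model vs. observations
rng = np.random.default_rng(2)
obs = rng.normal(283.0, 5.0, size=3650)
mod_hist = rng.normal(281.0, 7.0, size=3650)   # too cold and too variable
mod_fut = rng.normal(284.0, 7.0, size=3650)    # future run, same biases
print(mean_variance_adjust(mod_hist, mod_fut, obs).mean())
```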
A final underlying question is: What can we learn from downscaling? The context for the analysis is important, as downscaling is often used to find answers to some (implicit) question and can be a means of extracting most of the relevant information concerning the local climate. It is also important to include discussions about uncertainty, model skill or shortcomings, model validation, and skill scores.
Article
The environmental history of Venice over the last millennium has been reconstructed from written, pictorial, and architectural documentary sources, used in a synergistic way. The method of transforming a document into an index, and then into calibrated numerical values according to an international system of units, has been applied to Venice and its geographical and climatic peculiarities. Because frost constituted a dramatic challenge for the city, a series of severe winters is well documented: The city was besieged by ice, meaning Venetians had to cross the ice transporting food, beverages, and wood for burning in carts, as recorded in written reports and visual representations. The sea level in the 18th century has been reconstructed from paintings by Canaletto and Bellotto, who took advantage of a camera obscura to precisely draw the views of the city and its canals. These paintings accurately represent the green algae belt that corresponds to the level of soaking created by marine waters at high tide. This has made it possible to measure how much the green algae (and therefore the seawater) has risen since the 18th century. Similarly, a painting by Veronese has enabled the reconstruction of sea level rise (SLR) since 1571. Another useful proxy is the water stairs of the Venetian palaces. These were originally built to access boats and are now (almost) totally submerged and covered with algae. As the sea level rose, these steps became submerged, so the depth of the lowest step is representative of how much the sea level has risen since the stair was built. This proxy has allowed the relative sea level since 1350 to be reconstructed, and an exponential trend in the rising of the sea level has been identified. Venice has at times been flooded by seawater, including tsunamis at the beginning of the second millennium. A long series of sea floods due to storm surges triggered by particular meteorological situations shows that the flooding frequency is related to the exponential SLR. In the 1960s, there was a sharp increase in the frequency of flooding, which coincided with the digging of deep and wide canals, excavated to allow the passage of tankers; this increased the exchange of water between the sea and the lagoon. Proxies based on archaeological remains, as well as geological-biological cores extracted from the coastal area and dated with isotopic methods, cover long time periods, the longest record reaching 13 ka BP; however, their time resolution is reduced, providing data useful mainly for physical geography purposes.
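To illustrate how an exponential trend can be extracted from such stair-depth proxies, the sketch below fits a simple curve, in which the rate of rise grows toward the present, to invented (construction year, submergence) pairs. Both the data values and the functional form are hypothetical stand-ins for the reconstruction described above.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented (construction year, present submergence in cm) pairs for the
# lowest steps of hypothetical water stairs; not actual measurements
years = np.array([1350.0, 1500.0, 1650.0, 1750.0, 1850.0, 1950.0])
rise_cm = np.array([150.0, 120.0, 90.0, 70.0, 45.0, 20.0])

def rise_since(year, c, k):
    # Submergence of a stair built in `year`, assuming the rate of rise has
    # grown exponentially toward the present (functional form is an assumption)
    return c * (1.0 - np.exp(-k * (2000.0 - year)))

(c, k), _ = curve_fit(rise_since, years, rise_cm, p0=(200.0, 0.002))
print(f"fitted scale c = {c:.0f} cm, rate constant k = {k:.4f} per year")
```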
Article
Filippo Giorgi
Dynamical downscaling has been used for about 30 years to produce high-resolution climate information for studies of regional climate processes and for the production of climate information usable for vulnerability, impact assessment, and adaptation studies. Three dynamical downscaling tools are available in the literature: high-resolution global atmospheric models (HIRGCMs), variable-resolution global atmospheric models (VARGCMs), and regional climate models (RCMs). These techniques share their basic principles but have different underlying assumptions, advantages, and limitations. They have undergone tremendous growth in the last decades, especially RCMs, to the point that they are considered fundamental tools in climate change research. Major intercomparison programs have been implemented over the years, culminating in the Coordinated Regional climate Downscaling EXperiment (CORDEX), an international program aimed at producing fine-scale regional climate information based on multi-model and multi-technique approaches. These intercomparison projects have led to an increasing understanding of fundamental issues in climate downscaling and of the potential of downscaling techniques to provide actionable climate change information. Yet some open issues remain, most notably that of the added value of downscaling, and these are the focus of substantial current research. One of the primary future directions in dynamical downscaling is the development of fully coupled regional earth system models including multiple components, such as the atmosphere, the oceans, the biosphere, and the chemosphere. Within this context, dynamical downscaling models offer optimal testbeds to incorporate the human component in a fully interactive way. Another main future research direction is the transition to models running at convection-permitting scales, on the order of 1–3 km, for climate applications. This is a major modeling step that will require substantial development in research and infrastructure, and it will allow the description of local-scale processes and phenomena within the climate change context. Especially in view of these future directions, climate downscaling will increasingly constitute a fundamental interface between the climate modeling and end-user communities in support of climate service activities.
Article
Christopher K. Wikle
The climate system consists of interactions between physical, biological, chemical, and human processes across a wide range of spatial and temporal scales. Characterizing the behavior of components of this system is crucial for scientists and decision makers. There is substantial uncertainty associated with observations of this system as well as our understanding of various system components and their interaction. Thus, inference and prediction in climate science should accommodate uncertainty in order to facilitate the decision-making process. Statistical science is designed to provide the tools to perform inference and prediction in the presence of uncertainty. In particular, the field of spatial statistics considers inference and prediction for uncertain processes that exhibit dependence in space and/or time. Traditionally, this is done descriptively through the characterization of the first two moments of the process, one expressing the mean structure and one accounting for dependence through covariability.
Historically, there are three primary areas of methodological development in spatial statistics: geostatistics, which considers processes that vary continuously over space; areal or lattice processes, which are defined on a countable discrete domain (e.g., political units); and spatial point patterns (or point processes), which treat the locations of events in space as a random process. All of these methods have been used in the climate sciences, but the most prominent has been the geostatistical methodology. This methodology was discovered simultaneously in geology and in meteorology; it provides a way to perform optimal prediction (interpolation) in space and can facilitate parameter inference for spatial data. These methods rely strongly on Gaussian process theory, which is increasingly of interest in machine learning. They are common in the spatial statistics literature, but much development is still under way to accommodate more complex processes and “big data” applications. Newer approaches are based on restricting models to neighbor-based representations or on reformulating the random spatial process in terms of a basis expansion. These approaches offer many computational and flexibility advantages, depending on the specific implementation. Complexity is also increasingly being accommodated through the hierarchical modeling paradigm, which provides a probabilistically consistent way to decompose the data, process, and parameters corresponding to the spatial or spatio-temporal process.
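As a concrete illustration of geostatistical optimal prediction, the following is a minimal one-dimensional simple-kriging sketch (equivalently, zero-mean Gaussian-process regression). The exponential covariance, its parameters, and the toy station data are assumptions made for illustration.

```python
import numpy as np

def simple_kriging(x_obs, y_obs, x_new, sigma2=1.0, ell=2.0, nugget=1e-6):
    """Simple-kriging (zero-mean Gaussian process) prediction in 1-D.

    Assumes an exponential covariance C(h) = sigma2 * exp(-|h| / ell);
    the covariance family and its parameters are illustrative choices.
    """
    def cov(a, b):
        return sigma2 * np.exp(-np.abs(a[:, None] - b[None, :]) / ell)

    K = cov(x_obs, x_obs) + nugget * np.eye(len(x_obs))  # obs-obs covariance
    k = cov(x_obs, x_new)                                # obs-target covariance
    w = np.linalg.solve(K, k)                            # kriging weights
    pred = w.T @ y_obs                                   # best linear prediction
    var = sigma2 - np.sum(w * k, axis=0)                 # prediction variance
    return pred, var

# Hypothetical temperature anomalies at five station locations
x = np.array([0.0, 1.0, 3.0, 4.5, 7.0])
y = np.array([0.5, 0.8, -0.2, -0.5, 0.3])
grid = np.linspace(0.0, 7.0, 8)
mu, v = simple_kriging(x, y, grid)
print(np.round(mu, 2), np.round(v, 2))
```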
Perhaps the biggest challenge in modern applications of spatial and spatio-temporal statistics is to develop methods that are flexible, can account for the complex dependencies between and across processes, can account for uncertainty in all aspects of the problem, and are still computationally tractable. These are daunting challenges, yet this is a very active area of research, and new solutions are constantly being developed. New methods are also being rapidly developed in the machine learning community, and these methods are increasingly applicable to dependent processes. The interaction and cross-fertilization between the machine learning and spatial statistics communities is growing, which will likely lead to a new generation of spatial statistical methods that are applicable to climate science.
Article
Douglas Maraun
Global climate models are our main tool for generating quantitative climate projections, but these models do not resolve the effects of complex topography, regional-scale atmospheric processes, and small-scale extreme events. To understand potential regional climatic changes, and to provide information for regional-scale impact modeling and adaptation planning, downscaling approaches have been developed. Even though regional climate change modeling is still a matter of basic research and is questioned by many researchers, it is under pressure to provide operational results. One major downscaling class is statistical downscaling, which exploits empirical relationships between larger-scale and local weather. The main statistical downscaling approaches are perfect prog (often referred to as empirical-statistical downscaling), model output statistics (typically some sort of bias correction), and weather generators.
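For instance, a common model output statistics technique is empirical quantile mapping. The sketch below is a hedged minimal version: it builds a transfer function from historical model and observed quantiles and applies it to future model output. The number of quantiles, the linear interpolation, and the synthetic precipitation data are illustrative choices.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut, n_quantiles=99):
    """Empirical quantile mapping: map each future model value to the observed
    value at the same quantile of the historical model distribution.
    (The quantile count and linear interpolation are illustrative choices.)
    """
    q = np.linspace(0.01, 0.99, n_quantiles)
    model_q = np.quantile(model_hist, q)   # transfer-function input nodes
    obs_q = np.quantile(obs_hist, q)       # transfer-function output nodes
    # Values outside the calibration range are clipped to the end quantiles
    return np.interp(model_fut, model_q, obs_q)

# Hypothetical daily precipitation (mm): the model drizzles too often
rng = np.random.default_rng(3)
obs = rng.gamma(0.8, 6.0, size=3650)
mod_hist = rng.gamma(1.5, 2.5, size=3650)
mod_fut = rng.gamma(1.5, 3.0, size=3650)
print(quantile_map(mod_hist, obs, mod_fut).mean())
```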
Statistical downscaling complements dynamical downscaling and is useful for generating user-tailored local-scale information, or for efficiently generating regional-scale information about mean climatic changes from large global climate model ensembles. Further research is needed to assess to what extent the assumptions underlying statistical downscaling are met in typical applications, to develop new methods for generating spatially coherent projections, and to include process understanding in bias correction. The increasing resolution of global climate models will improve the representation of downscaling predictors and will therefore make downscaling an even more feasible approach, one that will still be required to tailor information for users.
Article
Benjamin Mark Sanderson
Long-term planning for many sectors of society—including infrastructure, human health, agriculture, food security, water supply, insurance, conflict, and migration—requires an assessment of the range of possible futures which the planet might experience. Unlike short-term forecasts, for which validation data exist for comparing forecast to observation, long-term forecasts have almost no validation data. As a result, researchers must rely on supporting evidence to make their projections. A review of methods for quantifying the uncertainty of climate predictions is given here. The primary tools for quantifying these uncertainties are climate models, which attempt to represent all the processes that are important in climate change. However, neither the construction nor the calibration of climate models is perfect, and therefore the uncertainties due to model errors must also be taken into account in the uncertainty quantification.
Typically, prediction uncertainty is quantified by generating ensembles of solutions from climate models to span possible futures. For instance, initial condition uncertainty is quantified by generating an ensemble of initial states that are consistent with available observations and then integrating the climate model starting from each initial condition. A climate model is itself subject to uncertain choices in modeling certain physical processes. Some of these choices can be sampled using so-called perturbed physics ensembles, whereby uncertain parameters or structural switches are perturbed within a single climate model framework. For a variety of reasons, there is a strong reliance on so-called ensembles of opportunity, which are multi-model ensembles (MMEs) formed by collecting predictions from different climate modeling centers, each using a potentially different framework to represent relevant processes for climate change. The most extensive collection of these MMEs is associated with the Coupled Model Intercomparison Project (CMIP). However, the component models have biases, simplifications, and interdependencies that must be taken into account when making formal risk assessments. Techniques and concepts for integrating model projections in MMEs are reviewed, including differing paradigms of ensembles and how they relate to observations and reality. Aspects of these conceptual issues then inform the more practical matters of how to combine and weight model projections to best represent the uncertainties associated with projected climate change.
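As an illustration of one simple weighting paradigm, the sketch below down-weights models whose historical simulation departs from an observed benchmark, using a Gaussian skill kernel. The kernel, its width, and all numbers are hypothetical, and published weighting schemes typically also penalize interdependence between models.

```python
import numpy as np

def performance_weights(hist_bias, sigma=0.5):
    """Weight models by historical skill with a Gaussian kernel.

    hist_bias : per-model bias against an observed benchmark
    sigma     : kernel width controlling how sharply skill is rewarded
                (a hypothetical choice; real schemes also penalize
                interdependence between models)
    """
    w = np.exp(-(hist_bias / sigma) ** 2)
    return w / w.sum()

# Invented multi-model ensemble: historical bias (K) and projected warming (K)
hist_bias = np.array([0.1, -0.4, 0.8, 0.2, -0.1])
projection = np.array([2.8, 3.5, 4.1, 3.0, 2.6])

w = performance_weights(hist_bias)
mean = float(np.sum(w * projection))
spread = float(np.sqrt(np.sum(w * (projection - mean) ** 2)))
print(f"weighted projection: {mean:.2f} +/- {spread:.2f} K")
```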