Bjørn H. Samset
Among the factors that affect the climate, few are as diverse and challenging to understand as aerosols. Minute particles suspended in the atmosphere, aerosols are emitted through a wide range of natural and industrial processes, and are transported around the globe by winds and weather. Once airborne, they affect the climate both directly, through scattering and absorption of solar radiation, and indirectly, through their impact on cloud properties. Combining all their effects, anthropogenic changes to aerosol concentrations are estimated to have had a climate impact over the industrial era that is second only to CO2. Their atmospheric lifetime of only a few days, however, makes their climate effects substantially different from those of well-mixed greenhouse gases.
Major aerosol types include sea salt, dust, sulfate compounds, and black carbon—or soot—from incomplete combustion. Of these, most scatter incoming sunlight back to space, and thus mainly cool the climate. Black carbon, however, absorbs sunlight, and therefore acts as a heating agent much like a greenhouse gas. Furthermore, aerosols can act as cloud condensation nuclei, causing clouds to become whiter—and thus more reflective—further cooling the surface. Black carbon is again a special case, acting to change the stability of the atmosphere through local heating of the upper air, and also changing the albedo of the surface when it is deposited on snow and ice, for example.
The wide range of climate interactions that aerosols have, and the fact that their distribution depends on the weather at the time and location of emission, lead to large uncertainties in the scientific assessment of their impact. This in turn leads to uncertainties in our present understanding of the climate sensitivity, because while aerosols have predominantly acted to oppose 20th-century global warming by greenhouse gases, the magnitude of aerosol effects on climate is highly uncertain.
Finally, aerosols are important for large-scale climate events such as major volcanoes, or the threat of nuclear winter. The relative ease with which they can be produced and distributed has led to suggestions for using targeted aerosol emissions to counteract global warming—so-called climate engineering.
Stefano Tibaldi and Franco Molteni
The atmospheric circulation in the mid-latitudes of both hemispheres is usually dominated by westerly winds and by planetary-scale and shorter-scale synoptic waves, moving mostly from west to east. A remarkable and frequent exception to this “usual” behavior is atmospheric blocking. Blocking occurs when the usual zonal flow is hindered by the establishment of a large-amplitude, quasi-stationary, high-pressure meridional circulation structure which “blocks” the flow of the westerlies and the progression of the atmospheric waves and disturbances embedded in them. Such blocking structures can have lifetimes varying from a few days to several weeks in the most extreme cases. Their presence can strongly affect the weather of large portions of the mid-latitudes, leading to the establishment of anomalous meteorological conditions. These can take the form of strong precipitation episodes or persistent anticyclonic regimes, leading in turn to floods, extreme cold spells, heat waves, or short-lived droughts. Even air quality can be strongly influenced by the establishment of atmospheric blocking, with episodes of high concentrations of low-level ozone in summer and of particulate matter and other air pollutants in winter, particularly in highly populated urban areas.
Atmospheric blocking has the tendency to occur more often in winter and in certain longitudinal quadrants, notably the Euro-Atlantic and the Pacific sectors of the Northern Hemisphere. In the Southern Hemisphere, blocking episodes are generally less frequent, and the longitudinal localization is less pronounced than in the Northern Hemisphere.
Blocking has aroused the interest of atmospheric scientists since the middle of the last century, with the pioneering observational works of Berggren, Bolin, Rossby, and Rex, and has become the subject of innumerable observational and theoretical studies. The purpose of such studies was originally to find a commonly accepted structural and phenomenological definition of atmospheric blocking. The investigations went on to study blocking climatology in terms of the geographical distribution of its frequency of occurrence and the associated seasonal and inter-annual variability. Well into the second half of the 20th century, a large number of theoretical dynamic works on blocking formation and maintenance started appearing in the literature. Such theoretical studies explored a wide range of possible dynamic mechanisms, including large-amplitude planetary-scale wave dynamics (such as Rossby wave breaking), multiple equilibria circulation regimes, large-scale forcing of anticyclones by synoptic-scale eddies, finite-amplitude non-linear instability theory, and the influence of sea surface temperature anomalies, to name but a few. However, to date no unique theoretical model of atmospheric blocking has been formulated that can account for all of its observational characteristics.
When numerical, global short- and medium-range weather predictions started being produced operationally, and with the establishment, in the late 1970s and early 1980s, of the European Centre for Medium-Range Weather Forecasts, it quickly became of relevance to assess the capability of numerical models to predict blocking with the correct space-time characteristics (e.g., location, time of onset, life span, and decay). Early studies showed that models had difficulty representing blocking correctly, a problem connected with their large systematic (mean) errors.
Despite enormous improvements in the ability of numerical models to represent atmospheric dynamics, blocking remains a challenge for global weather prediction and climate simulation models. Such modeling deficiencies have negative consequences not only for our ability to represent the observed climate but also for the possibility of producing high-quality seasonal-to-decadal predictions. For such predictions, representing the correct space-time statistics of blocking occurrence is, especially for certain geographical areas, extremely important.
Climate and simulation have become interwoven concepts during the past decades because, on the one hand, climate scientists shouldn’t experiment with real climate and, on the other hand, societies want to know how climate will change in the next decades. Both in-silico experiments for a better understanding of climatic processes as well as forecasts of possible futures can be achieved only by using climate models. The article investigates possibilities and problems of model-mediated knowledge for science and societies. It explores historically how climate became a subject of science and of simulation, what kind of infrastructure is required to apply models and simulations properly, and how model-mediated knowledge can be evaluated. In addition to an overview of the diversity and variety of models in climate science, the article focuses on quasiheuristic climate models, with an emphasis on atmospheric models.
Climate and the carbon cycle are tightly coupled on many time scales, from the interannual to the multimillennial. Observations consistently show a positive feedback between climate and the carbon cycle: elevated atmospheric CO2 leads to warming, but warming leads to further release of carbon to the atmosphere, enhancing the atmospheric CO2 increase. Earth system models represent these climate–carbon cycle feedbacks, consistently simulating a positive feedback over the 21st century; that is, climate change will lead to loss of carbon from the land and ocean reservoirs. These processes partially offset the increases in land and ocean carbon sinks caused by rising atmospheric CO2. As a result, more of the emitted anthropogenic CO2 will remain in the atmosphere. There is, however, a large uncertainty in the magnitude of this feedback. Recent studies now help to reduce this uncertainty. On short, interannual, time scales, El Niño years record larger-than-average atmospheric CO2 growth rates, with tropical land ecosystems being the main drivers. These climate–carbon cycle anomalies can be used as an emergent constraint on the tropical land carbon response to future climate change. On a longer, centennial, time scale, the variability of atmospheric CO2 found in records of the last millennium can be used to constrain the overall global carbon cycle response to climate. These independent methods confirm that the climate–carbon cycle feedback is positive, but probably more consistent with the lower end of the range spanned by comprehensive models, excluding very large climate–carbon cycle feedbacks.
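The emergent-constraint method described above can be sketched as a cross-model regression: each model supplies both an observable quantity (here, its interannual CO2 growth-rate sensitivity) and the target quantity (its long-term carbon–climate feedback), and the observed value of the former then constrains the latter. The numbers below are synthetic and purely illustrative, not values from the studies the text summarizes:

```python
# Hypothetical (observable, feedback) pairs, one per model, arbitrary units:
models = [(3.0, 30.0), (4.5, 48.0), (5.5, 62.0), (6.5, 70.0),
          (7.5, 85.0), (9.0, 96.0)]

# Least-squares fit of feedback = a * observable + b across the models:
n = len(models)
sx = sum(x for x, _ in models)
sy = sum(y for _, y in models)
sxx = sum(x * x for x, _ in models)
sxy = sum(x * y for x, y in models)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

# Reading the regression line at the (hypothetical) observed value of the
# observable gives the constrained estimate of the long-term feedback:
observed = 4.0
constrained_feedback = a * observed + b
```

Because the observable sits at the low end of the model range in this toy example, the constrained feedback falls near the lower end of the model spread, mirroring the conclusion quoted in the text.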
Kerry H. Cook
Accurate projections of climate change under increasing atmospheric greenhouse gas levels are needed to evaluate the environmental cost of anthropogenic emissions, and to guide mitigation efforts. These projections are nowhere more important than in Africa, with its high dependence on rain-fed agriculture and, in many regions, limited resources for adaptation. Climate models provide our best method for climate prediction, but there are uncertainties in projections, especially on regional space scales. In Africa, limitations of observational networks add to this uncertainty, since a crucial step in improving model projections is comparison with observations. Exceeding the uncertainties associated with climate model simulation are those due to projections of future emissions of CO2 and other greenhouse gases. Humanity’s choices in emissions pathways will have profound effects on climate, especially after the mid-century.
The African Sahel is a transition zone characterized by strong meridional precipitation and temperature gradients. Over West Africa, the Sahel marks the northernmost extent of the West African monsoon system. The region’s climate is known to be sensitive to sea surface temperatures, both regional and global, as well as to land surface conditions. Increasing atmospheric greenhouse gases are already causing amplified warming over the Sahara Desert and, consequently, increased rainfall in parts of the Sahel. Climate model projections indicate that much of this increased rainfall will be delivered in the form of more intense storm systems.
The complicated and highly regional precipitation regimes of East Africa present a challenge for climate modeling. Within roughly 5° of latitude of the equator, rainfall is delivered in two seasons—the long rains in the spring, and the short rains in the fall. Regional climate model projections suggest that the long rains will weaken under greenhouse gas forcing, and the short rains season will extend farther into the winter months. Observations indicate that the long rains are already weakening.
Changes in seasonal rainfall over parts of subtropical southern Africa are observed, with repercussions and challenges for agriculture and water availability. Some elements of these observed changes are captured in model simulations of greenhouse gas-induced climate change, especially an early demise of the rainy season. The projected changes are quite regional, however, and more high-resolution study is needed. In addition, there has been very limited study of climate change in the Congo Basin and across northern Africa. Continued efforts to understand and predict climate using higher-resolution simulation must be sustained to better understand observed and projected changes in the physical processes that support African precipitation systems as well as the teleconnections that communicate remote forcings into the continent.
Storms are characterized by high wind speeds; often large precipitation amounts in the form of rain, freezing rain, or snow; and thunder and lightning (for thunderstorms). Many different types exist, ranging from tropical cyclones and large storms of the midlatitudes to small polar lows, Medicanes, thunderstorms, or tornadoes. They can lead to extreme weather events like storm surges, flooding, high snow quantities, or bush fires. Storms often pose a threat to human lives and property, agriculture, forestry, wildlife, ships, and offshore and onshore industries. Thus, it is vital to gain knowledge about changes in storm frequency and intensity. Future storm predictions are important, and they depend to a great extent on the evaluation of changes in wind statistics of the past.
To obtain reliable statistics, long and homogeneous time series over at least some decades are needed. However, wind measurements are frequently influenced by changes in the synoptic station, its location or surroundings, instruments, and measurement practices. These factors deteriorate the homogeneity of wind records. Storm indexes derived from measurements of sea-level pressure are less prone to such changes, as pressure does not show as much spatial variability as wind speed does. Long-term historical pressure measurements exist that enable us to deduce changes in storminess for more than the last 140 years. But storm records are not just compiled from measurement data; they also may be inferred from climate model data.
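One common pressure-based storminess proxy is the geostrophic wind speed computed from simultaneous sea-level pressure readings at a triangle of stations: the three readings define a pressure plane, whose gradient yields the geostrophic wind. The sketch below is illustrative; the function name, the constant air density, and the sample station values are assumptions, not taken from the article:

```python
import math

RHO = 1.25        # air density, kg m^-3 (assumed constant)
OMEGA = 7.292e-5  # Earth's rotation rate, rad s^-1

def geostrophic_speed(stations, lat_deg):
    """Geostrophic wind speed (m/s) from sea-level pressure (Pa) at three
    stations given in local Cartesian coordinates (x, y) in metres.
    stations: list of three (x, y, pressure) tuples."""
    (x1, y1, p1), (x2, y2, p2), (x3, y3, p3) = stations
    # Fit the plane p = a + b*x + c*y exactly through the three readings;
    # b and c are the horizontal pressure gradients dp/dx and dp/dy.
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    dpdx = ((p2 - p1) * (y3 - y1) - (p3 - p1) * (y2 - y1)) / det
    dpdy = ((p3 - p1) * (x2 - x1) - (p2 - p1) * (x3 - x1)) / det
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))  # Coriolis parameter
    return math.hypot(dpdx, dpdy) / (RHO * abs(f))

# Three stations ~300 km apart with a 10 hPa pressure difference, at 55°N:
v = geostrophic_speed([(0.0, 0.0, 101000.0),
                       (300e3, 0.0, 100000.0),
                       (0.0, 300e3, 100500.0)], lat_deg=55.0)
```

High percentiles of such geostrophic speeds over a season or a year then serve as a storminess index that is far less sensitive to station relocations than direct wind measurements.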
The first numerical weather forecasts were performed in the 1950s. These served as a basis for the development of atmospheric circulation models, which were the first generation of climate models, or general-circulation models. Soon afterward, model data were analyzed for storm events, and cyclone-tracking algorithms were programmed. Climate models nowadays have reached high resolution and reliability and can be run not just for the past, but also for future emission scenarios, yielding estimates of possible future storm activity.
William Joseph Gutowski and Filippo Giorgi
Regional climate downscaling has been motivated by the objective of understanding how climate processes not resolved by global models can influence the evolution of a region’s climate, and by the need to provide climate change information to other sectors, such as water resources, agriculture, and human health, on scales poorly resolved by global models but where impacts are felt. There are four primary approaches to regional downscaling: regional climate models (RCMs), empirical statistical downscaling (ESD), variable resolution global models (VARGCM), and “time-slice” simulations with high-resolution global atmospheric models (HIRGCM). Downscaling using RCMs is often referred to as dynamical downscaling to contrast it with statistical downscaling. Although there have been efforts to coordinate each of these approaches, the predominant effort to coordinate regional downscaling activities has involved RCMs.
Initially, downscaling activities were directed toward specific, individual projects. Typically, there was little similarity between these projects in terms of focus region, resolution, time period, boundary conditions, and phenomena of interest. The lack of coordination hindered evaluation of downscaling methods, because sources of success or problems in downscaling could be specific to model formulation, phenomena studied, or the method itself. This prompted the organization of the first dynamical-downscaling intercomparison projects in the 1990s and early 2000s. These programs and several others following provided coordination focused on an individual region and an opportunity to understand sources of differences between downscaling models while overall illustrating the capabilities of dynamical downscaling for representing climatologically important regional phenomena. However, coordination between programs was limited.
Recognition of the need for further coordination led to the formation of the Coordinated Regional Downscaling Experiment (CORDEX) under the auspices of the World Climate Research Programme (WCRP). Initial CORDEX efforts focused on establishing a common framework for carrying out dynamically downscaled simulations over multiple regions around the world. This framework has now become an organizing structure for downscaling activities around the world. Further efforts under the CORDEX program have strengthened its scientific motivations, addressing topics such as added value in downscaling, regional human influences on climate, coupled ocean–land–atmosphere modeling, precipitation systems, extreme events, and local wind systems. In addition, CORDEX is promoting expanded efforts to compare capabilities of all downscaling methods for producing regional information. The efforts are motivated in part by the scientific goal to understand thoroughly regional climate and its change and by the growing need for climate information to assist climate services for a multitude of climate-impacted sectors.
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Climate Science.
Dynamical downscaling (DD) consists of the use of physical models to downscale the large-scale climate information produced by coupled Atmosphere-Ocean Global Climate Models (AOGCMs). This can be achieved with global high-resolution atmospheric GCMs (HIRGCMs), variable resolution GCMs (VARGCMs) and limited area Regional Climate Models (RCMs). Borrowing from numerical weather prediction, DD techniques originated in the late 1980s from the need to produce high-resolution regional climate information for application to impact studies. The philosophy behind DD is that the AOGCM can simulate the response of the global circulation to large-scale forcings (e.g., due to greenhouse gases) and the DD tools can regionally enhance this response to account for the contribution of fine-scale processes and forcings, for example, due to aerosols and complex topography, coastlines, and vegetation cover.
Since the 1990s the use of DD for climate studies, and principally RCMs, has grown tremendously, to the point that DD techniques, along with Empirical-Statistical Downscaling (ESD), are considered key elements in the production of climate information for regions. In fact, the use of DD is justified to the extent that it adds useful and robust high-resolution information to that produced by AOGCMs, and considerable research has gone into investigating this central issue, often referred to as “added value,” which is still often debated. Today a number of flexible and portable RCM systems are available, which can be routinely run for up to centennial-scale experiments over domains distributed worldwide for a wide range of applications, from process studies to paleo and future climate simulations. The model resolution has steadily increased up to grid spacings of ~10–25 km, and a new generation of non-hydrostatic RCMs is being developed and tested for use in very-high-resolution (~ few km) convection-permitting simulations. In addition, the development of coupled regional earth system models is a new frontier area of research aimed at exploring the importance of air-sea-land interactions at regional scales.
A fundamental step toward a better understanding of DD techniques has been the inception of multimodel intercomparison studies. These were originally regional in nature, which prevented the application of common protocols and thus hindered the transfer of know-how across projects. However, this problem was addressed through the creation in the late 2000s of the Coordinated Regional Climate Downscaling Experiment (CORDEX), which provided a common simulation protocol across regions worldwide, representing a fundamental growth step for the DD community.
Different DD and ESD techniques have often been seen as competing with each other, and with AOGCMs. However, the realization is growing that they all represent complementary pieces of the puzzle of generating robust and credible climate services to address the needs and concerns of different regions, countries, and societal sectors. DD will continue to be increasingly used in the generation of actionable climate information, but a solid understanding of the advantages and limitations of DD is paramount to its use in this process.
Forecasting severe convective weather remains one of the most challenging tasks facing operational meteorology today, especially in the mid-latitudes, where severe convective storms occur most frequently and with the greatest impact. The forecast difficulties reflect, in part, the many different atmospheric processes of which severe thunderstorms are a by-product. These processes occur over a wide range of spatial and temporal scales, some of which are poorly understood and/or are inadequately sampled by observational networks. Therefore, anticipating the development and evolution of severe thunderstorms will likely remain an integral part of national and local forecasting efforts well into the future.
Modern severe weather forecasting began in the 1940s, primarily employing the pattern recognition approach throughout the 1950s and 1960s. Substantial changes in forecast approaches did not come until much later, however, beginning in the 1980s. By the start of the new millennium, significant advances in the understanding of the physical mechanisms responsible for severe weather enabled forecasts of greater spatial and temporal detail. At the same time, technological advances made available model thermodynamic and wind profiles that supported probabilistic forecasts of severe weather threats.
This article provides an updated overview of operational severe local storm forecasting, with emphasis on present-day understanding of the mesoscale processes responsible for severe convective storms, and the application of recent technological developments that have revolutionized some aspects of severe weather forecasting. The presentation, nevertheless, notes that increased understanding and enhanced computer sophistication are not a substitute for careful diagnosis of the current meteorological environment and an ingredients-based approach to anticipating changes in that environment; these techniques remain foundational to successful forecasts of tornadoes, large hail, damaging wind, and flash flooding.
R. J. Trapp
Cumulus clouds are pervasive on Earth, and play important roles in the transfer of energy through the atmosphere. Under certain conditions, shallow, nonprecipitating cumuli may grow vertically to occupy a significant depth of the troposphere, and subsequently may evolve into convective storms.
The qualifier “convective” implies that the storms have vertical accelerations that are driven primarily, though not exclusively, by buoyancy over a deep layer. Such buoyancy in the atmosphere arises from local density variations relative to some base state density; the base state is typically idealized as a horizontal average over a large area, which is also considered the environment. Quantifications of atmospheric buoyancy are typically expressed in terms of temperature and humidity, and allow for an assessment of the likelihood that convective clouds will form or initiate. Convection initiation is intimately linked to existence of a mechanism by which air is vertically lifted to realize this buoyancy and thus accelerations. Weather fronts and orography are the canonical lifting mechanisms.
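The buoyancy described above can be quantified as B = g (Tv,parcel − Tv,env)/Tv,env, where the virtual temperature Tv = T(1 + 0.61 q) folds humidity into the density comparison between a parcel and its environment. A minimal sketch, assuming standard round-number constants and sample values not taken from the article:

```python
G = 9.81        # gravitational acceleration, m s^-2
EPSILON = 0.61  # virtual-temperature coefficient for specific humidity

def virtual_temperature(t_kelvin, q_kg_per_kg):
    """Virtual temperature: the temperature dry air would need in order to
    match the density of the given moist air at the same pressure."""
    return t_kelvin * (1.0 + EPSILON * q_kg_per_kg)

def buoyancy(t_parcel, q_parcel, t_env, q_env):
    """Buoyant acceleration (m s^-2) of a lifted parcel relative to its
    environment; positive values mean upward acceleration."""
    tv_parcel = virtual_temperature(t_parcel, q_parcel)
    tv_env = virtual_temperature(t_env, q_env)
    return G * (tv_parcel - tv_env) / tv_env

# A parcel 1 K warmer and slightly moister than its surroundings:
b = buoyancy(t_parcel=301.0, q_parcel=0.016, t_env=300.0, q_env=0.014)
```

Integrating positive buoyancy of this kind over the depth of the troposphere gives the familiar convective available potential energy used to assess whether convective clouds will form.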
As modulated by an ambient or environmental distribution of temperature, humidity, and wind, weather fronts also facilitate the transition of convective clouds into storms with locally heavy rain, lightning, and other possible hazards. For example, in an environment characterized by winds that are weak and change little with distance above the ground, the storms tend to be short lived and benign. The structure of the vertical drafts and other internal storm processes under weak wind shear—i.e., a small change in the horizontal wind over some vertical distance—are distinct relative to those when the environmental wind shear is strong. In particular, strong wind shear in combination with large buoyancy favors the development of squall lines and supercells, both of which are highly coherent storm types. Besides having durations that may exceed a few hours, both of these storm types tend to be particularly hazardous: squall lines are most apt to generate swaths of damaging “straight-line” winds, and supercells spawn the most intense tornadoes and are responsible for the largest hail. Methods used to predict convective-storm hazards capitalize on this knowledge of storm formation and development.
Since the dawn of the digital computing age in the mid-20th century, computers have been used as virtual laboratories for the study of atmospheric phenomena. The first simulations of thunderstorms captured only their gross features, yet required the most advanced computing hardware of the time. The following decades saw exponential growth in computational power that was, and continues to be, exploited by scientists seeking to answer fundamental questions about the internal workings of thunderstorms, the most devastating of which cause substantial loss of life and property throughout the world every year.
By the mid-1970s, the most powerful computers available to scientists contained, for the first time, enough memory and computing power to represent the atmosphere containing a thunderstorm in three dimensions. Prior to this time, thunderstorms were represented primarily in two dimensions, which implicitly assumed an infinitely long cloud in the missing dimension. These earliest state-of-the-art, fully three-dimensional simulations revealed fundamental properties of thunderstorms, such as the structure of updrafts and downdrafts and the evolution of precipitation, while still only roughly approximating the flow of an actual storm due to computing limitations.
In the decades that followed these pioneering three-dimensional thunderstorm simulations, new modeling approaches were developed that included more accurate ways of representing winds, temperature, pressure, friction, and the complex microphysical processes involving solid, liquid, and gaseous forms of water within the storm. Further, these models were able to be run at higher resolution than previous studies, owing to the steady growth of available computational resources described by Moore’s law, which observed that computing power doubled roughly every two years. The resolution of thunderstorm models was increased to the point where features on the order of a couple hundred meters could be resolved, allowing small but intense features such as downbursts and tornadoes to be simulated within the parent thunderstorm. As model resolution increased further, so did the amount of data produced by the models, which presented a significant challenge to scientists trying to compare their simulated thunderstorms to observed ones. Visualization and analysis software was developed and refined in tandem with improved modeling and computing hardware, bringing the simulated data to life and allowing direct comparison with observed storms. In 2019, the highest-resolution simulations of violent thunderstorms are able to capture processes such as tornado formation and evolution, which are found to include the aggregation of many small, weak vortices with diameters of dozens of meters, features that simply cannot be simulated at lower resolution.
A typhoon is a highly organized storm system that develops from initial cyclone eddies and matures by drawing up large quantities of water vapor from the warm tropical oceans, which condenses at higher altitudes. This latent heat of condensation is the prime source of the energy supply that strengthens the typhoon as it progresses across the Pacific Ocean. A typhoon differs from other tropical cyclones only on the basis of location. While hurricanes form in the Atlantic Ocean and eastern North Pacific Ocean, typhoons develop in the western North Pacific around the Philippines, Japan, and China.
Because of their violent histories of strong winds and torrential rains and their impact on society, the countries that ring the North Pacific basin—China, Japan, Korea, the Philippines, and Taiwan—have all felt the need to produce typhoon forecasts and establish storm warning services. Typhoon accounts in the pre-instrumental era were normally limited to descriptions of damage and incidents, and subsequent studies were hampered by the impossibility of solving the equations governing the weather, as they are distinctly nonlinear. The world’s first typhoon forecast was made in 1879 by Fr. Federico Faura, a Jesuit scientist from the Manila Observatory. His fellow Jesuit at the Zikawei Jesuit Observatory, Fr. Marc Dechevrens, first reconstructed the trajectory of a typhoon in 1879, a study that marked the beginning of an era. The Jesuits and other Europeans like William Doberck studied typhoons as a research topic, and their achievements are regarded as products of colonial meteorology.
Between the First and Second World Wars, there were important contributions to typhoon science by meteorologists in the Philippines (Ch. Deppermann, M. Selga, and J. Coronas), China (E. Gherzi), and Japan (T. Okada, and Y. Horiguti). The polar front theory developed by the Bergen School in Norway played an important role in creating the large-scale setting for tropical cyclones. Deppermann became the greatest exponent of the polar front theory and air-masses analysis in the Far East and Southeast Asia.
From the end of WWII, it became evident that more effective typhoon forecasts were needed to meet military demands. In Hawaii, a joint Navy and Air Force center for typhoon analysis and forecasting was established in 1959—the Joint Typhoon Warning Center (JTWC). Its goals were to publish annual typhoon summaries and conduct research into tropical cyclone forecasting and detection. Other centers had previously specialized in issuing typhoon warnings and analysis. Thus, research and operational forecasting went hand in hand not only in the American JTWC but also in China (the Hong Kong Observatory, the Macao Meteorological and Geophysical Bureau), Japan (the Regional Specialized Meteorological Center), and the Philippines (Atmospheric, Geophysical and Astronomical Service Administration [PAGASA]). These efforts produced more precise scientific knowledge about the formation, structure, and movement of typhoons. In the 1970s and the 1980s, three new tools for research—three-dimensional numerical cloud models, Doppler radar, and geosynchronous satellite imagery—provided a new observational and dynamical perspective on tropical cyclones. The development of modern computing systems has offered the possibility of making numerical weather forecast models and simulations of tropical cyclones. However, typhoons are not mechanical artifacts, and forecasting their track and intensity remains an uncertain science.
The Sahel of Africa has been identified as having the strongest land–atmosphere (L/A) interactions on Earth. Sahelian L/A interaction studies started in the late 1970s. However, due to controversies surrounding the early studies, in which only a single land parameter was considered, the credibility of land-surface effects on the Sahel’s climate has long been challenged. The effects of land-surface processes on West African monsoon variability, which dominates the Sahel climate system at intraseasonal, seasonal, interannual, and decadal scales, as well as at the mesoscale, have since been extensively investigated. These studies have used general circulation models and regional climate models coupled with biogeophysical and dynamic vegetation models, along with analyses of satellite-derived data, field measurements, and assimilation data, to realistically explore the Sahel L/A interaction: its effects and the mechanisms involved.
The Sahel suffered the longest and most severe drought on the planet in the 20th century. The devastating environmental and socioeconomic consequences of drought-induced famines in the Sahel have provided strong motivation for the scientific community and society to understand the causes of the drought and its impact. It was long debated whether the drought was a natural process, mainly induced by sea-surface temperature variability, or was affected by anthropogenic activities. Diagnostic and modeling studies have consistently demonstrated that sea-surface temperature exerts great influence on the Sahel climate system, but it is unable to explain the full scope of the Sahel climate variability and the late-20th-century drought. The effect of land-surface processes, especially land-cover and land-use change, on the drought has also been extensively investigated. The results with more realistic land-surface models suggest land processes are a first-order contributor to the Sahel climate and to its drought from the late 1960s to the 1980s, comparable to sea-surface temperature effects. The issues that caused controversies in the early studies have been properly addressed in studies with state-of-the-art models and available data.
The mechanisms through which land processes affect the atmosphere have also been elucidated in a number of studies. Land-surface processes affect not only the vertical transfer of radiative and heat fluxes but also horizontal advection, through their effects on the atmospheric heating rate, moisture flux convergence/divergence, and horizontal temperature gradients.
Florian Sévellec and Bablu Sinha
The Atlantic meridional overturning circulation (AMOC) is a large, basin-scale circulation located in the Atlantic Ocean that transports climatically important quantities of heat northward. It can be described schematically as a northward flow in the warm upper ocean and a southward return flow at depth in much colder water. The heat capacity of a layer of 2 m of seawater is equivalent to that of the entire atmosphere; therefore, ocean heat content dominates Earth’s energy storage. For this reason and because of the AMOC’s typically slow decadal variations, the AMOC regulates North Atlantic climate and contributes to the relatively mild climate of Europe. Hence, predicting AMOC variations is crucial for predicting climate variations in regions bordering the North Atlantic. Similar to weather predictions, climate predictions are based on numerical simulations of the climate system. However, providing accurate predictions on such long timescales is far from straightforward. Even in a perfect model approach, where biases between numerical models and reality are ignored, the chaotic nature of AMOC variability (i.e., high sensitivity to initial conditions) is a significant source of uncertainty, limiting its accurate prediction.
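The heat-capacity equivalence quoted above can be verified with a back-of-envelope calculation. The sketch below uses round textbook values (all numbers are standard approximations, not taken from the source article) and recovers a depth of a few meters, consistent with the roughly 2-m figure.

```python
# Back-of-envelope check: what depth of seawater has the same heat
# capacity per unit area as the entire atmospheric column?
# All constants are standard textbook approximations (assumed values).
cp_air = 1004.0        # J kg^-1 K^-1, specific heat of air at constant pressure
p_surface = 1.013e5    # Pa, mean sea-level pressure
g = 9.81               # m s^-2, gravitational acceleration
m_atm = p_surface / g  # kg m^-2, mass of the atmospheric column per unit area

rho_sw = 1025.0        # kg m^-3, typical seawater density
cp_sw = 3990.0         # J kg^-1 K^-1, typical seawater specific heat

# Depth h such that h * rho_sw * cp_sw matches the atmospheric column's
# heat capacity m_atm * cp_air:
h = m_atm * cp_air / (rho_sw * cp_sw)
print(f"equivalent seawater depth: {h:.2f} m")  # about 2.5 m
```

The exact depth depends on the assumed heat capacities, but any reasonable choice gives a value of order 2 m, illustrating why ocean heat content dominates Earth’s energy storage.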
Predictability studies focus on factors determining our ability to predict the AMOC rather than on actual predictions. To this end, processes affecting AMOC predictability can be separated into two categories: processes acting as a source of predictability (periodic harmonic oscillations, for instance) and processes acting as a source of uncertainty (small errors that grow and significantly modify the outcome of numerical simulations). To understand the former category, harmonic modes of variability or precursors of AMOC variations are identified. For the latter, in a perfect model approach, the sources of uncertainty are characterized by the spread of numerical simulations differentiated by the application of small differences to their initial conditions. Two alternative and complementary frameworks have arisen to investigate this spread. The pragmatic framework performs an ensemble of simulations, imposing a randomly chosen small error on the initial conditions of each individual simulation. This allows a probabilistic approach, in which the importance of the initial conditions is characterized statistically by evaluating the spread of the ensemble. The theoretical framework uses stability analysis to identify the small perturbations to the initial conditions that are most conducive to significant disruption of the AMOC.
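The pragmatic framework can be illustrated with a toy model. The sketch below (an illustration only, not any operational prediction system) uses the chaotic Lorenz-63 equations as a stand-in for an AMOC model: an ensemble is started from initial conditions perturbed by small random errors, and the ensemble spread is evaluated before and after integration.

```python
import numpy as np

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz-63 system (standard parameters)
    x, y, z = state
    dxdt = sigma * (y - x)
    dydt = x * (rho - z) - y
    dzdt = x * y - beta * z
    return state + dt * np.array([dxdt, dydt, dzdt])

rng = np.random.default_rng(0)
n_members = 50
base = np.array([1.0, 1.0, 20.0])
# Impose a randomly chosen small error on each member's initial conditions
ensemble = base + 1e-6 * rng.standard_normal((n_members, 3))

spread_start = ensemble.std(axis=0).mean()
for _ in range(4000):  # integrate all members for 20 model time units
    ensemble = np.array([lorenz_step(member) for member in ensemble])
spread_end = ensemble.std(axis=0).mean()

print(f"initial spread: {spread_start:.2e}, final spread: {spread_end:.2e}")
```

Because the system is chaotic, the tiny initial spread grows by many orders of magnitude until it saturates at the size of the attractor, which is exactly the growth of uncertainty that limits accurate prediction.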
Beyond these difficulties in assessing predictability, decadal prediction systems have been developed and tested through a range of hindcasts. The inherent difficulties of operational forecasts span from developing efficient initialization methods to setting accurate radiative forcing to correcting for model drift and bias, with all of these improvements estimated and validated through a range of specifically designed skill metrics.
H.E. Markus Meier and Sofia Saraiva
In this article, the concepts and background of regional climate modeling of the future Baltic Sea are summarized, and state-of-the-art projections, climate change impact studies, and challenges are discussed. The focus is on projected oceanographic changes in the future climate. However, as these changes may have a significant impact on biogeochemical cycling, nutrient load scenario simulations in future climates are briefly discussed as well. The Baltic Sea is special compared to other coastal seas: it is a tideless, semi-enclosed sea with a large freshwater and nutrient supply from a partly heavily populated catchment area, it has a long response time of about 30 years, and, in the early 21st century, it is warming faster than any other coastal sea in the world. Hence, policymakers have requested the development of nutrient load abatement strategies that account for the future climate. For this purpose, large ensembles of coupled climate–environmental scenario simulations based upon high-resolution circulation models were developed to estimate changes in water temperature, salinity, sea-ice cover, sea level, oxygen, nutrient, and phytoplankton concentrations, and water transparency, together with uncertainty ranges. Uncertainties in scenario simulations of the Baltic Sea are considerable. Sources of uncertainty include global and regional climate model biases, natural variability, and unknown greenhouse gas emission and nutrient load scenarios. Unknown early-21st-century and future bioavailable nutrient loads from land and atmosphere, and the experimental setup of the dynamical downscaling technique, are perhaps the largest sources of uncertainty for marine biogeochemistry projections. These high uncertainties might be reduced through investments in new multi-model ensemble simulations built on better experimental setups, improved models, and more plausible nutrient loads.
The development of community models for the Baltic Sea region with improved performance and common coordinated experiments of scenario simulations is recommended.
Regional models were originally developed to serve weather forecasting and regional process studies, with typical simulations encompassing time periods on the order of days or weeks. Regional models were subsequently used more and more as regional climate models for longer integrations and for climate change downscaling. Regional climate modeling, or regional dynamic downscaling (the terms are used interchangeably), developed into its own branch of climate research at the end of the 1990s out of the need to bridge the obvious inconsistencies at the interface between global climate research and climate impact research. The primary aim of regional downscaling is to provide consistent regional climate change scenarios with relevant spatial resolution to serve detailed climate impact assessments.
As in global climate modeling, the early attempts at regional climate modeling were based on uncoupled atmospheric models or stand-alone ocean models, an approach that remains the most common on the regional scale. However, this approach has some fundamental limitations, since regional air-sea interaction remains unresolved and regional feedbacks are neglected. This is crucial when assessing climate change impacts in the coastal zone or the regional marine environment. To overcome these limitations, regional climate modeling is currently in transition from uncoupled regional models to coupled atmosphere-ocean models, leading toward fully integrated earth system models. Coupled ice-ocean-atmosphere models have been developed during the last decade and are now robust and well established on the regional scale. Their added value has been demonstrated for regional climate modeling in marine regions, and the importance of regional air-sea interaction has become obvious. Coupled atmosphere-ice-ocean models, as well as coupled physical-biogeochemical modeling approaches, are increasingly used for the marine realm. First attempts to couple these two approaches together with land-surface models are underway. Physical coupled atmosphere-ocean modeling is also developing further, and the first model configurations resolving wave effects at the atmosphere-ocean interface are now available. These new developments open the way for improved regional assessments under broad consideration of local feedbacks and interactions between the regional atmosphere, cryosphere, hydrosphere, and biosphere.
Anjuli S. Bamzai
In the years following the Second World War, the U.S. government played a prominent role in the support of basic scientific research. The National Science Foundation (NSF) was created in 1950 with the primary mission of supporting fundamental science and engineering, excluding medical sciences. Over the years, the NSF has operated from the “bottom up,” keeping close track of research around the United States and the world while maintaining constant contact with the research community to identify ever-moving horizons of inquiry.
In the 1950s the field of meteorology was something of a poor cousin to the other branches of science; forecasting was considered more of a trade than a discipline built on sound theoretical foundations. Realizing the importance of the field to both the economy and national security, the NSF leadership made a concerted effort to enhance understanding of the global atmospheric circulation. The National Center for Atmospheric Research (NCAR) was established to complement ongoing research efforts in academic institutions; it has played a pivotal role in providing observational and modeling tools to the emerging cadre of researchers in the disciplines of meteorology and atmospheric sciences. As understanding of the predictability of the coupled atmosphere-ocean system grew, the field of climate science emerged as a natural outgrowth of meteorology, oceanography, and atmospheric sciences.
The NSF played a leading role in the implementation of major international programs such as the International Geophysical Year (IGY), the Global Weather Experiment, the World Ocean Circulation Experiment (WOCE) and Tropical Ocean Global Atmosphere (TOGA). Through these programs, understanding of the coupled climate system comprising atmosphere, ocean, land, ice-sheet, and sea ice greatly improved. Consistent with its mission, the NSF supported projects that advanced fundamental knowledge of forcing and feedbacks in the coupled atmosphere-ocean-land system. Research projects have included theoretical, observational, and modeling studies of the following: the general circulation of the stratosphere and troposphere; the processes that govern climate; the causes of climate variability and change; methods of predicting climate variations; climate predictability; development and testing of parameterization of physical processes; numerical methods for use in large-scale climate models; the assembly and analysis of instrumental and/or modeled climate data; data assimilation studies; and the development and use of climate models to diagnose and simulate climate variability and change.
Climate scientists work together on an array of topics spanning time scales from the seasonal to the centennial. The NSF also supports research on the natural evolution of the earth’s climate on geological time scales with the goal of providing a baseline for present variability and future trends. The development of paleoclimate data sets has resulted in longer-term data for evaluation of model simulations, analogous to the evaluation using instrumental observations. This has enabled scientists to create transformative syntheses of paleoclimate data and modeling outcomes in order to understand the longer-term, higher-magnitude variability of the climate system that is observed in the geological record.
The NSF will continue to address emerging issues in climate and earth-system science through balanced investments in transformative ideas, enabling infrastructure, and major facilities.
Benjamin Mark Sanderson
Long-term planning for many sectors of society—including infrastructure, human health, agriculture, food security, water supply, insurance, conflict, and migration—requires an assessment of the range of possible futures that the planet might experience. Unlike short-term forecasts, for which validation data exist to compare forecast with observation, long-term forecasts have almost no validation data, so researchers must rely on supporting evidence to make their projections. A review of methods for quantifying the uncertainty of climate predictions is given. The primary tools for quantifying these uncertainties are climate models, which attempt to represent all the processes that are important in climate change. However, neither the construction nor the calibration of climate models is perfect, and therefore uncertainties due to model errors must also be taken into account in the uncertainty quantification.
Typically, prediction uncertainty is quantified by generating ensembles of solutions from climate models to span possible futures. For instance, initial condition uncertainty is quantified by generating an ensemble of initial states that are consistent with available observations and then integrating the climate model starting from each initial condition. A climate model is itself subject to uncertain choices in modeling certain physical processes. Some of these choices can be sampled using so-called perturbed physics ensembles, whereby uncertain parameters or structural switches are perturbed within a single climate model framework. For a variety of reasons, there is a strong reliance on so-called ensembles of opportunity, which are multi-model ensembles (MMEs) formed by collecting predictions from different climate modeling centers, each using a potentially different framework to represent relevant processes for climate change. The most extensive collection of these MMEs is associated with the Coupled Model Intercomparison Project (CMIP). However, the component models have biases, simplifications, and interdependencies that must be taken into account when making formal risk assessments. Techniques and concepts for integrating model projections in MMEs are reviewed, including differing paradigms of ensembles and how they relate to observations and reality. Aspects of these conceptual issues then inform the more practical matters of how to combine and weight model projections to best represent the uncertainties associated with projected climate change.
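One simple way to combine and weight model projections in an MME can be sketched as follows. This is an illustrative toy only, with invented numbers and an assumed Gaussian skill weight; operational schemes used with CMIP-class ensembles are more elaborate and also account for model interdependence.

```python
import numpy as np

# Hypothetical MME: each model's present-day value of some climate variable
# and its projected future value (all numbers invented for illustration).
obs = 14.0                                            # observed present-day value
models_present = np.array([13.2, 14.1, 15.0, 13.8])   # hypothetical model biases
models_future = np.array([16.5, 17.2, 18.4, 16.9])    # hypothetical projections

# Weight each model by how well it reproduces the observation, using a
# simple Gaussian skill weight with tolerance sigma (an assumed choice).
sigma = 0.5
weights = np.exp(-((models_present - obs) ** 2) / (2 * sigma**2))
weights /= weights.sum()  # normalize so the weights sum to 1

weighted_projection = np.sum(weights * models_future)
unweighted_projection = models_future.mean()
print(f"weighted: {weighted_projection:.2f}, unweighted: {unweighted_projection:.2f}")
```

The design choice here is that models closer to the observed present-day state pull the combined projection toward their futures; whether such skill weighting is justified depends on the conceptual issues noted above, such as how the ensemble is assumed to relate to reality.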