John T. Allen
The response of severe thunderstorms to a changing climate is a rapidly growing area of research. Severe thunderstorms are among the largest contributors to global losses, exceeding USD 10 billion per year in property and agricultural damage, as well as causing dozens of fatalities. Phenomena associated with severe thunderstorms, such as large hail (greater than 2 cm), damaging winds (greater than 90 km h−1), and tornadoes, pose a global threat and have been documented on every continent except Antarctica. Limitations of observational records for assessing past trends have driven a variety of approaches not only to characterize past occurrence but also to provide a baseline against which future projections can be interpreted. These proxy methods have included using environments or conditions favorable to the development of thunderstorms and directly simulating storm updrafts using dynamical downscaling. Both methodologies have demonstrated pronounced changes to the frequency of days producing severe thunderstorms. Major impacts of a strongly warmed climate include a general lengthening of the season in both fall and spring, associated with increased thermal instability, and an increased frequency of severe days by the late 21st century. While earlier studies suggested that decreases in vertical wind shear would reduce frequency, recent studies have illustrated that these decreases do not appear to coincide with days that are unstable. Questions remain as to whether the likelihood of storm initiation decreases, whether all storms that now produce severe weather will maintain their physical structure in a warmer world, and how these changes to storm frequency and/or intensity may manifest for each of the threats posed by tornadoes, hail, and damaging winds.
Expansion of the existing understanding globally is identified as an area of needed future research, together with meaningful consideration of both the influence of climate variability and indirect implications of anthropogenic modification of the physical environment.
Susanne C. Moser
Communicating the impacts of climate change and possible adaptive responses is a relatively recent branch of the larger endeavor of climate change communication. This recent emergence is driven in large part by the fact that impacts and policy/planning/practice responses have only recently entered widespread public consciousness and discourse, and thus scholarly treatment. This article will first describe the critical and precarious moment at which impacts and adaptation communication becomes important; it will then summarize proposed approaches to doing so effectively and discuss key challenges confronting climate change communication going forward. These challenges may well be unique in the field of communication, in that they either combine previously encountered difficulties into novel complexities or are truly unprecedented. To date, scholarship and experience in climate, environmental, or risk communication provide little guidance on how to meet these challenges of communicating effectively with diverse publics and decision makers in the face of long-term degradation of the life support system of humanity. The article concludes with an attempt to offer research and practice directions, fit at least to foster appropriately humble attitudes toward understanding and engaging fellow humans around the profound risks of an uncertain and far-from-assured future.
What are the local consequences of a global climate change? This question is important for proper handling of risks associated with weather and climate. It also tacitly assumes that there is a systematic link between conditions taking place on a global scale and local effects. This dependency of local climate on the global picture is the backbone of downscaling; however, it is perhaps easiest to explain the concept of downscaling in climate research by asking why it is necessary.
Global climate models are our best tools for computing future temperature, wind, and precipitation (and other climatological variables), but their coarse resolution does not let them calculate local details for these quantities, and it is simply not adequate to interpolate from model results. The models are, however, able to predict large-scale features, such as circulation patterns, the El Niño–Southern Oscillation (ENSO), and the global mean temperature. Local temperature and precipitation are nevertheless related to conditions over a larger surrounding region as well as to local geographical features, and the same holds, in general, for other weather elements.
Downscaling makes use of systematic dependencies between local conditions and large-scale ambient phenomena in addition to including information about the effect of the local geography on the local climate. The application of downscaling can involve several different approaches. This article will discuss various downscaling strategies and methods and will elaborate on their rationale, assumptions, strengths, and weaknesses.
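One widely used family of approaches is empirical-statistical downscaling, in which a statistical link between a large-scale predictor and a local variable is trained on a historical period and then applied to model projections. A minimal sketch with synthetic data and an assumed linear relationship (all variable names and numbers here are illustrative, not from any real model):

```python
import numpy as np

# Hypothetical illustration of empirical-statistical downscaling: a linear
# model linking a large-scale predictor (e.g., a regional mean temperature
# simulated by a global model) to a local station record.
rng = np.random.default_rng(0)

# Synthetic "observed" training period: 50 years of large-scale and local values.
large_scale = rng.normal(loc=15.0, scale=1.0, size=50)          # regional mean, deg C
local = 0.8 * large_scale + 2.0 + rng.normal(0.0, 0.3, size=50) # station record, deg C

# Fit the statistical link over the historical period.
slope, intercept = np.polyfit(large_scale, local, deg=1)

# Apply the fitted link to a projected large-scale change (+2 deg C).
projected_large_scale = large_scale.mean() + 2.0
projected_local = slope * projected_large_scale + intercept
print(f"Projected local temperature: {projected_local:.1f} deg C")
```

Real applications use physically motivated predictors and far more careful model selection and validation; the point here is only the structure of the method: train the link on the past, apply it to the projected large scales.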
One important issue is the presence of spontaneous natural year-to-year variations that are not necessarily directly related to the global state, but are internally generated and superimposed on the long-term climate change. These variations typically involve phenomena such as ENSO, the North Atlantic Oscillation (NAO), and the Southeast Asian monsoon, which are nonlinear and non-deterministic.
We cannot predict the exact evolution of non-deterministic natural variations beyond a short time horizon. It is possible nevertheless to estimate probabilities for their future state based, for instance, on projections with models run many times with slightly different set-up, and thereby to get some information about the likelihood of future outcomes.
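This probabilistic use of ensembles can be sketched as counting the members that fall beyond a threshold of interest; the ensemble below is synthetic and the numbers are purely illustrative:

```python
import numpy as np

# Hypothetical sketch: estimating the probability of a future outcome from an
# ensemble of model runs started with slightly different set-ups.
rng = np.random.default_rng(1)

# Synthetic 100-member ensemble of projected seasonal-mean temperature anomalies.
members = rng.normal(loc=1.5, scale=0.5, size=100)  # deg C

# Probability that the anomaly exceeds 2 deg C, estimated as the
# fraction of ensemble members beyond the threshold.
p_exceed = float(np.mean(members > 2.0))
print(f"Estimated probability of exceeding 2 deg C: {p_exceed:.2f}")
```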
When it comes to downscaling and predicting regional and local climate, it is important to use an ensemble of many global climate model simulations rather than a single run. It is equally important to apply proper validation to make sure the models give skillful predictions.
For some downscaling approaches, such as regional climate models, there is usually a need for bias adjustment due to model imperfections, which means the downscaling does not get the right answer for the right reason. One explanation for the presence of biases in the results is the use of different parameterization schemes in the driving global model and the nested regional model.
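A minimal sketch of one common bias-adjustment strategy: remove the model's mean offset from observations over a shared historical period before using the projection (synthetic data; real applications often use quantile mapping or other more elaborate methods):

```python
import numpy as np

# Hypothetical illustration of mean-bias adjustment.
rng = np.random.default_rng(2)

# Synthetic station observations and a warm-biased model over the same 30 years.
observed = rng.normal(10.0, 2.0, 30)                    # deg C
model_hist = observed + 1.5 + rng.normal(0.0, 0.5, 30)  # model with ~1.5 deg C warm bias
model_future = rng.normal(13.0, 2.0, 30)                # raw future projection

# Estimate the bias on the overlap period and remove it from the projection.
bias = model_hist.mean() - observed.mean()
adjusted_future = model_future - bias
print(f"Estimated bias: {bias:.2f} deg C")
```

Note that such an adjustment corrects the statistics of the output without addressing the physical cause of the bias, which is precisely the "right answer for the wrong reason" concern raised above.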
A final underlying question is: What can we learn from downscaling? The context for the analysis is important, as downscaling is often used to find answers to some (implicit) question and can be a means of extracting most of the relevant information concerning the local climate. It is also important to include discussions about uncertainty, model skill or shortcomings, model validation, and skill scores.
Forecasting severe convective weather remains one of the most challenging tasks facing operational meteorology today, especially in the mid-latitudes, where severe convective storms occur most frequently and with the greatest impact. The forecast difficulties reflect, in part, the many different atmospheric processes of which severe thunderstorms are a by-product. These processes occur over a wide range of spatial and temporal scales, some of which are poorly understood and/or are inadequately sampled by observational networks. Therefore, anticipating the development and evolution of severe thunderstorms will likely remain an integral part of national and local forecasting efforts well into the future.
Modern severe weather forecasting began in the 1940s, primarily employing the pattern recognition approach throughout the 1950s and 1960s. Substantial changes in forecast approaches did not come until much later, however, beginning in the 1980s. By the start of the new millennium, significant advances in the understanding of the physical mechanisms responsible for severe weather enabled forecasts of greater spatial and temporal detail. At the same time, technological advances made available model thermodynamic and wind profiles that supported probabilistic forecasts of severe weather threats.
This article provides an updated overview of operational severe local storm forecasting, with emphasis on present-day understanding of the mesoscale processes responsible for severe convective storms, and the application of recent technological developments that have revolutionized some aspects of severe weather forecasting. The presentation, nevertheless, notes that increased understanding and enhanced computer sophistication are not a substitute for careful diagnosis of the current meteorological environment and an ingredients-based approach to anticipating changes in that environment; these techniques remain foundational to successful forecasts of tornadoes, large hail, damaging wind, and flash flooding.
R. J. Trapp
Cumulus clouds are pervasive on Earth and play important roles in the transfer of energy through the atmosphere. Under certain conditions, shallow, nonprecipitating cumuli may grow vertically to occupy a significant depth of the troposphere and subsequently evolve into convective storms.
The qualifier “convective” implies that the storms have vertical accelerations that are driven primarily, though not exclusively, by buoyancy over a deep layer. Such buoyancy in the atmosphere arises from local density variations relative to some base-state density; the base state is typically idealized as a horizontal average over a large area, which is also considered the environment. Quantifications of atmospheric buoyancy are typically expressed in terms of temperature and humidity, and allow for an assessment of the likelihood that convective clouds will form or initiate. Convection initiation is intimately linked to the existence of a mechanism by which air is lifted vertically to realize this buoyancy and thus these accelerations. Weather fronts and orography are the canonical lifting mechanisms.
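The buoyancy described above can be sketched, in a highly simplified form that ignores humidity and pressure perturbations, as an acceleration proportional to a parcel's temperature departure from the environment; B = g (T_p − T_e) / T_e is a standard idealization, and the numbers below are illustrative:

```python
# Simplified parcel buoyancy: B = g * (T_parcel - T_env) / T_env,
# with temperatures in kelvin. This neglects moisture and pressure
# effects that full formulations include.
G = 9.81  # gravitational acceleration, m s^-2

def buoyancy(t_parcel_k: float, t_env_k: float) -> float:
    """Vertical acceleration (m s^-2) of a parcel warmer or cooler than its surroundings."""
    return G * (t_parcel_k - t_env_k) / t_env_k

# A parcel 3 K warmer than a 300 K environment accelerates upward:
b = buoyancy(303.0, 300.0)
print(f"{b:.4f} m s^-2")  # ~0.0981 m s^-2
```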
As modulated by an ambient or environmental distribution of temperature, humidity, and wind, weather fronts also facilitate the transition of convective clouds into storms with locally heavy rain, lightning, and other possible hazards. For example, in an environment characterized by winds that are weak and change little with height above the ground, storms tend to be short lived and benign. The structure of the vertical drafts and other internal storm processes under weak wind shear—i.e., a small change in the horizontal wind over some vertical distance—differ from those that occur when the environmental wind shear is strong. In particular, strong wind shear in combination with large buoyancy favors the development of squall lines and supercells, both of which are highly coherent storm types. Besides having durations that may exceed a few hours, both of these storm types tend to be particularly hazardous: squall lines are most apt to generate swaths of damaging “straight-line” winds, while supercells spawn the most intense tornadoes and are responsible for the largest hail. Methods used to predict convective-storm hazards capitalize on this knowledge of storm formation and development.
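The wind shear invoked here is commonly quantified as bulk shear: the magnitude of the vector wind difference between two levels, often the surface and 6 km. A minimal sketch with illustrative wind values:

```python
import numpy as np

# Bulk vertical wind shear: magnitude of the vector difference between
# winds at two levels (commonly the surface and 6 km above ground).
def bulk_shear(u_low: float, v_low: float, u_high: float, v_high: float) -> float:
    """Magnitude (m s^-1) of the wind-vector change between two levels."""
    return float(np.hypot(u_high - u_low, v_high - v_low))

# Illustrative example: westerly 5 m/s at the surface,
# southwesterly ~21 m/s at 6 km.
shear = bulk_shear(5.0, 0.0, 15.0, 15.0)
print(f"{shear:.1f} m s^-1")  # ~18.0 m s^-1
```

Values of surface–6 km bulk shear on the order of 15–20 m s−1 or more are widely used as a rough indicator that organized storm modes such as squall lines and supercells are possible, given sufficient buoyancy.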
For several decades, the Sahelian countries have been facing continuing rainfall shortages which, coupled with anthropogenic factors, have severely disrupted the region's ecological balance, leading the area into an inexorable process of desertification and land degradation. The Sahel faces a persistent problem of climate change, with high rainfall variability and frequent droughts, and this is one of the major drivers of the population's vulnerability in the region. Communities struggle against severe land degradation and endure an unprecedented loss of productivity that hampers their livelihoods and puts them among the populations in the world most vulnerable to climatic change. In response, 11 countries of the Sahel agreed to work together to address the policy, investment, and institutional barriers to establishing a land-restoration program that addresses climate change and land degradation. The program is called the Pan-Africa Initiative for the Great Green Wall (GGW). The initiative aims to help halt desertification and land degradation in the Sahelian zone, improve the lives and livelihoods of smallholder farmers and pastoralists in the area, and help its populations develop effective adaptation strategies and responses through tree-based development programs. To make the GGW initiative successful, member countries have established a coordinated and integrated effort from the government level down to local scales and have engaged with many stakeholders. Planning, decision-making, and actions on the ground are guided by participation and engagement, informed by policy-relevant knowledge, to implement a set of scalable land-restoration practices and address the drivers of land-use change in various human-environmental contexts. In many countries, activities specific to achieving the GGW objectives have been initiated in the last five years.
Hail has been identified as the largest contributor to insured losses from thunderstorms globally, costing the insurance industry billions of dollars each year. Yet, of all precipitation types, hail is probably subject to the largest uncertainties. Some might go so far as to argue that observing and forecasting hail is as difficult as, if not more difficult than, forecasting tornadoes. The reasons why hail is challenging are many and varied, and are reflected in the fact that hailstones display a wide variety of shapes, sizes, and internal structures. There is also an important clue in this diversity: nature is telling us that hail can grow by following a wide variety of trajectories within thunderstorms, each having a unique set of conditions. It is because of this complexity that modeling hail growth and forecasting size is so challenging, and it is understandable that predicting the occurrence and size of hail can seem an impossible task.
Through persistence, ingenuity and technology, scientists have made progress in understanding the key ingredients and processes at play. Technological advances mean that we can now, with some confidence, identify those storms that very likely contain hail and even estimate the maximum expected hail size on the ground hours in advance. Even so, there is still much we need to learn about the many intriguing aspects of hail growth.
Since the dawn of the digital computing age in the mid-20th century, computers have been used as virtual laboratories for the study of atmospheric phenomena. The first simulations of thunderstorms captured only their gross features, yet required the most advanced computing hardware of the time. The following decades saw exponential growth in computational power that was, and continues to be, exploited by scientists seeking to answer fundamental questions about the internal workings of thunderstorms, the most devastating of which cause substantial loss of life and property throughout the world every year.
By the mid-1970s, the most powerful computers available to scientists contained, for the first time, enough memory and computing power to represent the atmosphere containing a thunderstorm in three dimensions. Prior to this time, thunderstorms were represented primarily in two dimensions, which implicitly assumed an infinitely long cloud in the missing dimension. These earliest state-of-the-art, fully three-dimensional simulations revealed fundamental properties of thunderstorms, such as the structure of updrafts and downdrafts and the evolution of precipitation, while still only roughly approximating the flow of an actual storm due to computing limitations.
In the decades that followed these pioneering three-dimensional thunderstorm simulations, new modeling approaches were developed that included more accurate ways of representing winds, temperature, pressure, friction, and the complex microphysical processes involving solid, liquid, and gaseous forms of water within the storm. Further, these models could also be run at higher resolution than in previous studies thanks to the steady growth of available computational resources described by Moore's law, the observation that computing power doubled roughly every two years. The resolution of thunderstorm models increased to the point where features on the order of a couple hundred meters could be resolved, allowing small but intense features such as downbursts and tornadoes to be simulated within the parent thunderstorm. As model resolution increased further, so did the amount of data produced by the models, which presented a significant challenge to scientists trying to compare their simulated thunderstorms to observed thunderstorms. Visualization and analysis software was developed and refined in tandem with improved modeling and computing hardware, allowing the simulated data to be brought to life and compared directly to observed storms. As of 2019, the highest-resolution simulations of violent thunderstorms are able to capture processes such as tornado formation and evolution, which are found to include the aggregation of many small, weak vortices with diameters of dozens of meters, features which simply cannot be simulated at lower resolution.
Saji N. Hameed
Discovered at the very end of the 20th century, the Indian Ocean Dipole (IOD) is a mode of natural climate variability that arises out of coupled ocean–atmosphere interaction in the Indian Ocean. It is associated with some of the largest changes of ocean–atmosphere state over the equatorial Indian Ocean on interannual time scales. IOD variability is prominent during the boreal summer and fall seasons, with its maximum intensity developing at the end of the boreal fall. Between the peaks of its negative and positive phases, the IOD manifests a markedly zonal see-saw in anomalous sea surface temperature (SST) and rainfall—leading, in its positive phase, to a pronounced cooling of the eastern equatorial Indian Ocean and a moderate warming of the western and central equatorial Indian Ocean; this is accompanied by deficit rainfall over the eastern Indian Ocean and surplus rainfall over the western Indian Ocean. Changes in midtropospheric heating accompanying the rainfall anomalies drive wind anomalies that anomalously lift the thermocline in the eastern equatorial Indian Ocean and deepen it in the central Indian Ocean. The thermocline anomalies further modulate coastal and open-ocean upwelling, thereby influencing biological productivity and fish catches across the Indian Ocean. The hydrometeorological anomalies that accompany the IOD exacerbate forest fires in Indonesia and Australia and bring floods and infectious diseases to equatorial East Africa. The coupled ocean–atmosphere instability that is responsible for generating and sustaining the IOD develops on a mean state that is strongly modulated by the seasonal cycle of the Austral-Asian monsoon; this setting gives the IOD its unique character and dynamics, including a strong phase-lock to the seasonal cycle.
While IOD operates independently of the El Niño and Southern Oscillation (ENSO), the proximity between the Indian and Pacific Oceans, and the existence of oceanic and atmospheric pathways, facilitate mutual interactions between these tropical climate modes.