Vienna was the metropolis at the heart of the Danube monarchy of Austria-Hungary, and under the rule (1848–1916) of Emperor Franz Joseph I (1830–1916) the city experienced rapid growth and an unprecedented flowering of culture, the arts, architecture, and science. The capital of the monarchy was an intellectual melting pot, a city of distinguished personalities who formed the Second Viennese School of music, the Austrian School of economic thought, and many more doctrines, including the ideas of Sigmund Freud, the founder of psychoanalysis. In its economic, scientific, and cultural heyday, Vienna clearly reflected the zeitgeist of the fin de siècle.
At the end of the 19th century, meteorology and climatology became recognized scientific disciplines, and dynamical meteorology developed during the first quarter of the 20th century. That imperial Austria took a leading position in these developments owes largely to the work of renowned scientists at the Central Institute for Meteorology and Geodynamics (Zentralanstalt für Meteorologie und Geodynamik, ZAMG) in Vienna.
The institute was founded in 1851, and the astronomer Karl Kreil (1798–1862) became its first director. One of Kreil’s goals was to ensure that both the central meteorological station and the growing number of new meteorological stations across the entire territory of the Austrian Empire were equipped with all the appropriate instruments. Another important goal was the processing of the existing observations for publication in the institute’s yearbooks. In effect, that was the starting signal for all further scientific developments, including that of the Viennese School of Climatology.
During the first decade of the 1900s, Julius Hann (1839–1921), the third director of the ZAMG, was already acknowledged as a renowned meteorologist and climatologist. He was a pioneer in gathering and synthesizing global climatological and meteorological data, and his Handbook of Climatology (Handbuch der Klimatologie; Hann, 1883) and Textbook of Meteorology (Lehrbuch der Meteorologie; Hann, 1901) were standard setters (Davies, 2001). In Hann’s era, one began to speak of a “Viennese or Austrian school.” Heinrich Ficker, who later became director of the institute, defined its distinguishing characteristic as follows: it was a school that did not simply adhere to one direction but promoted every direction and every distinctive talent, so that a meteorologist with the necessary qualities was always present at key turning points in meteorological research.
Climate and simulation have become interwoven concepts during the past decades because, on the one hand, climate scientists cannot experiment with the real climate and, on the other hand, societies want to know how the climate will change in the coming decades. Both in-silico experiments for a better understanding of climatic processes and forecasts of possible futures can be achieved only by using climate models. The article investigates possibilities and problems of model-mediated knowledge for science and societies. It explores historically how climate became a subject of science and of simulation, what kind of infrastructure is required to apply models and simulations properly, and how model-mediated knowledge can be evaluated. In addition to an overview of the diversity and variety of models in climate science, the article focuses on quasi-heuristic climate models, with an emphasis on atmospheric models.
Ricardo García-Herrera and David Barriopedro
The Mediterranean is a semi-enclosed sea surrounded by Europe to the north, Asia to the east, and Africa to the south. It covers an area of approximately 2.5 million km², between 30°N and 46°N latitude and between 6°W and 36°E longitude. The term Mediterranean climate is applied beyond the Mediterranean region itself and has been used since the early 20th century to classify other regions of the world, such as California or South Africa, usually located in the 30°–40° latitudinal band. The Mediterranean climate can be broadly characterized by warm to hot dry summers and mild wet winters. However, this broad picture hides important variations, which can be explained through two geographical gradients: north–south, with a warmer and drier south, and west–east, with the west more influenced by Atlantic circulation and the east by Asian circulation.
The region is located at a crossroads between the mid-latitudes and the subtropical regimes. Thus, small changes in the Atlantic storm track may lead to dramatic changes in the precipitation of the northwestern area of the basin. The variability of the descending northern branch of the Hadley cell influences the climate of the southern margin, while the eastern border climate is conditioned by the Siberian High in winter and the Indian Summer Monsoon during summer. All these large-scale factors are modulated by the complex orography of the region, the contrasting albedo, and the moisture and heat supplied by the Mediterranean Sea. The interactions among all these factors lead to a complex picture with some relevant phenomena characteristic of the Mediterranean region, such as heatwaves and droughts, Saharan dust intrusions, or specific types of cyclogenesis.
Climate model projections generally agree in characterizing the region as a climate change hotspot, considering that it is one of the areas of the globe likely to suffer pronounced climate changes. Anthropogenic influences are not new, since the region is densely populated and is home to some of the oldest civilizations on Earth. This has produced multiple and continuous modifications of the land cover, with measurable impacts on climate that can be traced from the rich available documentary evidence and high-resolution natural proxies.
Hail has been identified as the largest contributor to insured losses from thunderstorms globally, costing the insurance industry billions of dollars each year. Yet, of all precipitation types, hail is probably subject to the largest uncertainties. Some might go so far as to argue that observing and forecasting hail is as difficult as, if not more difficult than, forecasting tornadoes. The reasons hail is so challenging are many and varied, and they are reflected in the wide variety of shapes, sizes, and internal structures that hailstones display. There is also an important clue in this diversity: nature is telling us that hail can grow by following a wide variety of trajectories within thunderstorms, each with a unique set of conditions. It is because of this complexity that modeling hail growth and forecasting hail size are so challenging. Consequently, it is understandable that predicting the occurrence and size of hail can seem an impossible task.
Through persistence, ingenuity, and technology, scientists have made progress in understanding the key ingredients and processes at play. Technological advances mean that we can now, with some confidence, identify those storms that very likely contain hail and even estimate the maximum expected hail size on the ground hours in advance. Even so, there is still much to learn about the many intriguing aspects of hail growth.
Fedor Mesinger, Miodrag Rančić, and R. James Purser
The astonishing development of computer technology since the mid-20th century has been accompanied by a corresponding proliferation in the numerical methods that have been developed to improve the simulation of atmospheric flows. This article reviews some of the numerical developments that concern the ongoing improvement of weather forecasting and climate simulation models. Early computers were single-processor machines with severely limited memory capacity and computational speed, requiring simplified representations of the atmospheric equations and low resolution. As the hardware evolved and memory and speed increased, it became feasible to accommodate more complete representations of the dynamic and physical atmospheric processes. These more faithful representations of the so-called primitive equations included dynamic modes that are not necessarily of meteorological significance, which in turn led to additional computational challenges. Understanding which problems required attention and how they should be addressed was not a straightforward and unique process, and it resulted in the variety of approaches that are summarized in this article. At about the turn of the century, the most dramatic developments in hardware were the inauguration of the era of massively parallel computers, together with the vast increase in the amount of rapidly accessible memory that the new architectures provided. These advances and opportunities have demanded a thorough reassessment of the numerical methods that are most successfully adapted to this new computational environment. This article combines a survey of the important historical landmarks with a somewhat speculative review of methods that, at the time of writing, seem to hold out the promise of further advancing the art and science of atmospheric numerical modeling.
Benjamin Mark Sanderson
Long-term planning for many sectors of society—including infrastructure, human health, agriculture, food security, water supply, insurance, conflict, and migration—requires an assessment of the range of possible futures that the planet might experience. Unlike short-term forecasts, for which validation data exist for comparing forecast to observation, long-term forecasts have almost no validation data. As a result, researchers must rely on supporting evidence to make their projections. A review of methods for quantifying the uncertainty of climate predictions is given. The primary tools for quantifying these uncertainties are climate models, which attempt to represent all the relevant processes that are important in climate change. However, neither the construction nor the calibration of climate models is perfect, and therefore the uncertainties due to model errors must also be taken into account in the uncertainty quantification.
Typically, prediction uncertainty is quantified by generating ensembles of solutions from climate models to span possible futures. For instance, initial condition uncertainty is quantified by generating an ensemble of initial states that are consistent with available observations and then integrating the climate model starting from each initial condition. A climate model is itself subject to uncertain choices in modeling certain physical processes. Some of these choices can be sampled using so-called perturbed physics ensembles, whereby uncertain parameters or structural switches are perturbed within a single climate model framework. For a variety of reasons, there is a strong reliance on so-called ensembles of opportunity, which are multi-model ensembles (MMEs) formed by collecting predictions from different climate modeling centers, each using a potentially different framework to represent relevant processes for climate change. The most extensive collection of these MMEs is associated with the Coupled Model Intercomparison Project (CMIP). However, the component models have biases, simplifications, and interdependencies that must be taken into account when making formal risk assessments. Techniques and concepts for integrating model projections in MMEs are reviewed, including differing paradigms of ensembles and how they relate to observations and reality. Aspects of these conceptual issues then inform the more practical matters of how to combine and weight model projections to best represent the uncertainties associated with projected climate change.
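The idea of combining and weighting multi-model projections can be sketched with a toy calculation. This is an illustrative assumption, not CMIP data or a method endorsed by the article: the five projection values, the bias values, and the simple inverse-bias weighting scheme are all invented for demonstration, and NumPy is assumed to be available.

```python
import numpy as np

# Hypothetical late-century warming projections (K) from five climate models
# in a multi-model ensemble, plus each model's absolute bias against an
# observed reference over a historical period (illustrative numbers only).
projections = np.array([2.1, 2.8, 3.4, 2.5, 4.0])  # projected warming per model
hist_bias = np.array([0.3, 0.1, 0.6, 0.2, 0.9])    # |model - obs| historical bias

# "Model democracy": every model gets one vote.
unweighted_mean = projections.mean()

# A simple skill-based alternative: down-weight models with larger
# historical bias (one of many possible weighting schemes).
weights = 1.0 / hist_bias
weights /= weights.sum()
weighted_mean = np.sum(weights * projections)

# Ensemble spread (sample standard deviation) as a rough uncertainty range.
spread = projections.std(ddof=1)

print(f"unweighted: {unweighted_mean:.2f} K, weighted: {weighted_mean:.2f} K, "
      f"spread: {spread:.2f} K")
```

In this toy case the skill weighting pulls the ensemble mean toward the low-bias models; in practice, weighting schemes must also account for the model interdependencies mentioned above, since near-duplicate models would otherwise be counted twice.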
Sharon E. Nicholson
This article provides an in-depth look at all aspects of the climate of the Sahel, including the pervasive dust in the Sahelian atmosphere. Emphasis is on two aspects: the West African monsoon and the region’s rainfall regime. This includes an overview of the prevailing atmospheric circulation at the surface and aloft and its relationship to the rainfall regime. Aspects of the rainfall regime that are considered include its unique characteristics, its changes over time, the storm systems that produce rainfall, and the factors governing its variability on interannual and decadal time scales. Variability is examined on three time scales: millennial (as seen in the paleo records of the last 20,000 years), multi-decadal (as seen over the last few centuries from proxy data and, more recently, in observations), and interannual to decadal (quantified by observations from the late 19th century onward). A unique feature of Sahel climate is that its rainfall regime is perhaps the most sensitive in the world, and this sensitivity is apparent on all of these time scales.
Florian Sévellec and Bablu Sinha
The Atlantic meridional overturning circulation (AMOC) is a large, basin-scale circulation located in the Atlantic Ocean that transports climatically important quantities of heat northward. It can be described schematically as a northward flow in the warm upper ocean and a southward return flow at depth in much colder water. The heat capacity of a layer of 2 m of seawater is equivalent to that of the entire atmosphere; therefore, ocean heat content dominates Earth’s energy storage. For this reason and because of the AMOC’s typically slow decadal variations, the AMOC regulates North Atlantic climate and contributes to the relatively mild climate of Europe. Hence, predicting AMOC variations is crucial for predicting climate variations in regions bordering the North Atlantic. Similar to weather predictions, climate predictions are based on numerical simulations of the climate system. However, providing accurate predictions on such long timescales is far from straightforward. Even in a perfect model approach, where biases between numerical models and reality are ignored, the chaotic nature of AMOC variability (i.e., high sensitivity to initial conditions) is a significant source of uncertainty, limiting its accurate prediction.
Predictability studies focus on factors determining our ability to predict the AMOC rather than actual predictions. To this end, processes affecting AMOC predictability can be separated into two categories: processes acting as a source of predictability (periodic harmonic oscillations, for instance) and processes acting as a source of uncertainty (small errors that grow and significantly modify the outcome of numerical simulations). To understand the former category, harmonic modes of variability or precursors of AMOC variations are identified. On the other hand, in a perfect model approach, the sources of uncertainty are characterized by the spread of numerical simulations differentiated by the application of small differences to their initial conditions. Two alternative and complementary frameworks have arisen to investigate this spread. The pragmatic framework corresponds to performing an ensemble of simulations, imposing a randomly chosen small error on the initial conditions of each individual simulation. This allows a probabilistic approach, statistically characterizing the importance of the initial conditions by evaluating the spread of the ensemble. The theoretical framework uses stability analysis to identify small perturbations to the initial conditions that are conducive to significant disruption of the AMOC.
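The pragmatic framework can be illustrated with a minimal perfect-model toy experiment. This is a sketch under stated assumptions, not an ocean model: the Lorenz-63 system stands in for a chaotic climate component, the ensemble size, perturbation amplitude, and integration length are arbitrary illustrative choices, and NumPy is assumed.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system, a standard toy
    chaotic model (a stand-in for chaotic variability, not the AMOC)."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

rng = np.random.default_rng(0)
base = np.array([1.0, 1.0, 1.0])

# Pragmatic framework: an ensemble of 20 runs, each starting from the same
# state plus a small random initial-condition error.
states = base + 1e-6 * rng.standard_normal((20, 3))

spread_early = None
for step in range(1500):                      # 15 model time units
    states = np.array([lorenz_step(s) for s in states])
    if step == 99:                            # spread after 1 time unit
        spread_early = states.std(axis=0).max()
spread_late = states.std(axis=0).max()        # spread after 15 time units

# In a chaotic system the ensemble spread grows by orders of magnitude
# before saturating, which is what limits accurate prediction.
print(f"early spread: {spread_early:.2e}, late spread: {spread_late:.2e}")
```

The growth of the ensemble spread over time is the quantity of interest: once the spread saturates at the size of the system's natural variability, the initial conditions carry no further predictive information.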
Beyond these difficulties in assessing predictability, decadal prediction systems have been developed and tested through a range of hindcasts. The inherent difficulties of operational forecasts range from developing efficient initialization methods to setting accurate radiative forcing and correcting for model drift and bias, with all these improvements estimated and validated through a range of specifically designed skill metrics.
Ole Bøssing Christensen and Erik Kjellström
The ecosystems and the societies of the Baltic Sea region are quite sensitive to fluctuations in climate, and it is therefore expected that anthropogenic climate change will affect the region considerably. With numerical climate models, a large number of projections of meteorological variables affected by anthropogenic climate change have been produced for the Baltic Sea region for periods reaching the end of the 21st century.
Existing global and regional climate model studies suggest that:
• The future Baltic climate will get warmer, mostly so in winter. Changes increase with time or increasing emissions of greenhouse gases. There is a large spread between different models, but they all project warming. In the northern part of the region, temperature change will be higher than the global average warming.
• Daily minimum temperatures will increase more than average temperature, particularly in winter.
• Future average precipitation amounts will be larger than today. The relative increase is largest in winter. In summer, increases in the far north and decreases in the south are seen in most simulations. In the intermediate region, the sign of change is uncertain.
• Precipitation extremes are expected to increase, though with a higher degree of uncertainty in magnitude compared to projected changes in temperature extremes.
• Future changes in wind speed are highly dependent on changes in the large-scale circulation simulated by global climate models (GCMs). The results do not all agree, and it is not possible to assess whether there will be a general increase or decrease in wind speed in the future.
• Winter snow amounts will be strongly reduced: only very small high-altitude mountain areas in a few simulations are projected to experience a reduction of less than 50%. The southern half of the Baltic Sea region is projected to experience significant reductions in snow amount, with median reductions of around 75%.
John T. Allen
The response of severe thunderstorms to a changing climate is a rapidly growing area of research. Severe thunderstorms are one of the largest contributors to global losses, in excess of USD 10 billion per year in terms of property and agriculture, as well as dozens of fatalities. Phenomena associated with severe thunderstorms, such as large hail (greater than 2 cm), damaging winds (greater than 90 km h−1), and tornadoes, pose a global threat and have been documented on every continent except Antarctica. Limitations of observational records for assessing past trends have driven a variety of approaches to not only characterize past occurrence but also provide a baseline against which future projections can be interpreted. These proxy methods have included using environments or conditions favorable to the development of thunderstorms and directly simulating storm updrafts using dynamical downscaling. Both methodologies have demonstrated pronounced changes to the frequency of days producing severe thunderstorms. Major impacts of a strongly warmed climate include a general lengthening of the season in both fall and spring, associated with increased thermal instability, and an increased frequency of severe days by the late 21st century. While earlier studies suggested that decreases in vertical wind shear would reduce frequency, recent studies have illustrated that this change appears not to coincide with days that are unstable. Questions remain as to whether the likelihood of storm initiation decreases, whether all storms that now produce severe weather will maintain their physical structure in a warmer world, and how these changes to storm frequency and/or intensity may manifest for each of the threats posed by tornadoes, hail, and damaging winds.
Expanding this understanding globally is identified as an area of needed future research, together with meaningful consideration of both the influence of climate variability and the indirect implications of anthropogenic modification of the physical environment.