William Joseph Gutowski and Filippo Giorgi
Regional climate downscaling has been motivated by the objective to understand how climate processes not resolved by global models can influence the evolution of a region’s climate and by the need to provide climate change information to other sectors, such as water resources, agriculture, and human health, on scales poorly resolved by global models but where impacts are felt. There are four primary approaches to regional downscaling: regional climate models (RCMs), empirical statistical downscaling (ESD), variable resolution global models (VARGCM), and “time-slice” simulations with high-resolution global atmospheric models (HIRGCM). Downscaling using RCMs is often referred to as dynamical downscaling to contrast it with statistical downscaling. Although there have been efforts to coordinate each of these approaches, the predominant effort to coordinate regional downscaling activities has involved RCMs.
Initially, downscaling activities were directed toward specific, individual projects. Typically, there was little similarity between these projects in terms of focus region, resolution, time period, boundary conditions, and phenomena of interest. The lack of coordination hindered evaluation of downscaling methods, because sources of success or problems in downscaling could be specific to model formulation, phenomena studied, or the method itself. This prompted the organization of the first dynamical-downscaling intercomparison projects in the 1990s and early 2000s. These programs and several others following provided coordination focused on an individual region and an opportunity to understand sources of differences between downscaling models while overall illustrating the capabilities of dynamical downscaling for representing climatologically important regional phenomena. However, coordination between programs was limited.
Recognition of the need for further coordination led to the formation of the Coordinated Regional Downscaling Experiment (CORDEX) under the auspices of the World Climate Research Programme (WCRP). Initial CORDEX efforts focused on establishing a common framework for carrying out dynamically downscaled simulations over multiple regions around the world; this framework has since become an organizing structure for downscaling activities globally. Further efforts under the CORDEX program have strengthened its scientific foundations, addressing topics such as added value in downscaling, regional human influences on climate, coupled ocean–land–atmosphere modeling, precipitation systems, extreme events, and local wind systems. In addition, CORDEX is promoting expanded efforts to compare the capabilities of all downscaling methods for producing regional information. These efforts are motivated in part by the scientific goal of thoroughly understanding regional climate and its change and by the growing need for climate information to support climate services across a multitude of climate-impacted sectors.
The El Niño-Southern Oscillation is considered to be the most significant form of “natural” climate variability, although its definition and the scientific understanding of the phenomenon are continually evolving. Since its first recorded usage in 1891, the meaning of “El Niño” has morphed from a regular local current affecting coastal Peru, to an occasional Pacific-wide phenomenon that modifies weather patterns throughout the world, and finally to a diversity of weather patterns that share similarities in Pacific heating and changes in trade-wind intensity but exhibit considerable variation in other ways. Since the 1960s, El Niño has been associated with the Southern Oscillation, originally defined as a statistical relationship in pressure patterns across the Pacific by the British scientist Gilbert Walker while he was working in India. The first unified model for the El Niño-Southern Oscillation (ENSO) was developed by Jacob Bjerknes in 1969, and it has been updated several times since, but no simple model yet explains the apparent diversity of El Niño events. ENSO forecasting is considered a success, but each event still displays surprising characteristics.
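Walker’s pressure see-saw is still quantified in essentially statistical terms today. One common modern formulation (not Walker’s original) is the Southern Oscillation Index (SOI), a standardized sea-level-pressure difference between Tahiti and Darwin:

```latex
\mathrm{SOI} \;=\; \frac{\Delta P - \overline{\Delta P}}{\sigma_{\Delta P}},
\qquad \Delta P \;=\; P_{\mathrm{Tahiti}} - P_{\mathrm{Darwin}}
```

Here \(\Delta P\) is the monthly mean sea-level-pressure difference, \(\overline{\Delta P}\) its long-term climatological mean, and \(\sigma_{\Delta P}\) its standard deviation; sustained negative SOI values typically accompany El Niño conditions. Operational centers use slightly different standardizations, so this is a representative sketch rather than a single canonical definition.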
Sharon E. Nicholson
Classic paradigms describing meteorological phenomena and climate have changed dramatically over the last half-century. This is particularly true for the continent of Africa. Our understanding of its climate is today very different from that which prevailed as recently as the 1960s or 1970s. This article traces the development of relevant paradigms in five broad areas: climate and climate classification, tropical atmospheric circulation, tropical rain-bearing systems, climatic variability and change, and land surface processes and climate. One example is the definition of climate. Originally viewed as simple statistical averages, it is now recognized as an environmental variable with global linkages, multiple timescales of variability, and strong controls via earth surface processes. As a result of numerous field experiments, our understanding of tropical rainfall has morphed from the belief in the domination by local thunderstorms to recognition of vast systems on regional to global scales. Our understanding of the interrelationships with land surface processes has also changed markedly. The simple Charney hypothesis concerning albedo change and the related concept of desertification have given way to a broader view of land–atmosphere interaction. In summary, there has been a major evolution in the way we understand climate, climatic variability, tropical rainfall regimes and rain-bearing systems, and potential human impacts on African climate. Each of these areas has evolved in complexity and understanding, a result of an explosive growth in research and the availability of such investigative tools as satellites, computers, and numerical models.
Throughout history human societies have been shaped and sculpted by the weather conditions that they faced. More than just the physical parameters imposed by the weather itself, how individuals, communities, and whole societies have imagined and understood the weather has influenced many facets of human activity, from agriculture to literary culture. Whether through direct lived experiences, oral traditions and stories, or empirical scientific data, these different ways of understanding meteorological conditions have served a multitude of functions in society, from the pragmatic to the moral.
While developments in the scientific understanding of the atmosphere over the last 300 years have been demonstrably beneficial to most communities, their rapid onset and spread across different societies often came at the expense of older ways of knowing. The late-20th-century turn toward interrogating and incorporating traditional ecological knowledge within meteorological frameworks and discourses was therefore essential. This scholarly research, underway across a number of disciplines in the humanities and beyond, not only aids the top-down integration and reach of mitigation and adaptation plans in response to the threat posed by anthropogenic climate change; it also enables the bottom-up flow of forgotten or overlooked knowledge, which helps to refine and improve our scientific understanding of global environmental systems.
Charles A. Doswell III
Convective storms are the result of a disequilibrium created by solar heating in the presence of abundant low-level moisture, resulting in the development of buoyancy in ascending air. Buoyancy typically is measured by the Convective Available Potential Energy (CAPE) associated with air parcels. When CAPE is present in an environment with strong vertical wind shear (winds changing speed and/or direction with height), convective storms become increasingly organized and more likely to produce hazardous weather: strong winds, large hail, heavy precipitation, and tornadoes.
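The buoyancy measure named above has a standard formal definition: CAPE is the vertical integral of parcel buoyancy between the level of free convection (LFC) and the equilibrium level (EL):

```latex
\mathrm{CAPE} \;=\; \int_{z_{\mathrm{LFC}}}^{z_{\mathrm{EL}}}
  g \,\frac{T_{v,\mathrm{parcel}} - T_{v,\mathrm{env}}}{T_{v,\mathrm{env}}}\; dz
```

where \(g\) is gravitational acceleration and \(T_v\) is virtual temperature, evaluated for the ascending parcel and its environment. Values broadly in the range of 1,000–4,000 J kg\(^{-1}\) are typical of environments that support severe convective storms, though the exact numbers depend on the parcel chosen for the calculation.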
Because of their associated hazards and their impact on society, a need arose in some nations (notably, the United States) for forecasts of convective storms. Pre-20th-century efforts to forecast the weather were hampered by a lack of timely weather observations and by the mathematical impossibility of directly solving the equations governing the weather. The first severe convective storm forecaster was J. P. Finley, an Army officer, who was ordered to cease his forecasting efforts in 1887. Some Europeans, like Alfred Wegener, studied tornadoes as a research topic, but there was no effort to develop convective storm forecasting.
World War II aircraft observations exposed how limited the science of convective storms was, leading to a research program called the Thunderstorm Project, which concentrated diverse observing systems to learn more about the structure and evolution of convective storms. Two Air Force officers, E. J. Fawbush and R. C. Miller, issued the first tornado forecasts in the modern era, and by 1953 the U.S. Weather Bureau had formed a Severe Local Storms forecasting unit (SELS, now designated the Storm Prediction Center of the National Weather Service). From the outset of the forecasting efforts, it was evident that more convective storm research was needed. SELS had an affiliated research unit called the National Severe Storms Project, which became the National Severe Storms Laboratory in 1963. Thus, research and operational forecasting have been partners from the outset of the forecasting efforts in the United States—with major scientific contributions from the late T. T. Fujita (originally from Japan), K. A. Browning (from the United Kingdom), R. A. Maddox, J. M. Fritsch, C. F. Chappell, J. B. Klemp, L. R. Lemon, R. B. Wilhelmson, R. Rotunno, M. Weisman, and numerous others. This partnership has produced considerable scientific understanding of convective storms, feeding back into the improvement of convective storm forecasting since it began in the modern era. In Europe, interest in both convective storm forecasting and research has produced a European Severe Storms Laboratory and an experimental severe convective storm forecasting group.
The development of computers in World War II created the ability to make numerical simulations of convective storms and to run numerical weather forecast models. These have been major elements in the growth of both understanding and forecast accuracy, and that growth can be expected to continue.
A typhoon is a highly organized storm system that develops from initial cyclone eddies and matures by drawing up from the warm tropical oceans large quantities of water vapor, which condenses at higher altitudes. This latent heat of condensation is the prime source of energy that strengthens the typhoon as it progresses across the Pacific Ocean. A typhoon differs from other tropical cyclones only on the basis of location. While hurricanes form in the Atlantic Ocean and eastern North Pacific Ocean, typhoons develop in the western North Pacific around the Philippines, Japan, and China.
Because of their violent histories of strong winds and torrential rains and their impact on society, the countries that ring the North Pacific basin—China, Japan, Korea, the Philippines, and Taiwan—have all felt the need to produce typhoon forecasts and establish storm warning services. Typhoon accounts in the pre-instrumental era were normally limited to descriptions of damage and incidents, and subsequent studies were hampered by the impossibility of solving the equations governing the weather, as they are distinctly nonlinear. The world’s first typhoon forecast was made in 1879 by Fr. Federico Faura, a Jesuit scientist at the Manila Observatory. His confrere at the Zikawei Jesuit Observatory, Fr. Marc Dechevrens, first reconstructed the trajectory of a typhoon in 1879, a study that marked the beginning of an era. The Jesuits and other Europeans, like William Doberck, studied typhoons as a research topic, and their achievements are regarded as products of colonial meteorology.
Between the First and Second World Wars, there were important contributions to typhoon science by meteorologists in the Philippines (Ch. Deppermann, M. Selga, and J. Coronas), China (E. Gherzi), and Japan (T. Okada, and Y. Horiguti). The polar front theory developed by the Bergen School in Norway played an important role in creating the large-scale setting for tropical cyclones. Deppermann became the greatest exponent of the polar front theory and air-masses analysis in the Far East and Southeast Asia.
From the end of WWII, it became evident that more effective typhoon forecasts were needed to meet military demands. In Hawaii, a joint Navy and Air Force center for typhoon analysis and forecasting was established in 1959—the Joint Typhoon Warning Center (JTWC). Its goals were to publish annual typhoon summaries and conduct research into tropical cyclone forecasting and detection. Other centers had previously specialized in issuing typhoon warnings and analysis. Thus, research and operational forecasting went hand in hand not only in the American JTWC but also in China (the Hong Kong Observatory, the Macao Meteorological and Geophysical Bureau), Japan (the Regional Specialized Meteorological Center), and the Philippines (Atmospheric, Geophysical and Astronomical Service Administration [PAGASA]). These efforts produced more precise scientific knowledge about the formation, structure, and movement of typhoons. In the 1970s and the 1980s, three new tools for research—three-dimensional numerical cloud models, Doppler radar, and geosynchronous satellite imagery—provided a new observational and dynamical perspective on tropical cyclones. The development of modern computing systems has offered the possibility of making numerical weather forecast models and simulations of tropical cyclones. However, typhoons are not mechanical artifacts, and forecasting their track and intensity remains an uncertain science.
Saji N. Hameed
Discovered at the very end of the 20th century, the Indian Ocean Dipole (IOD) is a mode of natural climate variability that arises out of coupled ocean–atmosphere interaction in the Indian Ocean. It is associated with some of the largest changes of ocean–atmosphere state over the equatorial Indian Ocean on interannual time scales. IOD variability is prominent during the boreal summer and fall seasons, with its maximum intensity developing at the end of the boreal-fall season. Between the peaks of its negative and positive phases, IOD manifests a markedly zonal see-saw in anomalous sea surface temperature (SST) and rainfall—leading, in its positive phase, to a pronounced cooling of the eastern equatorial Indian Ocean, and a moderate warming of the western and central equatorial Indian Ocean; this is accompanied by deficit rainfall over the eastern Indian Ocean and surplus rainfall over the western Indian Ocean. Changes in midtropospheric heating accompanying the rainfall anomalies drive wind anomalies that anomalously lift the thermocline in the equatorial eastern Indian Ocean and deepen it in the central Indian Ocean. The thermocline anomalies further modulate coastal and open-ocean upwelling, thereby influencing biological productivity and fish catches across the Indian Ocean. The hydrometeorological anomalies that accompany IOD exacerbate forest fires in Indonesia and Australia and bring floods and infectious diseases to equatorial East Africa. The coupled ocean–atmosphere instability that is responsible for generating and sustaining IOD develops on a mean state that is strongly modulated by the seasonal cycle of the Austral-Asian monsoon; this setting gives the IOD its unique character and dynamics, including a strong phase-lock to the seasonal cycle.
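The zonal SST see-saw described above is conventionally tracked with the Dipole Mode Index (DMI), defined as the difference in area-averaged SST anomalies between a western and a southeastern equatorial Indian Ocean box:

```latex
\mathrm{DMI} \;=\; \mathrm{SSTA}\big(50^{\circ}\mathrm{E}\text{–}70^{\circ}\mathrm{E},\; 10^{\circ}\mathrm{S}\text{–}10^{\circ}\mathrm{N}\big)
\;-\; \mathrm{SSTA}\big(90^{\circ}\mathrm{E}\text{–}110^{\circ}\mathrm{E},\; 10^{\circ}\mathrm{S}\text{–}0^{\circ}\big)
```

where SSTA denotes the area-averaged sea surface temperature anomaly over each box; positive DMI values correspond to the positive IOD phase (cool east, warm west). The box boundaries given here follow the widely used convention, though individual studies sometimes adopt slightly different regions.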
While the IOD operates independently of the El Niño-Southern Oscillation (ENSO), the proximity of the Indian and Pacific Oceans and the existence of oceanic and atmospheric pathways facilitate mutual interactions between these tropical climate modes.
The Chinese meteorological records can be traced back to the oracle-bone inscriptions of the Shang Dynasty (c. 1600–1046 BCE).
Modern meteorological knowledge began to be introduced into China during the late Ming Dynasty (1368–1644).
Previous research has reconstructed chronologies of temperature change in China during the past 2,000 years, identifying the Medieval Warm Period and the Little Ice Age. With regard to precipitation variability, yearly charts of dryness/wetness in China for the past 500 years have been produced. Several chronologies of dust storms, plum rain (Meiyu), and typhoons have also been established. Large volcanic eruptions caused short-term abrupt cooling in China during the past 2,000 years. Climatic change was significantly related to war occurrences and dynastic cycles in historical China.
Timothy M. Shanahan
West Africa is among the most populated regions of the world, and it is predicted to continue to have one of the fastest growing populations in the first half of the 21st century. More than 35% of its GDP comes from agricultural production, and a large fraction of the population faces chronic hunger and malnutrition. Its dependence on rainfed agriculture is compounded by extreme variations in rainfall, including both droughts and floods, which appear to have become more frequent. As a result, it is considered a region highly vulnerable to future climate changes. At the same time, CMIP5 model projections for the next century show a large spread in precipitation estimates for West Africa, making it impossible to predict even the direction of future precipitation changes for this region. To improve predictions of future changes in the climate of West Africa, a better understanding of past changes, and their causes, is needed. Long climate and vegetation reconstructions, extending back to 5−8 Ma, demonstrate that changes in the climate of West Africa are paced by variations in the Earth’s orbit, and point to a direct influence of changes in low-latitude seasonal insolation on monsoon strength. However, the controls on West African precipitation reflect the influence of a complex set of forcing mechanisms, which can differ regionally in their importance, especially when insolation forcing is weak. During glacial intervals, when insolation changes are muted, millennial-scale dry events occur across North Africa in response to reorganizations of the Atlantic circulation associated with high-latitude climate changes. On centennial timescales, a similar response is evident, with cold conditions during the Little Ice Age associated with a weaker monsoon, and warm conditions during the Medieval Climate Anomaly associated with wetter conditions. Land surface properties play an important role in enhancing changes in the monsoon through positive feedback. 
In some cases, such as the mid-Holocene, the feedback led to abrupt changes in the monsoon, but the response is complex and spatially heterogeneous. Despite advances made in recent years, our understanding of West African monsoon variability remains limited by the dearth of continuous, high-resolution, and quantitative proxy reconstructions, particularly from terrestrial sites.
The history of the Russian Magneto-Meteorological Observatory (RMMO) in Beijing has not been extensively researched. Sources for this information are Russian (the Russian State Historical Archive, the Saint Petersburg Branch of the Archive of the Academy of Sciences, the Russian National Library) and Chinese (the First Historical Archive of Beijing, the Library of the Shanghai Zikawei Observatory) archives. These archival materials can be analyzed scientifically and methodologically. At the beginning of the 18th century, the Russian Orthodox Mission (ROM) was founded in Beijing. Existing until 1955, the ROM played an important role in the development of Russian–Chinese relations. Russian scientists could work in Beijing only through the ROM, owing to China’s policy of strict self-isolation. The ROM became the center of Chinese academic studies and the first training school for Russian sinologists. From its very beginning, it was considered not only a church or diplomatic mission but also a research center in close cooperation with the Russian Academy of Sciences. In this context, the RMMO made important weather investigations in China and the Far East in the 19th century. The RMMO and its branch stations in China and Mongolia formed a scientific network that represented an important link between Europe and Asia and was probably the largest geographical scientific network in the world at that time.
Deborah R. Coen
The advent of climate science can be defined as the historical emergence of a research program to study climate according to a modern definition of climate. Climate in this sense: (1) refers not simply to the average state of the atmosphere but also to its variability; (2) is multiscalar, concerned with phenomena ranging from the very small and fast to the very large and slow; and (3) is understood to be influenced by the oceans, lithosphere, cryosphere, and biosphere. Most accounts of the history of climate science to date have focused on the development of computerized general circulation models since World War Two. However, following this definition, the advent of climate science occurred well before the computer age. This entry therefore seeks to dispel the image of climate science as a recent invention and as the preserve of an exclusive, North American elite. The historical roots of today’s knowledge of climate change stretch surprisingly far back into the past and clear across the world, though the geographic focus here is on Europe and North America. The modern science of climate emerged out of interactions between learned and vernacular knowledge traditions, and has simultaneously appropriated and undermined traditional and indigenous forms of climate knowledge. Important precedents emerged in the 17th and 18th centuries, and it was in the late 19th century that a modern science of climate coalesced into a coordinated research program in part through the unification of divergent knowledge traditions around standardized techniques of measurement and analysis.
Anjuli S. Bamzai
In the years following the Second World War, the U.S. government played a prominent role in the support of basic scientific research. The National Science Foundation (NSF) was created in 1950 with the primary mission of supporting fundamental science and engineering, excluding medical sciences. Over the years, the NSF has operated from the “bottom up,” keeping close track of research around the United States and the world while maintaining constant contact with the research community to identify ever-moving horizons of inquiry.
In the 1950s, the field of meteorology was something of a poor cousin to the other branches of science; forecasting was considered more of a trade than a discipline grounded in sound theory. Realizing the importance of the field to both the economy and national security, the NSF leadership made a concerted effort to enhance understanding of the global atmospheric circulation. The National Center for Atmospheric Research (NCAR) was established to complement ongoing research efforts in academic institutions; it has played a pivotal role in providing observational and modeling tools to the emerging cadre of researchers in the disciplines of meteorology and atmospheric sciences. As understanding of the predictability of the coupled atmosphere-ocean system grew, the field of climate science emerged as a natural outgrowth of meteorology, oceanography, and atmospheric sciences.
The NSF played a leading role in the implementation of major international programs such as the International Geophysical Year (IGY), the Global Weather Experiment, the World Ocean Circulation Experiment (WOCE) and Tropical Ocean Global Atmosphere (TOGA). Through these programs, understanding of the coupled climate system comprising atmosphere, ocean, land, ice-sheet, and sea ice greatly improved. Consistent with its mission, the NSF supported projects that advanced fundamental knowledge of forcing and feedbacks in the coupled atmosphere-ocean-land system. Research projects have included theoretical, observational, and modeling studies of the following: the general circulation of the stratosphere and troposphere; the processes that govern climate; the causes of climate variability and change; methods of predicting climate variations; climate predictability; development and testing of parameterization of physical processes; numerical methods for use in large-scale climate models; the assembly and analysis of instrumental and/or modeled climate data; data assimilation studies; and the development and use of climate models to diagnose and simulate climate variability and change.
Climate scientists work together on an array of topics spanning time scales from the seasonal to the centennial. The NSF also supports research on the natural evolution of the earth’s climate on geological time scales, with the goal of providing a baseline for present variability and future trends. The development of paleoclimate data sets has provided longer-term data for evaluating model simulations, analogous to evaluation against instrumental observations. This has enabled scientists to create transformative syntheses of paleoclimate data and modeling outcomes in order to understand the longer-term, higher-magnitude variability of the climate system that is observed in the geological records.
The NSF will continue to address emerging issues in climate and earth-system science through balanced investments in transformative ideas, enabling infrastructure, and major facilities.
International climate negotiations seek to limit warming to an average of two degrees Celsius (2°C). This objective is justified by the claim that scientists have identified two degrees of warming as the point at which climate change becomes dangerous. Climate scientists themselves maintain that while science can provide projections of possible impacts at different levels of warming, determining what constitutes an acceptable level of risk is not a matter to be decided by science alone, but is a value choice to be deliberated upon by societies as a whole. Hence, while climate science can inform debates about how much warming is too much, it cannot provide a definitive answer to that question. In order to fully understand how climate change came to be defined as a phenomenon with a single global dangerous limit of 2°C, it is necessary to incorporate insights from the social sciences.
Political economy, culture, economics, sociology, geography, and social psychology have all played a role in defining what constitutes an acceptable level of climate risk. These perspectives can be applied through the framework of institutional analysis to examine reports from the Intergovernmental Panel on Climate Change and other international organizations. This interdisciplinary approach offers the potential to provide a comprehensive history of how climate science has been interpreted in policy making. An interdisciplinary analysis is also essential in order to move beyond historical description to provide a narrative of considerable explanatory power. Such insights offer a valuable framework for considering current debates about whether or not it will be possible to limit warming to 2°C.
Vienna was a metropolis in the middle of the Danube monarchy of Austria-Hungary and under the rule (1848–1916) of Emperor Franz Joseph I (1830–1916) the city experienced rapid growth and an unprecedented flowering of culture, the arts, architecture and science. The capital of the monarchy, an intellectual melting pot, was a city of distinguished personalities who formed the Second Viennese School of music, the Austrian School of economic thought and many more doctrines, including the ideas of Sigmund Freud, the founder of psychoanalysis. Vienna clearly reflected the zeitgeist of the fin de siècle in its economic, scientific, and cultural heyday.
At the end of the 19th century, meteorology and climatology became recognized scientific disciplines, and dynamical meteorology developed during the first quarter of the 20th century. The fact that imperial Austria took a leading position in these developments mostly owes to the work of renowned scientists of the Central Institute for Meteorology and Geodynamics (Zentralanstalt für Meteorologie und Geodynamik, ZAMG) in Vienna.
The institute was founded in 1851, and the astronomer Karl Kreil (1798–1862) became its first director. One of Kreil’s goals was to ensure that both the central meteorological station and the growing number of new meteorological stations across the entire territory of the Austrian Empire were equipped with all the appropriate instruments. Another important goal was to process the existing observations for publication in the institute’s yearbooks. In effect, that was the starting signal for all further scientific developments, including that of the Viennese School of Climatology.
During the first decade of the 1900s, Julius Hann (1839–1921), the third director of the ZAMG, was already acknowledged as a renowned meteorologist and climatologist. He was a pioneer in gathering and synthesizing global climatological and meteorological data, and his Handbook of Climatology (Handbuch der Klimatologie; Hann, 1883 [Hann, J. (1883). Handbuch der Klimatologie. Stuttgart, Germany: J. Engelhorn]) and Textbook of Meteorology (Hann, 1901 [Hann, J. (1901). Lehrbuch der Meteorologie. Leipzig, Germany: C. H. Tauchnitz]) were standard setters (Davies, 2001 [Davies, H. C. (2001). Vienna and the founding of dynamical meteorology. In C. Hammerl, W. Lenhardt, R. Steinacker, & P. Steinhauser (Eds.), Die Zentralanstalt für Meteorologie und Geodynamik 1851–2001: 150 Jahre Meteorologie und Geophysik in Österreich (pp. 301–312). Graz, Austria: Leykam Buchverlagsgesellschaft]). In Hann’s era, one began to speak of a “Viennese or Austrian school.” Heinrich Ficker, who later became director of the institute, characterized it as a school that did not simply adhere to one direction but promoted every direction and every particular talent, and in which a meteorologist with the necessary qualities was always present at key turning points in meteorological research.