
Article

The annual United Nations Climate Change Conferences, officially called Conferences of the Parties (COPs), are the main drivers of media attention to climate change around the world. Even more so than the Rio and Rio+20 “Earth Summits” (1992 and 2012) and the meetings of the Intergovernmental Panel on Climate Change (IPCC), the COPs offer multiple access points for the communicative engagement of all kinds of stakeholders. COPs convene up to 20,000 people in one place for two weeks, including national delegations, civil society and business representatives, scientific organizations, representatives from other international organizations, as well as journalists from around the world. While intergovernmental negotiation under the auspices of the UN Framework Convention on Climate Change (UNFCCC) constitutes the core of COP business, these multifunctional events also offer arenas for civil society mobilization, economic lobbying, as well as expert communication and knowledge transfer. The media image of the COPs emerges as a product of distinct networks of coproduction constituted by journalists, professional communicators from non-governmental organizations (NGOs), and national delegations. Production structures at the COPs are relatively globalized, with uniform access rules for journalists from all over the world, a few transnational news agencies dominating distribution of both basic information and news visuals, and dense localized interaction between public relations (PR) professionals and journalists. Photo opportunities created by globally coordinated environmental NGOs match journalists’ selection criteria much better than the visual strategies pursued by delegation spokespeople. This gives NGOs the upper hand in the visual framing contest, whereas in textual framing NGOs are sidelined and national politicians clearly dominate media coverage.
The globalized production environment leads to relatively similar patterns of basic news framing in national media coverage of the COPs that reflect overarching ways of approaching the topic: through a focus on problems and victims; a perspective on civil society demands and solutions; an emphasis on conflict in negotiations; or a focus on the benefits of clean energy production. News narratives, on the other hand, give journalists from different countries more leeway in adapting COP news to national audiences’ presumed interests and preoccupations. Even after the adoption of a new global treaty at COP21 in Paris in 2015 that specifies emission reduction targets for all participating countries, the annual UN Climate Change Conferences are likely to remain in the media spotlight. Future research could look more systematically at the impact of global civil society and media in monitoring the national contributions to climate change mitigation introduced in the Paris Agreement and shoring up even more ambitious commitments needed to reach the goal of keeping global warming well below 2 degrees Celsius as compared to pre-industrial levels.

Article

Environmental organizations have been critically important in publicizing and supplying arguments about climate change, just as with the other environmental issues facing contemporary societies. In their campaigns and activism, environmental groups need to be able to make influential and widely circulated claims about the state of the natural world or the ecological impact of human activities. To do this, they have to “manage” their relationship to science. Environmentalists (in contrast to many other campaigners) are obliged to be science communicators because the convincingness of their message depends on the underlying presumption that their claims have a basis in factual, scientific accuracy. Facing the science and communication challenges of climate change, environmentalists have often found their role to be an unusual one. Unlike in most other ecological campaign areas, they have been committed to defending or bolstering mainstream scientific opinion about the nature and causes of climate change. Nonetheless, they have sought ways of distancing themselves from some of the policy and technological options apparently favored by leading scientific figures. And they have pioneered approaches based more on long-term investment strategies and normative values which, to some degree, allow them to sidestep difficulties associated with the adoption of a subordinate role in the science communication arena.

Article

Community-based adaptation (CBA) to climate change is an approach to adaptation that aims to include vulnerable people in the design and implementation of adaptation measures. The most obvious forms of CBA include simple, but accessible, technologies such as storing freshwater during flooding or raising the level of houses near the sea. It can also include more complex forms of social and economic resilience such as increasing access to a wider range of livelihoods or reducing the vulnerability of social groups that are especially exposed to climate risks. CBA has been promoted by some development nongovernmental organizations (NGOs) and international agencies as a means of demonstrating the importance of participatory and deliberative methods within adaptation to climate change, and the role of longer-term development and social empowerment as ways of reducing vulnerability to climate change. Critics, however, have argued that focusing on “community” initiatives can often be romantic and can give the mistaken impression that communities are homogeneous when in fact they contain many inequalities and social exclusions. Accordingly, many analysts see CBA as an important, but insufficient, step toward the representation of vulnerable local people in climate change policy; they also argue that it offers useful lessons for a broader transformation toward socially inclusive forms of climate change policy, and toward seeing resilience to climate change as lying within socio-economic organization rather than in infrastructure and technology alone.

Article

Storms are characterized by high wind speeds; often large precipitation amounts in the form of rain, freezing rain, or snow; and thunder and lightning (for thunderstorms). Many different types exist, ranging from tropical cyclones and large storms of the midlatitudes to small polar lows, Medicanes, thunderstorms, or tornadoes. They can lead to extreme weather events like storm surges, flooding, high snow quantities, or bush fires. Storms often pose a threat to human lives and property, agriculture, forestry, wildlife, ships, and offshore and onshore industries. Thus, it is vital to gain knowledge about changes in storm frequency and intensity. Future storm predictions are important, and they depend to a great extent on the evaluation of changes in wind statistics of the past. To obtain reliable statistics, long and homogeneous time series over at least some decades are needed. However, wind measurements are frequently influenced by changes in the synoptic station, its location or surroundings, instruments, and measurement practices. These factors degrade the homogeneity of wind records. Storm indexes derived from measurements of sea-level pressure are less prone to such changes, as pressure does not show as much spatial variability as wind speed does. Long-term historical pressure measurements exist that enable us to deduce changes in storminess for more than the last 140 years. But storm records are not just compiled from measurement data; they also may be inferred from climate model data. The first numerical weather forecasts were performed in the 1950s. These served as a basis for the development of atmospheric circulation models, which were the first generation of climate models or general-circulation models. Soon afterward, model data were analyzed for storm events and cyclone-tracking algorithms were programmed. Climate models nowadays have reached high resolution and reliability and can be run not just for the past, but also for future emission scenarios, yielding projections of possible future storm activity.
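The detection step of a cyclone-tracking algorithm can be sketched as a search for local minima in a gridded sea-level pressure field. This is a minimal illustration under simplifying assumptions, not an operational tracker; the function name and the depth threshold are invented for the example:

```python
import numpy as np

def find_pressure_minima(slp, threshold=1000.0):
    """Return (row, col) indices of grid cells that are strict local
    minima of sea-level pressure (hPa) and fall below a depth
    threshold -- candidate cyclone centers for a tracking algorithm."""
    centers = []
    nrow, ncol = slp.shape
    for i in range(1, nrow - 1):
        for j in range(1, ncol - 1):
            window = slp[i - 1:i + 2, j - 1:j + 2]
            # Strict minimum: all 8 neighbors are higher than the center
            if slp[i, j] < threshold and (window > slp[i, j]).sum() == 8:
                centers.append((i, j))
    return centers

# Idealized SLP field: uniform 1012 hPa background with one 980 hPa low
field = np.full((20, 20), 1012.0)
field[7, 12] = 980.0
print(find_pressure_minima(field))  # [(7, 12)]
```

A real tracker would then link such centers across consecutive time steps, subject to constraints on plausible propagation speed and direction.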

Article

Content analysis is one of the most frequently used methods in climate change communication research. Studies implementing content analysis investigate how climate change is presented in mass media or other communication content. Quantitative content analysis develops a standardized codebook to code content systematically, which then allows for statistical analysis. Qualitative analysis relies on interpretative methods and a closer reading of the material, often using hermeneutic approaches and taking linguistic features of the text more into account than quantitative analysis. While quantitative analysis—particularly if conducted automatically—can comprise larger samples, qualitative analysis usually entails smaller samples, as it is more detailed. Different types of material—whether online content, campaign material, or climate change imagery—bring about different challenges that need to be considered when drawing samples of the material for content analysis. To evaluate the quality of a content analysis, measures of reliability and validity are used. Key themes in content analyses of climate change communication are the media’s attention to climate change and the different points of view on global warming present in the media coverage. Challenges for content analysis as a method for assessing climate change communication arise from the lack of comparability of the various studies that exist. Multimodal approaches are being developed to attend to both textual and visual content simultaneously in content analyses of climate change communication.
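A standard reliability check for a quantitative codebook is chance-corrected agreement between two coders. As a hedged sketch (the frame labels and data are invented for the example), Cohen's kappa for nominal codes can be computed as follows:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' nominal codes on the same items:
    observed agreement corrected for the agreement expected by chance."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement from each coder's marginal label frequencies
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two coders assigning one frame to each of ten articles
a = ["conflict", "science", "conflict", "policy", "science",
     "conflict", "policy", "science", "conflict", "policy"]
b = ["conflict", "science", "science", "policy", "science",
     "conflict", "policy", "science", "conflict", "conflict"]
print(round(cohens_kappa(a, b), 3))  # 0.697
```

Values near 1 indicate reliable coding; many projects also report Krippendorff's alpha, which generalizes to more coders and missing data.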

Article

William Joseph Gutowski and Filippo Giorgi

Regional climate downscaling has been motivated by the objective to understand how climate processes not resolved by global models can influence the evolution of a region’s climate and by the need to provide climate change information to other sectors, such as water resources, agriculture, and human health, on scales poorly resolved by global models but where impacts are felt. There are four primary approaches to regional downscaling: regional climate models (RCMs), empirical statistical downscaling (ESD), variable resolution global models (VARGCM), and “time-slice” simulations with high-resolution global atmospheric models (HIRGCM). Downscaling using RCMs is often referred to as dynamical downscaling to contrast it with statistical downscaling. Although there have been efforts to coordinate each of these approaches, the predominant effort to coordinate regional downscaling activities has involved RCMs. Initially, downscaling activities were directed toward specific, individual projects. Typically, there was little similarity between these projects in terms of focus region, resolution, time period, boundary conditions, and phenomena of interest. The lack of coordination hindered evaluation of downscaling methods, because sources of success or problems in downscaling could be specific to model formulation, phenomena studied, or the method itself. This prompted the organization of the first dynamical-downscaling intercomparison projects in the 1990s and early 2000s. These programs and several others following provided coordination focused on an individual region and an opportunity to understand sources of differences between downscaling models while overall illustrating the capabilities of dynamical downscaling for representing climatologically important regional phenomena. However, coordination between programs was limited. 
Recognition of the need for further coordination led to the formation of the Coordinated Regional Downscaling Experiment (CORDEX) under the auspices of the World Climate Research Programme (WCRP). Initial CORDEX efforts focused on establishing a common framework for carrying out dynamically downscaled simulations over multiple regions around the world. This framework has now become an organizing structure for downscaling activities around the world. Further efforts under the CORDEX program have sharpened the program’s scientific focus on topics such as assessing added value in downscaling, regional human influences on climate, coupled ocean–land–atmosphere modeling, precipitation systems, extreme events, and local wind systems. In addition, CORDEX is promoting expanded efforts to compare capabilities of all downscaling methods for producing regional information. The efforts are motivated in part by the scientific goal of thoroughly understanding regional climate and its change and by the growing need for climate information to assist climate services for a multitude of climate-impacted sectors.

Article

Scientific agreement on climate change has strengthened over the past few decades, with around 97% of publishing climate scientists agreeing that human activity is causing global warming. While scientific understanding has strengthened, a small but persistent proportion of the public actively opposes the mainstream scientific position. A number of factors contribute to this rejection of scientific evidence, with political ideology playing a key role. Conservative think tanks, supported with funding from vested interests, have been and continue to be a prolific source of misinformation about climate change. A major strategy by opponents of climate mitigation policies has been to cast doubt on the level of scientific agreement on climate change, contributing to the gap between public perception of scientific agreement and the 97% expert consensus. This “consensus gap” decreases public support for mitigation policies, demonstrating that misconceptions can have significant societal consequences. While scientists need to communicate the consensus, they also need to be aware of the fact that misinformation can interfere with the communication of accurate scientific information. As a consequence, neutralizing the influence of misinformation is necessary. Two approaches to neutralize misinformation involve refuting myths after they have been received by recipients (debunking) or preemptively inoculating people before they receive misinformation (prebunking). Research indicates preemptive refutation or “prebunking” is more effective than debunking in reducing the influence of misinformation. Guidelines to practically implement responses (both preemptive and reactive) can be found in educational research, cognitive psychology, and a branch of psychological research known as inoculation theory. Synthesizing these separate lines of research yields a coherent set of recommendations for educators and communicators. 
Clearly communicating scientific concepts, such as the scientific consensus, is important, but scientific explanations should be coupled with inoculating explanations of how that science can be distorted.

Article

The 2°C target for global warming had been under severe scrutiny in the run-up to the climate negotiations in Paris in 2015 (COP21). Clearly, with a remaining carbon budget of 470–1,020 GtCO2eq from 2015 onwards for a 66% probability of stabilizing at concentration levels consistent with remaining below 2°C warming at the end of the 21st century and yearly emissions of about 40 GtCO2 per year, not much room is left for further postponing action. Many of the low stabilization pathways actually resort to the extraction of CO2 from the atmosphere (known as negative emissions or Carbon Dioxide Removal [CDR]), mostly by means of Bioenergy with Carbon Capture and Storage (BECCS): if the biomass feedstock is produced sustainably, the emissions would be low or even carbon-neutral, as the additional planting of biomass would sequester about as much CO2 as is generated during energy generation. If additionally carbon capture and storage is applied, then the emissions balance would be negative. Large BECCS deployment thus facilitates reaching the 2°C target, also allowing for some flexibility in other sectors that are difficult to decarbonize rapidly, such as the agricultural sector. However, the large reliance on BECCS has raised uneasiness among policymakers, the public, and even scientists, with risks to sustainability being voiced as the prime concern. For example, the large-scale deployment of BECCS would require vast areas of land to be set aside for the cultivation of biomass, which is feared to conflict with conservation of ecosystem services and with ensuring food security in the face of a still growing population. 
While the progress that has been made in Paris leading to an agreement on stabilizing “well below 2°C above pre-industrial levels” and “pursuing efforts to limit the temperature increase to 1.5°C” was mainly motivated by the extent of the impacts, which are perceived to be unacceptably high for some regions already at lower temperature increases, it has to be taken with a grain of salt: moving to 1.5°C will further shrink the time frame to act, and BECCS will play an even bigger role. In fact, aiming at 1.5°C will substantially reduce the remaining carbon budget previously indicated for reaching 2°C. Recent research on the biophysical limits to BECCS and also other negative emissions options such as Direct Air Capture indicates that they all run into their respective bottlenecks—BECCS with respect to land requirements (though it produces bioenergy as a side product), while Direct Air Capture does not need much land, but is more energy-intensive. In order to provide for the negative emissions needed for achieving the 1.5°C target in a sustainable way, a portfolio of negative emissions options is needed that minimizes unwanted effects on non–climate policy goals.
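The urgency argument above rests on simple budget arithmetic: dividing the remaining carbon budget by current annual emissions gives the years of headroom left if emissions stay constant. A minimal sketch of that back-of-envelope calculation, using the figures quoted above (the function name is illustrative):

```python
def years_remaining(budget_gtco2, annual_gtco2=40.0):
    """Years of headroom left under the simplifying assumption that
    annual emissions stay constant at their current level."""
    return budget_gtco2 / annual_gtco2

# 2°C budget range quoted from 2015 onward: 470-1,020 GtCO2,
# against emissions of about 40 GtCO2 per year
low, high = years_remaining(470), years_remaining(1020)
print(f"{low:.1f} to {high:.1f} years")  # 11.8 to 25.5 years
```

In reality emissions trajectories are not constant, so such figures only frame the scale of the challenge; a 1.5°C budget shrinks the result further.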

Article

Rasmus Fensholt, Cheikh Mbow, Martin Brandt, and Kjeld Rasmussen

In the past 50 years, human activities and climatic variability have caused major environmental changes in the semi-arid Sahelian zone, and desertification/degradation of arable lands is of major concern for livelihoods and food security. In the wake of the Sahel droughts in the early 1970s and 1980s, the UN focused on the problem of desertification by organizing the UN Conference on Desertification (UNCOD) in Nairobi in 1976. This fuelled a significant increase in the often alarmist popular accounts of desertification as well as scientific efforts in providing an understanding of the mechanisms involved. The global interest in the subject led to the nomination of desertification as the focal point for one of three international environmental conventions: the UN Convention to Combat Desertification (UNCCD), emerging from the Rio conference in 1992. This implied that substantial efforts were made to quantify the extent of desertification and to understand its causes. Desertification is a complex and multi-faceted phenomenon aggravating poverty that can be seen as both a cause and a consequence of land resource depletion. As reflected in its definition adopted by the UNCCD, desertification is “land degradation in arid, semi-arid[,] and dry sub-humid areas resulting from various factors, including climate variation and human activities” (UN, 1992). While desertification was seen as a phenomenon of relevance to drylands globally, the Sahel-Sudan region remained a region of specific interest, and a significant amount of scientific effort has been invested to provide an empirically supported understanding of both climatic and anthropogenic factors involved. Despite decades of intensive research on human–environmental systems in the Sahel, there is no overall consensus about the severity of desertification, and the scientific literature is characterized by a range of conflicting observations and interpretations of the environmental conditions in the region.
Earth Observation (EO) studies generally show a positive trend in rainfall and vegetation greenness over the last decades for the majority of the Sahel. This has been interpreted as an increase in biomass, contradicting narratives of a vicious cycle of widespread degradation caused by human overuse and climate change. Even though an increase in vegetation greenness, as observed from EO data, can be confirmed by ground observations, long-term assessments of biodiversity at finer spatial scales highlight a negative trend in species diversity in several studies, and overall it remains unclear whether the observed positive trends represent an environmental improvement with positive effects on people’s livelihoods.

Article

Individuals, both within and between different countries, vary substantially in the extent to which they view climate change as a risk. What could explain such variation in climate change risk perception around the world? Climate change is relatively unique as a risk in the sense that it is difficult for people to experience directly or even detect on a purely perceptual or sensory level. In fact, research across the social and behavioral sciences has shown that although people might correctly perceive some changes in long-term climate conditions, psychological factors are often much more influential in determining how the public perceives the risk of climate change. Indeed, decades of research have shown that cognitive, affective, social, and cultural factors all greatly influence the public’s perception of risk, and that these factors, in turn, often interact with each other in complex ways. Yet, although a wide variety of cognitive, experiential, socio-cultural, and demographic characteristics have all proven to be relevant, are there certain factors that systematically stand out in explaining and predicting climate change risk perception around the world? And even if so, what do we mean, exactly, by the term “risk perception,” and to what extent does the way in which risk perception is measured influence the outcome? Last but certainly not least, how important is public concern about climate change in determining people’s level of behavioral engagement and policy support for the issue?

Article

Dramatic climate changes have occurred in the Baltic Sea region caused by changes in orbital movement in the earth–sun system and the melting of the Fennoscandian Ice Sheet. Added to these longer-term changes, changes have occurred at all timescales, caused mainly by variations in large-scale atmospheric pressure systems due to competition between the meandering midlatitude low-pressure systems and high-pressure systems. Here we follow the development of climate science of the Baltic Sea from when observations began in the 18th century to the early 21st century. The question of why the water level is sinking around the Baltic Sea coasts could not be answered until the ideas of postglacial uplift and the thermal history of the earth were better understood in the 19th century and periodic behavior in climate-related time series attracted scientific interest. Herring and sardine fishing successes and failures have led to investigations of fishery and climate change and to the realization that fisheries themselves have strongly negative effects on the marine environment, calling for international assessment efforts. Scientists later introduced the concept of regime shifts when interpreting their data, attributing these to various causes. The increasing amount of anoxic deep water in the Baltic Sea and eutrophication have prompted debate about what is natural and what is anthropogenic, and the scientific outcome of these debates now forms the basis of international management efforts to reduce nutrient leakage from land. The observed increase in atmospheric CO2 and its effects on global warming have focused the climate debate on trends and generated a series of international and regional assessments and research programs that have greatly improved our understanding of climate and environmental changes, bolstering the efforts of earth system science, in which both climate and environmental factors are analyzed together.
Major achievements of past centuries have included developing and organizing regular observation and monitoring programs. The free availability of data sets has supported the development of more accurate forcing functions for Baltic Sea models and made it possible to better understand and model the Baltic Sea–North Sea system, including the development of coupled land–sea–atmosphere models. Most indirect and direct observations of the climate find great variability and stochastic behavior, so conclusions based on short time series are problematic, leading to qualifications about periodicity, trends, and regime shifts. Starting in the 1980s, systematic research into climate change has considerably improved our understanding of regional warming and multiple threats to the Baltic Sea. Several aspects of regional climate and environmental changes and how they interact are, however, unknown and merit future research.

Article

Nelya Koteyko and Dimitrinka Atanasova

Discourse analysis is an interdisciplinary field of inquiry that has been increasingly used by climate change communication scholars since the late 1990s. In its broadest sense, discourse analysis is the study of the social through analysis of language, including face-to-face talk, written media texts, and documents, as well as images and symbols. Studies in this field encompass a broad range of theories and analytic approaches for investigating meaning. Due to its focus on the sociocultural and political context in which text and talk occur, discourse analysis is pertinent to the concerns of climate change communication scholars as it has the potential to reveal the ideological dimensions of stakeholder beliefs and the dissemination of climate change-related information in the media. In contrast to studies under the rubric of frame analysis and survey-based analyses of public perceptions, this research places emphasis on the situated study of different stakeholders involved in climate change communication. Here attention is paid not only to the content being communicated (e.g., themes) but also to the linguistic forms and contexts that shape language and interaction. Both of these require an understanding of audiences’ cultural, political, and socioeconomic conditions. From the participatory perspective, discourse analysis can therefore illuminate the moral, ethical, and cultural dimensions of the climate change issue.

Article

As climate change becomes an increasingly serious problem, mass media are tasked with educating the public. Documentary films and television shows (also called “edutainment”) have been used for decades to communicate about the natural world so that the public may become informed about science in a simplified, easy-to-understand way. Although producers ostensibly create environmental documentaries in order to inform and/or advocate, theory development and empirical research are limited and insufficient in explaining how this genre influences audiences and why this genre may or may not be an effective means of science communication. Environmental documentaries have the potential to deeply impact audiences because these films promote learning while viewers are entertained: engagement with the documentary narrative (story) can overcome biases such as politically driven motivated reasoning (conforming new evidence to existing beliefs) and can leverage biases such as the tendency to rely on affect (emotions) when estimating risks. Documentary storytelling can also enhance learning by connecting the causes and consequences of climate change in a sequential narrative. Climate change is a highly contentious political issue, which is reflected in the diversity of viewpoints found in climate change documentaries despite scientific consensus about the issue. While many of these films serve an educational purpose, others are geared toward advocacy. These advocacy programs aim to mobilize value-congruent audiences to engage in personal and collective action and/or to demand policy change. However, people prefer messages that align with their preexisting values, and so the belief disparity between climate change advocates and deniers grows with increasing media exposure as audiences with different beliefs watch and receive climate change messages in very different ways.
Filmmakers and scientists must focus future efforts on creating visually engaging narratives within documentaries to promote both education and advocacy to diverse audiences.

Article

Rasmus Benestad

What are the local consequences of a global climate change? This question is important for proper handling of risks associated with weather and climate. It also tacitly assumes that there is a systematic link between conditions taking place on a global scale and local effects. It is the utilization of the dependency of local climate on the global picture that is the backbone of downscaling; however, it is perhaps easiest to explain the concept of downscaling in climate research if we start by asking why it is necessary. Global climate models are our best tools for computing future temperature, wind, and precipitation (or other climatological variables), but their limitations do not let them calculate local details for these quantities. It is simply not adequate to interpolate from model results. However, the models are able to predict large-scale features, such as circulation patterns, El Niño Southern Oscillation (ENSO), and the global mean temperature. The local temperature and precipitation are nevertheless related to conditions taking place over a larger surrounding region as well as local geographical features. This, of course, also applies to other weather elements. Downscaling makes use of systematic dependencies between local conditions and large-scale ambient phenomena in addition to including information about the effect of the local geography on the local climate. The application of downscaling can involve several different approaches. This article will discuss various downscaling strategies and methods and will elaborate on their rationale, assumptions, strengths, and weaknesses. One important issue is the presence of spontaneous natural year-to-year variations that are not necessarily directly related to the global state, but are internally generated and superimposed on the long-term climate change.
These variations typically involve phenomena such as ENSO, the North Atlantic Oscillation (NAO), and the Southeast Asian monsoon, which are nonlinear and non-deterministic. We cannot predict the exact evolution of non-deterministic natural variations beyond a short time horizon. It is possible nevertheless to estimate probabilities for their future state based, for instance, on projections with models run many times with slightly different setups, and thereby to get some information about the likelihood of future outcomes. When it comes to downscaling and predicting regional and local climate, it is important to use many global climate model predictions. Another important point is to apply proper validation to make sure the models give skillful predictions. For some downscaling approaches, such as regional climate models, there usually is a need for bias adjustment due to model imperfections. This means the downscaling does not get the right answer for the right reason. Some of the explanations for the presence of biases in the results may be different parameterization schemes in the driving global and the nested regional models. A final underlying question is: What can we learn from downscaling? The context for the analysis is important, as downscaling is often used to find answers to some (implicit) question and can be a means of extracting most of the relevant information concerning the local climate. It is also important to include discussions about uncertainty, model skill or shortcomings, model validation, and skill scores.
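One widely used form of the bias adjustment mentioned above is empirical quantile mapping: model output is adjusted so that its distribution matches observations over a calibration period, and the same mapping is then applied to future simulations. A minimal sketch with synthetic data (the function name, the bias, and the variability factor are invented for the example):

```python
import numpy as np

def quantile_map(model_hist, obs, model_future):
    """Empirical quantile mapping: for each future value, find its
    quantile in the historical model climate, then read off the
    observed value at that same quantile."""
    quantiles = np.linspace(0, 1, 101)
    model_q = np.quantile(model_hist, quantiles)
    obs_q = np.quantile(obs, quantiles)
    return np.interp(model_future, model_q, obs_q)

# Pretend the model runs 2 degrees too cold with damped variability
obs = np.random.default_rng(1).normal(10.0, 3.0, 1000)
model_hist = (obs - 10.0) * 0.8 + 8.0  # biased version of the truth
adjusted = quantile_map(model_hist, obs, model_hist)
# After adjustment the mean bias is essentially removed
print(bool(abs(float(np.mean(adjusted) - np.mean(obs))) < 0.1))  # True
```

Applying a mapping calibrated on the past to a changed future climate assumes the bias is stationary, which is one reason bias-adjusted downscaling may not get the right answer for the right reason.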

Article

S.C. Pryor and A.N. Hahmann

Winds within the atmospheric boundary layer (i.e., near Earth’s surface) vary across a range of scales, from a few meters and sub-second timescales (the scales of turbulent motions) to the extremely large, long-period primary circulation patterns of the global atmosphere. Winds redistribute momentum and heat, and short- and long-term predictions of wind characteristics have applications in a number of socioeconomic sectors (e.g., engineering infrastructure). Despite its importance, atmospheric flow (i.e., wind) has been subject to less research within the climate downscaling community than variables such as air temperature and precipitation. However, there is growing recognition that wind storms are the single biggest source of “weather-related” insurance losses in Europe and North America in the contemporary climate, and that possible changes in wind regimes and intense wind events as a result of global climate non-stationarity are of importance to a variety of potential climate change feedbacks (e.g., emission of sea spray into the atmosphere), ecological impacts (such as wind throw of trees), and a number of other socioeconomic sectors (e.g., transportation infrastructure and operation, electricity generation and distribution, and structural design codes for buildings). Downscaling wind poses a number of specific challenges, including, but not limited to, the fact that wind has both magnitude (wind speed) and orientation (wind direction). Further, for most applications it is necessary to accurately downscale the full probability distribution of values at short timescales (e.g., hourly), including extremes, whereas the mean wind speed averaged over a month or year is of little utility. Dynamical, statistical, and hybrid approaches have been developed to downscale different aspects of the wind climate, but all carry large uncertainties in terms of high-impact aspects of the wind (e.g., extreme wind speeds and gusts).
The wind energy industry is a key consumer of downscaled wind parameters and has been a major driver of new techniques to increase their fidelity. Many opportunities remain to refine existing downscaling methods, to develop new approaches that improve the skill with which the spatiotemporal scales of wind variability are represented, and to devise new ways of evaluating skill in the context of wind climates.
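The point that the full probability distribution, not the long-term mean, is what matters for wind can be made concrete with a two-parameter Weibull fit, a common first approximation for wind-speed climates. This sketch uses synthetic data and a simple probability-plot regression; it is illustrative, not a method from the article:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "hourly" wind speeds (m/s) for one year: Weibull with shape
# k = 2 and scale 8 m/s, a typical first approximation for wind climates.
speeds = 8.0 * rng.weibull(2.0, size=8760)

# Fit the two-parameter Weibull by linear regression on the empirical
# CDF, using ln(-ln(1 - F)) = k ln(v) - k ln(scale).
v = np.sort(speeds)
f = (np.arange(1, len(v) + 1) - 0.5) / len(v)   # plotting positions
k, c = np.polyfit(np.log(v), np.log(-np.log(1 - f)), 1)
scale = np.exp(-c / k)

# The long-term mean hides the extremes relevant for loads and losses:
mean_speed = speeds.mean()
p99 = scale * (-np.log(0.01)) ** (1.0 / k)      # fitted 99th percentile
print(round(k, 2), round(scale, 1), round(mean_speed, 1), round(p99, 1))
```

The fitted 99th percentile sits far above the mean, which is why downscaling schemes are judged on how well they reproduce the tail, not just the average.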

Article

B.N. Goswami and Soumi Chakravorty

A lifeline for about one-sixth of the world’s population in the subcontinent, the Indian summer monsoon (ISM) is an integral part of the annual cycle of the winds (a reversal of winds with seasons), coupled with a strong annual cycle of precipitation (wet summer and dry winter). For over a century, the high socioeconomic impact of ISM rainfall (ISMR) in the region has driven scientists to attempt to predict its year-to-year variations. A remarkably stable phenomenon, making its appearance every year without fail, the ISM climate exhibits rather small year-to-year variation (the standard deviation of the seasonal mean is 10% of the long-term mean), yet it has proven an extremely challenging system to predict. Even the most skillful, sophisticated models are barely useful, with skill significantly below the potential limit of predictability. Understanding what drives the mean ISM climate and its variability on different timescales is therefore critical to advancing skill in predicting the monsoon. A conceptual ISM model helps explain what maintains not only the mean ISM but also its variability on interannual and longer timescales. The annual cycle of ISM precipitation can be described as a manifestation of the seasonal migration of the intertropical convergence zone (ITCZ), the zonally oriented cloud (rain) band, characterized by a sudden “onset.” The other important feature of the ISM is the deep meridional overturning circulation (a regional Hadley circulation) associated with it, driven primarily by the latent heat release of ISM (ITCZ) precipitation. The dynamics of the monsoon climate is therefore an extension of the dynamics of the ITCZ. The classical land–sea surface temperature gradient model of the ISM may explain the seasonal reversal of the surface winds, but it fails to explain the onset and the deep vertical structure of the ISM circulation.
While the surface temperature over land cools after the onset, reversing the north–south surface temperature gradient and making it inadequate to sustain the monsoon, it is the tropospheric temperature gradient that becomes positive at the time of onset and remains strongly positive thereafter, maintaining the monsoon. The change in sign of the tropospheric temperature (TT) gradient is dynamically responsible for a symmetric instability, leading to the onset and subsequent northward progression of the ITCZ. The unified ISM model in terms of the TT gradient provides a platform for understanding the drivers of ISM variability by identifying processes that affect TT in the north and the south and thereby influence the gradient. The predictability of the seasonal mean ISM is limited by interactions between the annual cycle and higher-frequency monsoon variability within the season. The monsoon intraseasonal oscillation (MISO) plays a seminal role in shaping the seasonal mean and its interannual variability. While the ISM climate on long timescales (e.g., multimillennial) largely follows the solar forcing, on shorter timescales ISM variability is governed by internal dynamics arising from ocean–atmosphere–land interactions, regional as well as remote, together with teleconnections to other climate modes. Also important is the relative role of anthropogenic forcing, such as greenhouse gases and aerosols, versus natural multidecadal variability in the recent six-decade-long decreasing trend of ISM rainfall.

Article

A. Johannes Dolman, Luis U. Vilasa-Abad, and Thomas A. J. Janssen

Drylands cover around 40% of the land surface on Earth and are inhabited by more than 2 billion people who depend directly on these lands. Drylands are characterized by a highly variable rainfall regime and by inherent vegetation–climate feedbacks that can enhance the resilience of the system but can also amplify disturbances. In that way, the system may become locked into one of two alternate stable states: one relatively wet and vegetated, the other dry and barren. The resilience of dryland ecosystems derives from a number of adaptive mechanisms by which the vegetation copes with prolonged water stress, such as hydraulic redistribution. The stochastic nature of both the vegetation dynamics and the rainfall regime is a key characteristic of these systems and affects their management in relation to the feedbacks. How the ecohydrology of the African drylands will change in the future depends on further changes in climate, human disturbance, land use, and the socioeconomic system.
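The idea of alternate stable states can be made concrete with a deliberately simple toy model, a conceptual sketch rather than anything from the article: vegetation cover grows when it is above a feedback threshold and collapses when it is below it, so two starting points under identical conditions end up in different states.

```python
# Toy bistable vegetation model: dV/dt = V (1 - V) (V - a), where V is
# fractional vegetation cover and 'a' is a hypothetical threshold below
# which the vegetation-climate feedback can no longer sustain the cover.
def simulate(v0, a=0.3, dt=0.1, steps=2000):
    v = v0
    for _ in range(steps):  # simple forward-Euler integration
        v += dt * v * (1 - v) * (v - a)
    return v

# Identical parameters, slightly different initial covers: the system
# locks into either the vegetated (V near 1) or barren (V near 0) state.
vegetated = simulate(0.35)
barren = simulate(0.25)
print(round(vegetated, 2), round(barren, 2))
```

A disturbance that pushes the cover across the threshold therefore flips the system between states, which is what makes such transitions hard to reverse.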

Article

How do economic conditions affect public opinion about climate change? Since the early days of the modern environmental movement, three main perspectives on how economic conditions shape environmental attitudes have been debated. The post-materialism perspective suggests that social and individual affluence leads, through long-run cultural change, to increasing concern and demands for action on climate change. A second view holds that attitudes about climate change are shaped largely independently of economic conditions and reflect the emergence of a new environmental paradigm. A third view, associated with ecological modernization theory, suggests that attitudes about climate change are shaped in important ways by short-term economic factors, such as economic self-interest, and are likely to vary among citizens over time. While all of these perspectives have merit, we emphasize the impact of macroeconomic risk and business cycle fluctuations in shaping public attitudes toward climate change and toward environmental policy more generally. Rising unemployment rates, for example, tend to be associated with declines in concern about environmental problems, a trend repeated across more than four decades and multiple recessions and recoveries dating back to the 1970s. Although climate change is a more recently recognized environmental problem, public attitudes about it are also considerably affected by short-run economic conditions, a fact that can influence the possibilities for policy reform. Through a process of motivated reasoning, in which immediate concerns and preferences to address economic risk lead individuals to adjust other attitudes about the environment, public concern about climate change has ebbed and flowed with the business cycle. Other economic factors, such as societal affluence, personal employment status, or income, have more limited effects on attitudes about climate change, at least in most developed countries.
The impact of economic risk on public attitudes about climate change has important implications for policy reform in democratic societies, because public support matters. While partisanship and ideology are frequently cited as explanations for fluctuating public opinion about climate change, macroeconomic risk offers a complementary explanation, which suggests that the framing and timing of environmental policy initiatives are as important as ideological acceptability. Positioning environmental actions or initiatives in better economic conditions, emphasizing immediate economic benefits, and countering unwarranted beliefs about personal costs, especially during challenging economic circumstances, should improve the prospects for efforts to address climate change.

Article

Over the past two decades, the global news industry has embarked upon a major project of economic, organizational, and technological restructuring. In organizational terms, successive waves of mergers and buyouts have yielded a global news landscape where most of the larger firms are owned by shareholders and run by executives whose singular focus is on rationalizing news production and improving profitability. Although in some cases these shareholders and executives have used their authority to influence climate coverage directly, more often their goals are non-ideological: reducing labor costs and increasing revenues. At the same time, in a parallel development, the digital media revolution not only has spawned a host of new online competitors but also has cut deeply into the advertising revenue once enjoyed by traditional media firms. Within legacy news organizations, these industrial and technological trends have converged to dramatically intensify the work pressures facing environmental journalists. For example, in an effort to reduce costs, many firms have reduced newsroom staff to a small core of multi-tasking reporters, supported by a wider web of part-time freelancers. In this process, the science and environment beat is often the first to go, with environmental specialists among the first to be reassigned or downsized (and pushed into freelance work). For all reporters, there is increased pressure to produce more stories in less time on multiple media platforms, a trend that, in turn, enhances the power of special interests to influence climate coverage through public relations and other external information subsidies. Due to these converging industrial and technological trends, environmental reporters now work in a new media ecosystem that is complex, subject to contradictory pressures, and in many ways hostile to the production of high-quality climate news.
When the environmental beat is cut, climate change often becomes the purview of general assignment reporters who lack experience and expertise. For their part, freelance specialists continue to cover climate news, but their ability to sustain this coverage over the long term is constrained by their part-time status. Finally, although niche climate blogs have provided welcome spaces for environmental journalists to produce in-depth coverage, these outlets usually reach only tiny audiences composed of the already-engaged. In short, without significant action, the regrettable status quo of climate news—that is, an episodic sprinkling of climate coverage scattered across the media ecosystem—will continue indefinitely. Policy-makers should therefore restore long-term institutional and economic support for environmental journalists specializing in climate science and policy.

Article

Climate change and fisheries have significantly changed the Baltic Sea ecosystem, with the demise of Eastern Baltic cod (Gadus morhua callarias) being the signature development. Cod in the Central Baltic Sea collapsed in the late 1980s as a result of low reproductive success and overfishing. Low recruitment, and hence small year-classes, could not compensate for fishing pressures far above sustainable levels. Recruitment failure can be mainly related to the absence of North Sea water inflows to the Central Baltic deep basins. These major Baltic inflows (MBIs) occurred regularly until the 1980s, when their frequency decreased to roughly one per decade, a development attributed to changes in atmospheric circulation patterns. MBIs are needed for ventilation of otherwise stagnating Baltic deep waters, and their absence caused reduced oxygen and salinity levels in cod-spawning habitats, limiting egg and larval survival. Climate change, on the other hand, has promoted a warmer environment richer in zooplanktonic food for larval Baltic sprat (Sprattus sprattus). The resulting large year-classes and low predation by the collapsed cod stock caused an outburst of the sprat stock that cascaded down to the zoo- and phytoplankton trophic levels. Furthermore, the large sprat population controlled cod recruitment, and hence hindered a recovery of the stock, by preying on cod eggs and limiting the food supply of cod larvae. The change in ecosystem structure and function caused by the collapse of the cod stock was a major part and driver of an ecosystem regime shift in the Central Baltic Sea during the period 1988 to 1993. This reorganization of ecosystem structure involved all trophic levels, from piscivorous and planktivorous fish to zoo- and phytoplankton. The observed large-scale ecosystem changes displayed the characteristics of a discontinuous regime shift, initiated by climate-induced changes in the abiotic environment and stabilized by feedback loops in the food web.
Discontinuous changes such as regime shifts are characteristically difficult to reverse, and the Baltic ecosystem now shows signs of increasing ecological novelty, for which the failed recovery of the cod stock despite a reduction in fishing pressure is a clear symptom. Unusually widespread oxygen-deficient conditions in major cod-spawning areas have altered the overall productivity of the population by negatively affecting growth and recruitment. Eutrophication, as a consequence of intensive agriculture, is the main driver of anoxia in the Baltic Sea, amplified by the effects of continuing climate change and stabilized by self-reinforcing feedbacks. The developing ecological novelty in the Baltic Sea hence requires cross-sectoral, ecosystem-based management approaches that truly integrate eutrophication abatement, species conservation, and living-resources management.