The El Niño Southern Oscillation is considered to be the most significant form of “natural” climate variability, although its definition and the scientific understanding of the phenomenon are continually evolving. Since its first recorded usage in 1891, the meaning of “El Niño” has morphed from a regular local current affecting coastal Peru, to an occasional Pacific-wide phenomenon that modifies weather patterns throughout the world, and finally to a diversity of weather patterns that share similarities in Pacific heating and changes in trade-wind intensity, but exhibit considerable variation in other ways. Since the 1960s El Niño has been associated with the Southern Oscillation, originally defined as a statistical relationship in pressure patterns across the Pacific by the British-Indian scientist Gilbert Walker. The first unified model for the El Niño-Southern Oscillation (ENSO) was developed by Jacob Bjerknes in 1969 and it has been updated several times since, but no simple model yet explains apparent diversity in El Niño events. ENSO forecasting is considered a success, but each event still displays surprising characteristics.
Aristita Busuioc and Alexandru Dumitrescu
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Climate Science.
The concept of statistical downscaling or empirical-statistical downscaling became a distinct and important scientific approach in climate science in recent decades, as the climate change issue and the assessment of climate change impacts on various social and natural systems have become international challenges. Global climate models are the best tools for estimating future climate conditions. Even if improvements can be made in state-of-the-art global climate models, in terms of spatial resolution and their performance in simulating climate characteristics, they are still skillful only in reproducing large-scale features of climate variability, such as global mean temperature or various circulation patterns (e.g., the North Atlantic Oscillation). However, these models are not able to provide reliable information on local climate characteristics (mean temperature, total precipitation), especially on extreme weather and climate events. The main reason for this failure is the influence of local geographical features on the local climate, as well as other factors related to surrounding large-scale conditions, influences that cannot be correctly taken into consideration by the current dynamical global models.
Impact models, such as hydrological and crop models, need high resolution information on various climate parameters on the scale of a river basin or a farm, scales that are not available from the usual global climate models. Downscaling techniques produce regional climate information on finer scale, from global climate change scenarios, based on the assumption that there is a systematic link between the large-scale and local climate. Two types of downscaling approaches are known: a) dynamical downscaling is based on regional climate models nested in a global climate model; and b) statistical downscaling is based on developing statistical relationships between large-scale atmospheric variables (predictors), available from global climate models, and observed local-scale variables of interest (predictands).
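The predictor-predictand relationship described above can be illustrated with a minimal sketch of linear statistical downscaling. All names and data here are hypothetical: a local predictand (e.g., station temperature) is regressed on standardized large-scale predictors during a historical calibration period, and the fitted transfer function is then applied to large-scale output from a climate model.

```python
import numpy as np

# Hypothetical calibration data: 200 time steps of overlap between
# large-scale predictors and a local observed predictand.
rng = np.random.default_rng(0)
n_time = 200
predictors = rng.normal(size=(n_time, 3))        # standardized large-scale predictors
true_coeffs = np.array([1.5, -0.7, 0.3])         # "true" link, for this synthetic example
local_obs = predictors @ true_coeffs + rng.normal(scale=0.2, size=n_time)

# Calibrate the linear transfer function on the historical period.
X = np.column_stack([np.ones(n_time), predictors])   # prepend an intercept column
coeffs, *_ = np.linalg.lstsq(X, local_obs, rcond=None)

# Apply it to "future" large-scale fields from a climate model scenario.
future_predictors = rng.normal(size=(50, 3))
X_future = np.column_stack([np.ones(50), future_predictors])
local_projection = X_future @ coeffs             # downscaled local-scale projection
```

In practice the predictors would be gridded model fields (pressure, geopotential, humidity) reduced, for example, by principal component analysis, and the statistical link would be validated on independent data before being applied to scenario output.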
Various types of empirical-statistical downscaling approaches can be grouped approximately into linear and nonlinear categories. The treatment of empirical-statistical downscaling techniques focuses more on details related to the nonlinear models—their validation, strengths, and weaknesses—than on linear models or mixed models combining the linear and nonlinear approaches. Stochastic models can be applied to daily and sub-daily precipitation in Romania, with a comparison to dynamical downscaling. Conditional stochastic models are generally specific to daily or sub-daily precipitation as the predictand.
A complex validation of the nonlinear statistical downscaling models, selection of the large-scale predictors, model ability to reproduce historical trends, extreme events, and the uncertainty related to future downscaled changes are important issues. A better estimation of the uncertainty related to downscaled climate change projections can be achieved by using ensembles of more global climate models as drivers, including their ability to simulate the input in downscaling models. Comparison between future statistical downscaled climate signals and those derived from dynamical downscaling driven by the same global model, including a complex validation of the regional climate models, gives a measure of the reliability of downscaled regional climate changes.
Jin-Song von Storch
The energetics considerations based on Lorenz’s available potential energy focus on identification and quantification of processes capable of converting external energy sources into the kinetic energy of atmospheric and oceanic general circulations. Generally, these considerations consist of: (a) identifying the relevant energy compartments from which energy can be converted against friction to kinetic energy of motions of interest; (b) formulating for these energy compartments budget equations that describe all possible energy pathways; and (c) identifying the dominant energy pathways using realistic data. In order to obtain a more detailed description of energy pathways, a partitioning of motions, for example, into a “mean” and an “eddy” component, or into a diabatic and an adiabatic component, is used. Since the budget equations do not always suggest the relative importance of all possible pathways, often not even the directions, data that describe the atmospheric and the oceanic state in a sufficiently accurate manner are needed for evaluating the energy pathways. Apart from the complication due to different expressions of available potential energy, ranging from the original definition by Lorenz in 1955 to its approximations and to more generally defined forms, one has to balance the complexity of the respective budget equations, which allows the evaluation of more possible energy pathways, with the quality of data available, which allows sufficiently accurate estimates of energy pathways. With regard to the atmosphere, our knowledge, as inferred from the four-box Lorenz energy cycle, has consolidated over the last two decades through, among other means, data assimilation products obtained by combining observations with realistic atmospheric general circulation models (AGCMs). The eddy kinetic energy, amounting to slightly less than 50% of the total kinetic energy, is supported against friction through a baroclinic pathway “fueled” by the latitudinally dependent diabatic heating.
The mean kinetic energy is supported against friction by converting eddy kinetic energy via inverse cascades. For the ocean, our knowledge is still emerging. The description through the four-box Lorenz energy cycle is approximate and has only been estimated from a simulation of an oceanic general circulation model (OGCM) realistically forced at the sea surface, rather than from a data assimilation product. The estimates obtained so far suggest that the oceanic eddy kinetic energy, amounting to almost 75% of the total oceanic kinetic energy, is supported against friction through a baroclinic pathway similar to that in the atmosphere. However, the oceanic baroclinic pathway is “fueled” to a considerable extent by converting mean kinetic energy supported by winds into mean available potential energy. Winds are also the direct source of the kinetic energy of the mean circulation, without involving noticeable inverse cascades from transients, at least not for the ocean as a whole. The energetics of oceanic general circulation can also be examined by separating diabatic from adiabatic processes. Such a consideration is thought to be more appropriate for understanding the energetics of the oceanic meridional overturning circulation (MOC), since this circulation is sensitive to density changes induced by diabatic mixing. Further work is needed to quantify the respective energy pathways using realistic data.
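The four-box Lorenz energy cycle referred to above can be sketched in one common schematic form (not necessarily the exact convention used in the article): P denotes available potential energy, K kinetic energy, subscripts M and E the mean and eddy components, and G, C, and D generation, conversion, and dissipation.

```latex
\begin{aligned}
\frac{dP_M}{dt} &= G_M - C(P_M, P_E) - C(P_M, K_M),\\
\frac{dP_E}{dt} &= G_E + C(P_M, P_E) - C(P_E, K_E),\\
\frac{dK_E}{dt} &= C(P_E, K_E) - C(K_E, K_M) - D_E,\\
\frac{dK_M}{dt} &= C(P_M, K_M) + C(K_E, K_M) - D_M.
\end{aligned}
```

In this notation, the atmospheric baroclinic pathway described above corresponds to the route $G_M \to C(P_M, P_E) \to C(P_E, K_E)$, and the inverse-cascade support of the mean kinetic energy corresponds to $C(K_E, K_M) > 0$.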
John A. Alic
Stabilizing atmospheric greenhouse gases will require very large reductions in energy-related carbon dioxide emissions. This can be achieved only through continuous innovation, aggressive and ongoing. Fast-paced innovation, in turn, depends on rapid and widespread diffusion, adoption, adaptation—in short, on technological learning. These processes are integrally linked, as virtuous circles, through feedback loops embedded in economic markets. The overall dynamics are fundamentally incremental.
Pundits and policymakers, nonetheless, sometimes seem to hope that “breakthroughs” will emerge to sweep existing energy technologies aside. Such hopes are misplaced, for two reasons. First, if breakthroughs are construed as something “new under the sun,” they are rare and unpredictable, and policymakers have few tools to foster them. Energy technologies, after all, have been intensively explored over the past two centuries: the physical constraints are well understood and there are few reasons to expect research to lead to anything fundamentally new. Second, infant technologies tend to perform poorly, and to be quite costly. Improvements come over time through technological learning. Inputs to this sort of learning range from field service experience to “just-in-time” research. Economic competition provides much of the driving force.
The dynamics just sketched are broadly representative of the evolutionary paths traced by past energy technologies—wind and steam power, gas turbines, nuclear power, and solar photovoltaic (PV) cells and systems. Similar paths will be followed if prospective innovations such as carbon capture and storage, small nuclear reactors, or schemes for tapping the energy of the world’s oceans begin to mature and diffuse. Over the next several decades, the world should expect to work with existing technologies in various stages of maturation that can and will—because this is inherent in the process of innovation—advance on technical measures of performance (e.g., energy conversion efficiency) and come down in costs (in most cases) through continuous improvement.
This sort of innovation is first and foremost the work of profit-seeking businesses, enterprises that conceive, develop, introduce, and market new technologies. These firms exploit publicly funded R&D; just as important historically, government procurements have created initial markets, including those for the first PV cells and for the gas turbines that many utilities now buy for electric power generation, the early versions of which were based on designs for military aircraft. A major task for energy-climate policy is to create similarly viable market segments in which new and emerging technologies can gain a foothold, as a number of governments have done for battery-electric vehicles. Direct and indirect subsidies—financial preferences, as provided in some countries for battery-electric vehicles, and market set-asides, as for biofuels in Europe, Brazil, and the United States—insulate firms from potential competition, creating opportunities to push forward technologically and overcome the early handicaps, such as high costs and poor performance, associated with emerging technologies. The implication: effective innovation policies must provide powerful incentives for profit-seeking businesses. This is true worldwide, although mechanisms will differ from country to country.
Images are a key part of the climate change communication process. The diverse and interdisciplinary literature on how people engage with visual representations of climate change is reviewed. Images hold particular power for engaging people, as they hold three qualities that differ from other communication devices (such as words or text): they are analogical, they lack an explicit propositional syntax, and they are indexical. These qualities are explored in relation to climate change imagery. A number of visual tropes common to climate change communication (identifiable people; climate change impacts; energy, emissions and pollution; protest; scientific imagery) are examined and the evidence for each of these visual tropes in terms of how they engage particular audiences is reviewed. Two case studies, of polar bear imagery and the “hockey stick” graph image, critically examine iconic imagery associated with climate change, and how and why these types of images may (dis)engage audiences. Six best-practice guidelines for visual climate change communication are presented and three areas for further research in this nascent field are suggested.
Anthony Dudo, Jacob Copple, and Lucy Atkinson
Although there is an abundance of social scientific research focused on public opinion and climate change, there remains much to learn about how individuals come to understand, feel, and behave relative to this issue. Efforts to understand these processes are commonly directed toward media depictions, because media represent a primary conduit through which people encounter information about climate change. The majority of research in this area has focused on news media portrayals of climate change. News media depictions, however, represent only a part of the media landscape, and a relatively small but growing body of work has focused on examining portrayals of climate change in entertainment media (i.e., films, television programs, etc.) and their implications. This article provides a comprehensive overview of this area of research, summarizing what is currently known about portrayals of climate change in entertainment media, the individual-level effects of these portrayals, and areas ripe for future research. Our overview suggests that the extant work has centered primarily on a small subset of high-profile climate change films. Examination of the content of these films has been mostly rhetorical and has often presumed negative audience effects. Studies that specifically set out to explore possible effects have often unearthed evidence suggesting short-term contributions to viewers’ perceptions of climate change, specifically in terms of heightened awareness, concern, and motivation. Improving the breadth and depth of research in this area, we contend, can stem from more robust theorizing, analyses that focus on a more diverse menu of entertainment media and the interactions among them, and increasingly complex analytical efforts to capture long-term effects.
Over the last decade, scholars have devoted significant attention to making climate change communication more effective but less attention to ensuring that it is ethical. This neglect risks blurring the distinction between persuasion and manipulation, generating distrust among audiences, and obscuring the conceptual resources needed to guide communicators.
Three prevailing approaches to moral philosophy can illuminate various ethical considerations involved in communicating climate change. Consequentialism, which evaluates actions as morally right or wrong according to their consequences, is the implicit moral framework shared by many social scientists and policymakers interested in climate change. While consequentialism rightly emphasizes the consequences of communication, its exclusive focus on the effectiveness of communication tends to obscure other moral considerations, such as what communicators owe to audiences as a matter of duty or respect. Deontology better captures these duties and provides grounds for communicating in ways that respect the rights of citizens to deliberate and decide how to act. But because deontology tends to cast ethics as an abstract set of universalizable principles, it often downplays the virtues of character needed to motivate action and apply principles across a variety of contexts. Virtue ethics seeks to overcome the limits of both consequentialism and deontology by focusing on the virtues that individuals and communities need to flourish. While virtue ethics is often criticized for failing to provide a concrete blueprint for action, its conception of moral development and thick vocabulary of virtues and vices offer a robust set of practical and conceptual resources for guiding the actions, attitudes, and relationships that characterize climate change communication. Ultimately, all three approaches highlight moral considerations that should inform the ethics of communicating climate change.
Margaret M. Skutsch
The clean development mechanism of the Kyoto Protocol did not cover projects to reduce emissions from deforestation in developing countries. The reasons were in part technical (the difficulty of accounting for leakage) but mainly the result of fears of many Parties to the United Nations Framework Convention on Climate Change (UNFCCC) that this was a soft (and cheap) option that would discourage interventions for mitigation of emissions from fossil fuels. The alternative idea of a national, performance-based approach to reduced emissions from deforestation (RED) was first developed by research institutes in Brazil and proposed to the UNFCCC in a submission by Papua New Guinea and Costa Rica with technical support from the Environmental Defense Fund in 2005/2006. The idea was to reward countries financially for any decreases in annual rates of deforestation at a national level compared to a baseline that reflected historical rates of loss, through the sale of carbon credits, which as in the case of the Clean Development Mechanism (CDM) would be used as offsets by developed countries to meet their international obligations for emission reduction.
REDD+ as it is now included in the Paris Agreement of 2015 (Article 5) has evolved from this rather simple concept into something much more complex and far-reaching. Degradation was added early on in the negotiation process (REDD) and very soon conservation, sustainable management of forests, and enhancement of forest carbon stocks were also included, hence the “+” in REDD+. The idea of “safeguards” (social, environmental) is now also firmly embedded, and the importance of non-carbon benefits is being underlined in official policy. In the absence of legally binding emission reduction targets in developed countries, the notion of a market approach and offsets is no longer the only or even the main route envisaged. Instead, countries are being encouraged to coordinate financial support from a range of public, private, bilateral, and multilateral sources. The mechanism is still, however, seen as a results-based instrument, although this may not be so clear in alternative policy approaches, such as “joint mitigation and adaptation,” also included in the Paris Agreement.
Outside of the official policy negotiations, there has been a move away from operationalizing REDD+ as a purely forest-based mechanism toward developing a more holistic, landscape-based approach, given that many of the drivers of deforestation and degradation lie outside the forest itself. Countries in the vanguard of REDD+ implementation, such as Mexico, as well as several CGIAR organizations are visualizing REDD+ essentially as sustainable rural development. The central role of communities in the implementation of REDD+, and the importance of secure land tenure in this, have to a large extent been incorporated through the adoption of safeguards, but there remain a few lobbies of indigenous groups that are opposed to the whole nature of REDD+. The challenge of measurability, of both carbon and of non-carbon benefits, is addressed in this article.
Sharon E. Nicholson
Classic paradigms describing meteorological phenomena and climate have changed dramatically over the last half-century. This is particularly true for the continent of Africa. Our understanding of its climate is today very different from that which prevailed as recently as the 1960s or 1970s. This article traces the development of relevant paradigms in five broad areas: climate and climate classification, tropical atmospheric circulation, tropical rain-bearing systems, climatic variability and change, and land surface processes and climate. One example is the definition of climate. Originally viewed as simple statistical averages, it is now recognized as an environmental variable with global linkages, multiple timescales of variability, and strong controls via earth surface processes. As a result of numerous field experiments, our understanding of tropical rainfall has morphed from the belief in the domination by local thunderstorms to recognition of vast systems on regional to global scales. Our understanding of the interrelationships with land surface processes has also changed markedly. The simple Charney hypothesis concerning albedo change and the related concept of desertification have given way to a broader view of land–atmosphere interaction. In summary, there has been a major evolution in the way we understand climate, climatic variability, tropical rainfall regimes and rain-bearing systems, and potential human impacts on African climate. Each of these areas has evolved in complexity and understanding, a result of an explosive growth in research and the availability of such investigative tools as satellites, computers, and numerical models.
Joseph P. Reser and Graham L. Bradley
There is a strong view among climate change researchers and communicators that the persuasive tactic of arousing fear in order to promote precautionary motivation and behavior is neither effective nor appropriate in the context of climate change communication and engagement. Yet the modest research evidence that exists with respect to the use of fear appeals in communicating climate change does not offer adequate empirical evidence—either for or against the efficacy of fear appeals in this context—nor would such evidence adequately address the issue of the appropriateness of fear appeals in climate change communication. Extensive research literatures addressing preparedness, prevention, and behavior change in the areas of public health, marketing, and risk communication nonetheless provide consistent empirical support for the qualified effectiveness of fear appeals in persuasive social influence communications and campaigns. It is also noteworthy that the language of climate change communication is typically that of “communication and engagement,” with little explicit reference to targeted social influence or behavior change, although this is clearly implied. Hence the underlying and intertwined issues here are those of cogent arguments versus largely absent evidence, and effectiveness as distinct from appropriateness. These matters are enmeshed within the broader contours of climate change’s contested status as a political, social, and environmental issue, and they jostle for attention in a 24/7 media landscape of disturbing and frightening communications concerning the reality, nature, progression, and implications of global climate change.
All of this is clearly a challenge for evaluation research attempting to examine the nature and effectiveness of fear appeals in the context of climate change communication, and for determining the appropriateness of designed fear appeals in climate change communications intended to both engage and influence individuals, communities, and “publics” with respect to the ongoing threat and risks of climate change. There is an urgent need to clearly and effectively communicate the full nature and implications of climate change, in the face of this profound risk and rapidly unfolding reality. All such communications are, inherently, frightening warning messages, quite apart from any intentional fear appeals. How then should we put these arguments, evidence, and challenges “on the table” in our considerations and recommendations for enhancing climate change communication—and addressing the daunting and existential implications of climate change?
Forecasting severe convective weather remains one of the most challenging tasks facing operational meteorology today, especially in the mid-latitudes, where severe convective storms occur most frequently and with the greatest impact. The forecast difficulties reflect, in part, the many different atmospheric processes of which severe thunderstorms are a by-product. These processes occur over a wide range of spatial and temporal scales, some of which are poorly understood and/or are inadequately sampled by observational networks. Therefore, anticipating the development and evolution of severe thunderstorms will likely remain an integral part of national and local forecasting efforts well into the future.
Modern severe weather forecasting began in the 1940s, primarily employing the pattern recognition approach throughout the 1950s and 1960s. Substantial changes in forecast approaches did not come until much later, however, beginning in the 1980s. By the start of the new millennium, significant advances in the understanding of the physical mechanisms responsible for severe weather enabled forecasts of greater spatial and temporal detail. At the same time, technological advances made available model thermodynamic and wind profiles that supported probabilistic forecasts of severe weather threats.
This article provides an updated overview of operational severe local storm forecasting, with emphasis on present-day understanding of the mesoscale processes responsible for severe convective storms, and the application of recent technological developments that have revolutionized some aspects of severe weather forecasting. The presentation, nevertheless, notes that increased understanding and enhanced computer sophistication are not a substitute for careful diagnosis of the current meteorological environment and an ingredients-based approach to anticipating changes in that environment; these techniques remain foundational to successful forecasts of tornadoes, large hail, damaging wind, and flash flooding.
R. J. Trapp
Cumulus clouds are pervasive on earth, and play important roles in the transfer of energy through the atmosphere. Under certain conditions, shallow, nonprecipitating cumuli may grow vertically to occupy a significant depth of the troposphere, and subsequently may evolve into convective storms.
The qualifier “convective” implies that the storms have vertical accelerations that are driven primarily, though not exclusively, by buoyancy over a deep layer. Such buoyancy in the atmosphere arises from local density variations relative to some base state density; the base state is typically idealized as a horizontal average over a large area, which is also considered the environment. Quantifications of atmospheric buoyancy are typically expressed in terms of temperature and humidity, and allow for an assessment of the likelihood that convective clouds will form or initiate. Convection initiation is intimately linked to the existence of a mechanism by which air is vertically lifted to realize this buoyancy and thus these accelerations. Weather fronts and orography are the canonical lifting mechanisms.
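As a standard textbook illustration (not drawn from this article), parcel buoyancy and its vertical integral, the convective available potential energy (CAPE), are often written as:

```latex
B = g\,\frac{T_{v,\mathrm{parcel}} - T_{v,\mathrm{env}}}{T_{v,\mathrm{env}}},
\qquad
\mathrm{CAPE} = \int_{z_{\mathrm{LFC}}}^{z_{\mathrm{EL}}} B \, dz,
```

where $T_v$ is virtual temperature (which folds the humidity contribution into the density effect), and $z_{\mathrm{LFC}}$ and $z_{\mathrm{EL}}$ are the level of free convection and the equilibrium level. Large CAPE indicates that lifted parcels, once initiated, can realize substantial buoyant acceleration.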
As modulated by an ambient or environmental distribution of temperature, humidity, and wind, weather fronts also facilitate the transition of convective clouds into storms with locally heavy rain, lightning, and other possible hazards. For example, in an environment characterized by winds that are weak and change little with distance above the ground, the storms tend to be short lived and benign. The structure of the vertical drafts and other internal storm processes under weak wind shear—i.e., a small change in the horizontal wind over some vertical distance—are distinct relative to those when the environmental wind shear is strong. In particular, strong wind shear in combination with large buoyancy favors the development of squall lines and supercells, both of which are highly coherent storm types. Besides having durations that may exceed a few hours, both of these storm types tend to be particularly hazardous: squall lines are most apt to generate swaths of damaging “straight-line” winds, and supercells spawn the most intense tornadoes and are responsible for the largest hail. Methods used to predict convective-storm hazards capitalize on this knowledge of storm formation and development.
Mike S. Schäfer and Saffron O'Neill
Framing—selecting certain aspects of a given issue and making them more salient in communication in order to “frame” the issue in a specific way—is a key concept in the study of communication. At the same time, it has been used very differently in scholarship, leading some to declare it a “fractured paradigm,” or an idea whose usefulness has expired. In studies of climate change communication, frame analyses have been used numerous times and in various ways, from formal framing approaches (e.g., episodic vs. thematic framing) to topical frames (both generic and issue-specific). Using methodological approaches of frame analysis from content analysis over discourse analysis and qualitative studies to experimental research, this research has brought valuable insights into media portrayals of climate change in different countries and their effects on audiences—even though it still has limitations that should be remedied in future research.
Although future generations—starting with today’s youth—will bear the brunt of negative effects related to climate change, some research suggests that they have little concern about climate change and little intention to take action to mitigate its impacts. One common explanation for this indifference and inaction is lack of scientific knowledge. It is often said that youth do not understand the science; therefore, they are not concerned. Indeed, in science education research, numerous studies catalogue the many misunderstandings students have about climate science. However, this knowledge-deficit perspective is not particularly informative in charting a path forward for climate-change education. This path is important because climate science will be taught in more depth as states adopt the Next Generation Science Standards within the next few years. How do we go about creating the educational experiences that students need to be able to achieve climate-science literacy and feel as if they could take action? First, the literature base in communication, specifically about framing, must be considered to identify potentially more effective ways to craft personally relevant and empowering messages for students within their classrooms.
The warming of the global climate is expected to continue in the 21st century, although the magnitude of change depends on future anthropogenic greenhouse gas emissions and the sensitivity of climate to them. The regional characteristics and impacts of future climate change in the Baltic Sea countries have been explored since at least the 1990s. Later research has supported many findings from the early studies, but advances in understanding and improved modeling tools have made the picture gradually more comprehensive and more detailed. Nevertheless, many uncertainties still remain.
In the Baltic Sea region, warming is likely to exceed its global average, particularly in winter and in the northern parts of the area. The warming will be accompanied by a general increase in winter precipitation, but in summer, precipitation may either increase or decrease, with a larger chance of drying in the southern than in the northern parts of the region. Despite the increase in winter precipitation, the amount of snow is generally expected to decrease, as a smaller fraction of the precipitation falls as snow and midwinter snowmelt episodes become more common. Changes in windiness are very uncertain, although most projections suggest a slight increase in average wind speed over the Baltic Sea. Climatic extremes are also projected to change, but some of the changes will differ from the corresponding change in mean climate. For example, the lowest winter temperatures are expected to warm even more than the winter mean temperature, and short-term summer precipitation extremes are likely to become more severe, even in the areas where the mean summer precipitation does not increase.
The projected atmospheric changes will be accompanied by an increase in Baltic Sea water temperature, reduced ice cover, and, according to most studies, reduced salinity due to increased precipitation and river runoff. The seasonal cycle of runoff will be modified by changes in precipitation and earlier snowmelt. Global-scale sea level rise also will affect the Baltic Sea, but will be counteracted by glacial isostatic adjustment. According to most projections, in the northern parts of the Baltic Sea, the latter will still dominate, leading to a continued, although decelerated, decrease in relative sea level. The changes in the physical environment and climate will have a number of environmental impacts on, for example, atmospheric chemistry, freshwater and marine biogeochemistry, ecosystems, and coastal erosion. However, future environmental change in the region will be affected by several interrelated factors. Climate change is only one of them, and in many cases its effects may be exceeded by other anthropogenic changes.
Debbie Hopkins and Ezra M. Markowitz
Despite scientific consensus on the anthropogenic causation of climate change, and ever-growing knowledge of the biophysical impacts of climate change, there is large variability in public perceptions of and belief in climate change. Public support for national and international climate policy has a strong positive association with certainty that climate change is occurring, human caused, serious, and solvable. Thus, to achieve greater acceptance of national climate policy and international agreements, it is important to raise public belief in climate change and understanding of personal climate risk.
Public understandings of climate change and associated risk perceptions have received significant academic attention. This research has been conducted across a range of spatial scales, with particular attention to large-scale, nationally representative surveys used to gain insights into country-scale perceptions of climate change. The generalizability of nationally representative surveys allows some degree of national comparison; however, the ability to conduct such comparisons has been limited by the availability of comparative data sets. Consequently, empirical insights have been geographically biased toward Europe and North America, with less understanding of public perceptions of climate change in other geographical settings, including the Global South. Moreover, a focus on quantitative surveying techniques can overlook the more nuanced, culturally determined factors that contribute to the construction of climate change perceptions.
The physical and human geographies of climate change are diverse. This is due to the complex spatial dimensions of climate change and includes both the observed and anticipated geographical differentiation in risks, impacts, and vulnerabilities. While country location and national climate can impact upon how climate change is understood, so too will sociocultural factors such as national identity and culture(s). Studies have reported high variability in climate change perceptions, the result of a complex interplay between personal experiences of climate, social norms, and worldviews. Exploring the development of national-scale analyses and their findings over time, and the comparability of national data sets, may provide some insights into the factors that influence public perceptions of climate change and identify national-scale interventions and communications to raise risk perception and understanding of climate change.
Humans are altering the hydrosphere, cryosphere, lithosphere, biosphere, and atmosphere in unprecedented ways. Since the late 1980s, a range of geoscience disciplines (such as climatology and ecology) have shown humans to be a “planetary force.” The scale, scope, and magnitude of people’s combined activities threaten to take the planet’s environmental systems out of their Holocene state. This not only raises new research questions for the academic community (such as “What is the best way for a low-income, low-lying country to adapt to sea-level rise?”). It also invites the community to rethink its role in relation to the societies that fund its research and will experience profound impacts of global environmental change. In turn, this rethink raises the question of what kind of research will best suit a change of role. In recent years some global change researchers have called for a “new social contract.” These calls challenge the “old” social contract wherein academic independence was assured by governments so long as universities produced a succession of benefits to society on the basis of both fundamental (non-applied) research and “use-inspired” inquiry and invention. The new social contract directs global change researchers to produce much more of the latter, namely “decision-relevant” knowledge (for governments and other stakeholders). This means that global change research (GCR) will become less geoscience dominated and include more social science and even humanities content: after all, it is human activities that are both the cause of, and solution to, our planetary maladies. A more applied and people-focused GCR community promises to deliver many benefits in the years ahead. However, there are some problems with the way a new social contract is currently being conceived. Unless these problems are addressed, the GCR community will arguably serve societies worldwide far less well than it could and should do. 
This review describes the old and new social contract ideas in relation to present and future GCR. It does so both descriptively and in a critically constructive way, presenting arguments for a truly new social contract for GCR.
Jonathon P. Schuldt
Communicating about climate change involves more than choices about which content to convey and how to convey it. It also involves a choice about how to label the issue itself, given the various terms used to represent the issue in public discourse—including “global warming,” “climate change,” and “global environmental change,” among others. An emerging literature in climate change communication and survey methodology has begun to examine the influence of labeling on public perceptions, including the cognitive accessibility of climate-related knowledge, affective responses and related judgments (problem seriousness and personal concern), and certainty that the phenomenon exists. The present article reviews this emerging work, drawing on framing theory and related social-cognitive models of information processing to shed light on the possible mechanisms that underlie labeling effects. In doing so, the article highlights the value of distinguishing between labeling and framing effects in communication research and theory, and calls for additional research into the boundary conditions of these and other labeling effects in science communication.
Catrien Termeer, Arwin van Buuren, Art Dewulf, Dave Huitema, Heleen Mees, Sander Meijerink, and Marleen van Rijswick
Adaptation to climate change is not only a technical issue; above all, it is a matter of governance. Governance is more than government and includes the totality of interactions in which public as well as private actors participate, aiming to solve societal problems. Adaptation governance poses some specific, demanding challenges, such as the context of institutional fragmentation, as climate change involves almost all policy domains and governance levels; the persistent uncertainties about the nature and scale of risks and proposed solutions; and the need to make short-term policies based on long-term projections. Furthermore, adaptation is an emerging policy field with, at least for the time being, only weakly defined ambitions, responsibilities, procedures, routines, and solutions. Many scholars have already shown that complex problems, such as adaptation to climate change, cannot be solved in a straightforward way with actions taken by a hierarchic or monocentric form of governance. This raises the question of how to develop governance arrangements that contribute to realizing adaptation options and increasing the adaptive capacity of society. A series of seven basic elements have to be addressed in designing climate adaptation governance arrangements: the framing of the problem, the level(s) at which to act, the alignment across sectoral boundaries, the timing of the policies, the selection of policy instruments, the organization of the science-policy interface, and the most appropriate form of leadership. For each of these elements, this chapter suggests some tentative design principles. In addition to effectiveness and legitimacy, resilience is an important criterion for evaluating these arrangements. The development of governance arrangements is always context- and time-specific, and constrained by the formal and informal rules of existing institutions.
For several decades, the Sahelian countries have been facing continuing rainfall shortages, which, coupled with anthropogenic factors, have severely disrupted the great ecological balance, leading the area into an inexorable process of desertification and land degradation. The Sahel faces a persistent problem of climate change, with high rainfall variability and frequent droughts, and this is one of the major drivers of the population’s vulnerability in the region. Communities struggle against severe land degradation processes and experience an unprecedented loss of productivity that hampers their livelihoods and puts them among the populations in the world that are most vulnerable to climate change. In response to severe land degradation, 11 countries of the Sahel agreed to work together to address the policy, investment, and institutional barriers to establishing a land-restoration program that addresses climate change and land degradation. The program is called the Pan-Africa Initiative for the Great Green Wall (GGW). The initiative aims to help halt desertification and land degradation in the Sahelian zone, improve the lives and livelihoods of smallholder farmers and pastoralists in the area, and help its populations develop effective adaptation strategies and responses through the use of tree-based development programs. To make the GGW initiative successful, member countries have established a coordinated and integrated effort from the government level to local scales and have engaged with many stakeholders. Planning, decision-making, and actions on the ground are guided by participation and engagement, informed by policy-relevant knowledge to identify a set of scalable land-restoration practices and to address drivers of land use change in various human-environmental contexts. In many countries, activities specific to achieving the GGW objectives have been initiated in the last five years.