John A. Alic
Stabilizing atmospheric greenhouse gases will require very large reductions in energy-related carbon dioxide emissions. This can be achieved only through continuous innovation, aggressive and ongoing. Fast-paced innovation, in turn, depends on rapid and widespread diffusion, adoption, adaptation—in short, on technological learning. These processes are integrally linked, as virtuous circles, through feedback loops embedded in economic markets. The overall dynamics are fundamentally incremental.
Pundits and policymakers, nonetheless, sometimes seem to hope that “breakthroughs” will emerge to sweep existing energy technologies aside. Such hopes are misplaced, for two reasons. First, if breakthroughs are construed as something “new under the sun,” they are rare and unpredictable, and policymakers have few tools to foster them. Energy technologies, after all, have been intensively explored over the past two centuries: the physical constraints are well understood and there are few reasons to expect research to lead to anything fundamentally new. Second, infant technologies tend to perform poorly and to be quite costly. Improvements come over time through technological learning. Inputs to this sort of learning range from field service experience to “just-in-time” research. Economic competition provides much of the driving force.
The dynamics just sketched are broadly representative of the evolutionary paths traced by past energy technologies—wind and steam power, gas turbines, nuclear power, and solar photovoltaic (PV) cells and systems. Similar paths will be followed if prospective innovations such as carbon capture and storage, small nuclear reactors, or schemes for tapping the energy of the world’s oceans begin to mature and diffuse. Over the next several decades, the world should expect to work with existing technologies in various stages of maturation that can and will—because this is inherent in the process of innovation—advance on technical measures of performance (e.g., energy conversion efficiency) and come down in costs (in most cases) through continuous improvement.
This sort of innovation is first and foremost the work of profit-seeking businesses, enterprises that conceive, develop, introduce, and market new technologies. These firms exploit publicly funded R&D; just as important historically, government procurements have created initial markets, including those for the first PV cells and for the gas turbines that many utilities now buy for electric power generation, the early versions of which were based on designs for military aircraft. A major task for energy-climate policy is to create similarly viable market segments in which new and emerging technologies can gain a foothold, as a number of governments have done for battery-electric vehicles. Direct and indirect subsidies—financial preferences, as provided in some countries for battery-electric vehicles, and market set-asides, as for biofuels in Europe, Brazil, and the United States—insulate firms from potential competition, creating opportunities to push forward technologically and overcome the early handicaps, such as high costs and poor performance, associated with emerging technologies. The implication: Effective innovation policies must provide powerful incentives for profit-seeking businesses. This is true worldwide, although mechanisms will differ from country to country.
Images are a key part of the climate change communication process. The diverse and interdisciplinary literature on how people engage with visual representations of climate change is reviewed. Images hold particular power for engaging people, as they possess three qualities that differ from other communication devices (such as words or text): they are analogical, they lack an explicit propositional syntax, and they are indexical. These qualities are explored in relation to climate change imagery. A number of visual tropes common to climate change communication (identifiable people; climate change impacts; energy, emissions, and pollution; protest; scientific imagery) are examined, and the evidence on how each of these tropes engages particular audiences is reviewed. Two case studies, of polar bear imagery and the “hockey stick” graph, critically examine iconic imagery associated with climate change, and how and why these types of images may (dis)engage audiences. Six best-practice guidelines for visual climate change communication are presented, and three areas for further research in this nascent field are suggested.
Anthony Dudo, Jacob Copple, and Lucy Atkinson
Although there is an abundance of social scientific research focused on public opinion and climate change, there remains much to learn about how individuals come to understand, feel, and behave relative to this issue. Efforts to understand these processes are commonly directed toward media depictions, because media represent a primary conduit through which people encounter information about climate change. The majority of research in this area has focused on news media portrayals of climate change. News media depictions, however, represent only a part of the media landscape, and a relatively small but growing body of work has focused on examining portrayals of climate change in entertainment media (i.e., films, television programs, etc.) and their implications. This article provides a comprehensive overview of this area of research, summarizing what is currently known about portrayals of climate change in entertainment media, the individual-level effects of these portrayals, and areas ripe for future research. Our overview suggests that the extant work has centered primarily on a small subset of high-profile climate change films. Examination of the content of these films has been mostly rhetorical and has often presumed negative audience effects. Studies that specifically set out to explore possible effects have often unearthed evidence suggesting short-term contributions to viewers’ perceptions of climate change, specifically in terms of heightened awareness, concern, and motivation. Improving the breadth and depth of research in this area, we contend, can stem from more robust theorizing, analyses that focus on a more diverse menu of entertainment media and the interactions among them, and increasingly complex analytical efforts to capture long-term effects.
Over the last decade, scholars have devoted significant attention to making climate change communication more effective but less attention to ensuring that it is ethical. This neglect risks blurring the distinction between persuasion and manipulation, generating distrust among audiences, and obscuring the conceptual resources needed to guide communicators.
Three prevailing approaches to moral philosophy can illuminate various ethical considerations involved in communicating climate change. Consequentialism, which evaluates actions as morally right or wrong according to their consequences, is the implicit moral framework shared by many social scientists and policymakers interested in climate change. While consequentialism rightly emphasizes the consequences of communication, its exclusive focus on the effectiveness of communication tends to obscure other moral considerations, such as what communicators owe to audiences as a matter of duty or respect. Deontology better captures these duties and provides grounds for communicating in ways that respect the rights of citizens to deliberate and decide how to act. But because deontology tends to cast ethics as an abstract set of universalizable principles, it often downplays the virtues of character needed to motivate action and apply principles across a variety of contexts. Virtue ethics seeks to overcome the limits of both consequentialism and deontology by focusing on the virtues that individuals and communities need to flourish. While virtue ethics is often criticized for failing to provide a concrete blueprint for action, its conception of moral development and thick vocabulary of virtues and vices offer a robust set of practical and conceptual resources for guiding the actions, attitudes, and relationships that characterize climate change communication. Ultimately, all three approaches highlight moral considerations that should inform the ethics of communicating climate change.
Margaret M. Skutsch
The clean development mechanism of the Kyoto Protocol did not cover projects to reduce emissions from deforestation in developing countries. The reasons were in part technical (the difficulty of accounting for leakage) but mainly the result of fears of many Parties to the United Nations Framework Convention on Climate Change (UNFCCC) that this was a soft (and cheap) option that would discourage interventions for mitigation of emissions from fossil fuels. The alternative idea of a national, performance-based approach to reduced emissions from deforestation (RED) was first developed by research institutes in Brazil and proposed to the UNFCCC in a submission by Papua New Guinea and Costa Rica with technical support from the Environmental Defense Fund in 2005/2006. The idea was to reward countries financially for any decreases in annual rates of deforestation at a national level compared to a baseline that reflected historical rates of loss, through the sale of carbon credits, which as in the case of the Clean Development Mechanism (CDM) would be used as offsets by developed countries to meet their international obligations for emission reduction.
REDD+ as it is now included in the Paris Agreement of 2015 (Article 5) has evolved from this rather simple concept into something much more complex and far-reaching. Degradation was added early on in the negotiation process (REDD) and very soon conservation, sustainable management of forests, and enhancement of forest carbon stocks were also included, hence the “+” in REDD+. The idea of “safeguards” (social, environmental) is now also firmly embedded, and the importance of non-carbon benefits is being underlined in official policy. In the absence of legally binding emission reduction targets in developed countries, the notion of a market approach and offsets is no longer the only or even the main route envisaged. Instead, countries are being encouraged to coordinate financial support from a range of public, private, bilateral, and multilateral sources. The mechanism is still, however, seen as a results-based instrument, although this may not be so clear in alternative policy approaches, such as “joint mitigation and adaptation,” also included in the Paris Agreement.
Outside of the official policy negotiations, there has been a move away from operationalizing REDD+ as a purely forest-based mechanism toward developing a more holistic, landscape-based approach, given that many of the drivers of deforestation and degradation lie outside the forest itself. Countries in the vanguard of REDD+ implementation, such as Mexico, as well as several CGIAR organizations are visualizing REDD+ essentially as sustainable rural development. The central role of communities in the implementation of REDD+, and the importance of secure land tenure in this, have to a large extent been incorporated through the adoption of safeguards, but there remain a few lobbies of indigenous groups that are opposed to the whole nature of REDD+. The challenge of measurability, of both carbon and of non-carbon benefits, is addressed in this article.
Sharon E. Nicholson
Classic paradigms describing meteorological phenomena and climate have changed dramatically over the last half-century. This is particularly true for the continent of Africa. Our understanding of its climate is today very different from that which prevailed as recently as the 1960s or 1970s. This article traces the development of relevant paradigms in five broad areas: climate and climate classification, tropical atmospheric circulation, tropical rain-bearing systems, climatic variability and change, and land surface processes and climate. One example is the definition of climate. Originally viewed as simple statistical averages, it is now recognized as an environmental variable with global linkages, multiple timescales of variability, and strong controls via earth surface processes. As a result of numerous field experiments, our understanding of tropical rainfall has morphed from the belief in the domination by local thunderstorms to recognition of vast systems on regional to global scales. Our understanding of the interrelationships with land surface processes has also changed markedly. The simple Charney hypothesis concerning albedo change and the related concept of desertification have given way to a broader view of land–atmosphere interaction. In summary, there has been a major evolution in the way we understand climate, climatic variability, tropical rainfall regimes and rain-bearing systems, and potential human impacts on African climate. Each of these areas has evolved in complexity and understanding, a result of an explosive growth in research and the availability of such investigative tools as satellites, computers, and numerical models.
Joseph P. Reser and Graham L. Bradley
There is a strong view among climate change researchers and communicators that the persuasive tactic of arousing fear in order to promote precautionary motivation and behavior is neither effective nor appropriate in the context of climate change communication and engagement. Yet the modest research evidence that exists with respect to the use of fear appeals in communicating climate change does not offer adequate empirical evidence—either for or against the efficacy of fear appeals in this context—nor would such evidence adequately address the issue of the appropriateness of fear appeals in climate change communication. Extensive research literatures addressing preparedness, prevention, and behavior change in the areas of public health, marketing, and risk communication nonetheless provide consistent empirical support for the qualified effectiveness of fear appeals in persuasive social influence communications and campaigns. It is also noteworthy that the language of climate change communication is typically that of “communication and engagement,” with little explicit reference to targeted social influence or behavior change, although this is clearly implied. Hence the underlying and intertwined issues here are those of cogent arguments versus largely absent evidence, and effectiveness as distinct from appropriateness. These matters are enmeshed within the broader contours of climate change’s contested status as a political, social, and environmental issue, which jostles for attention in a 24/7 media landscape of disturbing and frightening communications concerning the reality, nature, progression, and implications of global climate change.
All of this is clearly a challenge for evaluation research attempting to examine the nature and effectiveness of fear appeals in the context of climate change communication, and for determining the appropriateness of designed fear appeals in climate change communications intended to both engage and influence individuals, communities, and “publics” with respect to the ongoing threat and risks of climate change. There is an urgent need to clearly and effectively communicate the full nature and implications of climate change, in the face of this profound risk and rapidly unfolding reality. All such communications are, inherently, frightening warning messages, quite apart from any intentional fear appeals. How then should we put these arguments, evidence, and challenges “on the table” in our considerations and recommendations for enhancing climate change communication—and addressing the daunting and existential implications of climate change?
Forecasting severe convective weather remains one of the most challenging tasks facing operational meteorology today, especially in the mid-latitudes, where severe convective storms occur most frequently and with the greatest impact. The forecast difficulties reflect, in part, the many different atmospheric processes of which severe thunderstorms are a by-product. These processes occur over a wide range of spatial and temporal scales, some of which are poorly understood and/or are inadequately sampled by observational networks. Therefore, anticipating the development and evolution of severe thunderstorms will likely remain an integral part of national and local forecasting efforts well into the future.
Modern severe weather forecasting began in the 1940s, primarily employing the pattern recognition approach throughout the 1950s and 1960s. Substantial changes in forecast approaches did not come until much later, however, beginning in the 1980s. By the start of the new millennium, significant advances in the understanding of the physical mechanisms responsible for severe weather enabled forecasts of greater spatial and temporal detail. At the same time, technological advances made available model thermodynamic and wind profiles that supported probabilistic forecasts of severe weather threats.
This article provides an updated overview of operational severe local storm forecasting, with emphasis on present-day understanding of the mesoscale processes responsible for severe convective storms, and the application of recent technological developments that have revolutionized some aspects of severe weather forecasting. The presentation, nevertheless, notes that increased understanding and enhanced computer sophistication are not a substitute for careful diagnosis of the current meteorological environment and an ingredients-based approach to anticipating changes in that environment; these techniques remain foundational to successful forecasts of tornadoes, large hail, damaging wind, and flash flooding.
R. J. Trapp
Cumulus clouds are pervasive on earth, and play important roles in the transfer of energy through the atmosphere. Under certain conditions, shallow, nonprecipitating cumuli may grow vertically to occupy a significant depth of the troposphere, and subsequently may evolve into convective storms.
The qualifier “convective” implies that the storms have vertical accelerations that are driven primarily, though not exclusively, by buoyancy over a deep layer. Such buoyancy in the atmosphere arises from local density variations relative to some base-state density; the base state is typically idealized as a horizontal average over a large area, which is also considered the environment. Quantifications of atmospheric buoyancy are typically expressed in terms of temperature and humidity, and allow for an assessment of the likelihood that convective clouds will form or initiate. Convection initiation is intimately linked to the existence of a mechanism by which air is lifted vertically to realize this buoyancy and thus these accelerations. Weather fronts and orography are the canonical lifting mechanisms.
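The buoyancy quantification sketched above is commonly written (in a standard textbook form, not taken from this article) in terms of virtual temperature, with the parcel’s positive buoyancy integrated over depth to give convective available potential energy (CAPE):

```latex
% Buoyancy of an air parcel relative to the base-state (environmental) density:
B \;=\; g\,\frac{\bar{\rho} - \rho}{\bar{\rho}}
  \;\approx\; g\,\frac{T_v - \bar{T}_v}{\bar{T}_v},
% where T_v is the parcel's virtual temperature (temperature adjusted for
% humidity) and the overbar denotes the horizontally averaged environment.

% Integrating buoyancy from the level of free convection (LFC) to the
% equilibrium level (EL) yields CAPE, a measure of potential updraft energy:
\mathrm{CAPE} \;=\; \int_{z_{\mathrm{LFC}}}^{z_{\mathrm{EL}}}
  g\,\frac{T_v - \bar{T}_v}{\bar{T}_v}\, dz .
```

Virtual temperature folds the humidity dependence into a single temperature-like variable, which is why assessments of buoyancy and convective likelihood can be made from observed or modeled profiles of temperature and humidity alone.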
As modulated by an ambient or environmental distribution of temperature, humidity, and wind, weather fronts also facilitate the transition of convective clouds into storms with locally heavy rain, lightning, and other possible hazards. For example, in an environment characterized by winds that are weak and change little with distance above the ground, the storms tend to be short lived and benign. The structure of the vertical drafts and other internal storm processes under weak wind shear—i.e., a small change in the horizontal wind over some vertical distance—are distinct relative to those when the environmental wind shear is strong. In particular, strong wind shear in combination with large buoyancy favors the development of squall lines and supercells, both of which are highly coherent storm types. Besides having durations that may exceed a few hours, both of these storm types tend to be particularly hazardous: squall lines are most apt to generate swaths of damaging “straight-line” winds, and supercells spawn the most intense tornadoes and are responsible for the largest hail. Methods used to predict convective-storm hazards capitalize on this knowledge of storm formation and development.
Mike S. Schäfer and Saffron O'Neill
Framing—selecting certain aspects of a given issue and making them more salient in communication in order to “frame” the issue in a specific way—is a key concept in the study of communication. At the same time, it has been used very differently in scholarship, leading some to declare it a “fractured paradigm,” or an idea whose usefulness has expired. In studies of climate change communication, frame analyses have been used numerous times and in various ways, from formal framing approaches (e.g., episodic vs. thematic framing) to topical frames (both generic and issue-specific). Using methodological approaches of frame analysis from content analysis over discourse analysis and qualitative studies to experimental research, this research has brought valuable insights into media portrayals of climate change in different countries and their effects on audiences—even though it still has limitations that should be remedied in future research.
Although future generations—starting with today’s youth—will bear the brunt of the negative effects of climate change, some research suggests that they have little concern about climate change and little intention to take action to mitigate its impacts. One common explanation for this indifference and inaction is a lack of scientific knowledge. It is often said that youth do not understand the science; therefore, they are not concerned. Indeed, in science education research, numerous studies catalogue the many misunderstandings students have about climate science. However, this knowledge-deficit perspective is not particularly informative in charting a path forward for climate-change education. This path is important because climate science will be taught in more depth as states adopt the Next Generation Science Standards within the next few years. How do we go about creating the educational experiences that students need to achieve climate-science literacy and feel as if they could take action? First, the literature base in communication, specifically about framing, must be considered, to identify potentially more effective ways to craft personally relevant and empowering messages for students within their classrooms.
The warming of the global climate is expected to continue in the 21st century, although the magnitude of change depends on future anthropogenic greenhouse gas emissions and the sensitivity of climate to them. The regional characteristics and impacts of future climate change in the Baltic Sea countries have been explored since at least the 1990s. Later research has supported many findings from the early studies, but advances in understanding and improved modeling tools have made the picture gradually more comprehensive and more detailed. Nevertheless, many uncertainties still remain.
In the Baltic Sea region, warming is likely to exceed its global average, particularly in winter and in the northern parts of the area. The warming will be accompanied by a general increase in winter precipitation, but in summer, precipitation may either increase or decrease, with a larger chance of drying in the southern than in the northern parts of the region. Despite the increase in winter precipitation, the amount of snow is generally expected to decrease, as a smaller fraction of the precipitation falls as snow and midwinter snowmelt episodes become more common. Changes in windiness are very uncertain, although most projections suggest a slight increase in average wind speed over the Baltic Sea. Climatic extremes are also projected to change, but some of the changes will differ from the corresponding change in mean climate. For example, the lowest winter temperatures are expected to warm even more than the winter mean temperature, and short-term summer precipitation extremes are likely to become more severe, even in the areas where the mean summer precipitation does not increase.
The projected atmospheric changes will be accompanied by an increase in Baltic Sea water temperature, reduced ice cover, and, according to most studies, reduced salinity due to increased precipitation and river runoff. The seasonal cycle of runoff will be modified by changes in precipitation and earlier snowmelt. Global-scale sea level rise also will affect the Baltic Sea, but will be counteracted by glacial isostatic adjustment. According to most projections, in the northern parts of the Baltic Sea, the latter will still dominate, leading to a continued, although decelerated, decrease in relative sea level. The changes in the physical environment and climate will have a number of environmental impacts on, for example, atmospheric chemistry, freshwater and marine biogeochemistry, ecosystems, and coastal erosion. However, future environmental change in the region will be affected by several interrelated factors. Climate change is only one of them, and in many cases its effects may be exceeded by other anthropogenic changes.
Debbie Hopkins and Ezra M. Markowitz
Despite scientific consensus on the anthropogenic causation of climate change, and ever-growing knowledge on the biophysical impacts of climate change, there is large variability in public perceptions of and belief in climate change. Public support for national and international climate policy has a strong positive association with certainty that climate change is occurring, human caused, serious, and solvable. Thus to achieve greater acceptance of national climate policy and international agreements, it is important to raise public belief in climate change and understandings of personal climate risk.
Public understandings of climate change and associated risk perceptions have received significant academic attention. This research has been conducted across a range of spatial scales, with particular attention on large-scale, nationally representative surveys to gain insights into country-scale perceptions of climate change. Generalizability of nationally representative surveys allows some degree of national comparison; however, the ability to conduct such comparisons has been limited by the availability of comparative data sets. Consequently, empirical insights have been geographically biased toward Europe and North America, with less understanding of public perceptions of climate change in other geographical settings including the Global South. Moreover, a focus on quantitative surveying techniques can overlook the more nuanced, culturally determined factors that contribute to the construction of climate change perceptions.
The physical and human geographies of climate change are diverse. This is due to the complex spatial dimensions of climate change and includes both the observed and anticipated geographical differentiation in risks, impacts, and vulnerabilities. While country location and national climate can impact upon how climate change is understood, so too will sociocultural factors such as national identity and culture(s). Studies have reported high variability in climate change perceptions, the result of a complex interplay between personal experiences of climate, social norms, and worldviews. Exploring the development of national-scale analyses and their findings over time, and the comparability of national data sets, may provide some insights into the factors that influence public perceptions of climate change and identify national-scale interventions and communications to raise risk perception and understanding of climate change.
Humans are altering the hydrosphere, cryosphere, lithosphere, biosphere, and atmosphere in unprecedented ways. Since the late 1980s, a range of geoscience disciplines (such as climatology and ecology) have shown humans to be a “planetary force.” The scale, scope, and magnitude of people’s combined activities threaten to take the planet’s environmental systems out of their Holocene state. This not only raises new research questions for the academic community (such as “What is the best way for a low-income, low-lying country to adapt to sea-level rise?”). It also invites the community to rethink its role in relation to the societies that fund its research and will experience profound impacts of global environmental change. In turn, this rethink raises the question of what kind of research will best suit a change of role. In recent years some global change researchers have called for a “new social contract.” These calls challenge the “old” social contract wherein academic independence was assured by governments so long as universities produced a succession of benefits to society on the basis of both fundamental (non-applied) research and “use-inspired” inquiry and invention. The new social contract directs global change researchers to produce much more of the latter, namely “decision-relevant” knowledge (for governments and other stakeholders). This means that global change research (GCR) will become less geoscience dominated and include more social science and even humanities content: after all, it is human activities that are both the cause of, and solution to, our planetary maladies. A more applied and people-focused GCR community promises to deliver many benefits in the years ahead. However, there are some problems with the way a new social contract is currently being conceived. Unless these problems are addressed, the GCR community will arguably serve societies worldwide far less well than it could and should do. 
This review describes the old and new social contract ideas in relation to present and future GCR. It does so both descriptively and in a critically constructive way, presenting arguments for a truly new social contract for GCR.
Jonathon P. Schuldt
Communicating about climate change involves more than choices about which content to convey and how to convey it. It also involves a choice about how to label the issue itself, given the various terms used to represent the issue in public discourse—including “global warming,” “climate change,” and “global environmental change,” among others. An emerging literature in climate change communication and survey methodology has begun to examine the influence of labeling on public perceptions, including the cognitive accessibility of climate-related knowledge, affective responses and related judgments (problem seriousness and personal concern), and certainty that the phenomenon exists. The present article reviews this emerging work, drawing on framing theory and related social-cognitive models of information processing to shed light on the possible mechanisms that underlie labeling effects. In doing so, the article highlights the value of distinguishing between labeling and framing effects in communication research and theory, and calls for additional research into the boundary conditions of these and other labeling effects in science communication.
Catrien Termeer, Arwin van Buuren, Art Dewulf, Dave Huitema, Heleen Mees, Sander Meijerink, and Marleen van Rijswick
Adaptation to climate change is not only a technical issue; above all, it is a matter of governance. Governance is more than government and includes the totality of interactions in which public as well as private actors participate, aiming to solve societal problems. Adaptation governance poses some specific, demanding challenges, such as the context of institutional fragmentation, as climate change involves almost all policy domains and governance levels; the persistent uncertainties about the nature and scale of risks and proposed solutions; and the need to make short-term policies based on long-term projections. Furthermore, adaptation is an emerging policy field with, at least for the time being, only weakly defined ambitions, responsibilities, procedures, routines, and solutions. Many scholars have already shown that complex problems, such as adaptation to climate change, cannot be solved in a straightforward way with actions taken by a hierarchic or monocentric form of governance. This raises the question of how to develop governance arrangements that contribute to realizing adaptation options and increasing the adaptive capacity of society. A series of seven basic elements have to be addressed in designing climate adaptation governance arrangements: the framing of the problem, the level(s) at which to act, the alignment across sectoral boundaries, the timing of the policies, the selection of policy instruments, the organization of the science-policy interface, and the most appropriate form of leadership. For each of these elements, this chapter suggests some tentative design principles. In addition to effectiveness and legitimacy, resilience is an important criterion for evaluating these arrangements. The development of governance arrangements is always context- and time-specific, and constrained by the formal and informal rules of existing institutions.
For several decades, the Sahelian countries have been facing continuing rainfall shortages which, coupled with anthropogenic factors, have severely disrupted the region's ecological balance, leading the area into an inexorable process of desertification and land degradation. The Sahel faces a persistent problem of climate change, with high rainfall variability and frequent droughts, and this is one of the major drivers of the population's vulnerability in the region. Communities struggle against severe land degradation processes and experience an unprecedented loss of productivity that hampers their livelihoods and puts them among the world's populations most vulnerable to climate change. In response to severe land degradation, 11 countries of the Sahel agreed to work together to address the policy, investment, and institutional barriers to establishing a land-restoration program that addresses climate change and land degradation. The program is called the Pan-Africa Initiative for the Great Green Wall (GGW). The initiative aims to help halt desertification and land degradation in the Sahelian zone, improve the lives and livelihoods of smallholder farmers and pastoralists in the area, and help its populations develop effective adaptation strategies and responses through the use of tree-based development programs. To make the GGW initiative successful, member countries have established a coordinated and integrated effort from the government level down to local scales and have engaged with many stakeholders. Planning, decision-making, and actions on the ground are guided by participation and engagement, informed by policy-relevant knowledge, to implement a set of scalable land-restoration practices and to address the drivers of land-use change in various human-environmental contexts. In many countries, activities specific to achieving the GGW objectives have been initiated in the last five years.
Some of the major misconceptions in the United States about climate change—such as the focus on scientific uncertainty, the “debate” over whether climate change is caused by humans, and pushback about how severe the consequences might be—can be seen as communications battles. An interesting area within communications is the contrasting use of guilt and shame for climate-related issues. Guilt and shame are social emotions (along with embarrassment, pride, and others), but guilt and shame are also distinct tools. On the one hand, guilt regulates personal behavior, and because it requires a conscience, guilt can be used only against individuals. Shame, on the other hand, can be used against both individuals and groups by calling their behavior out to an audience. Shaming allows citizens to express criticism and social sanctions, attempting to change behavior through social pressure, often because the formal legal system is not holding transgressors accountable. Through the use of guilt and shame we can see manifestations of how we perceive the problem of climate change and who is responsible for it. For instance, in October 2008, Chevron, one of the world’s largest fossil fuel companies, placed advertisements around Washington, DC, public transit stops featuring wholesome-looking, human faces with captions such as “I will unplug things more,” “I will use less energy,” and “I will take my golf clubs out of the trunk.” Six months later, DC activists reworked the slogans by adding to each the phrase “while Chevron pollutes.” This case of corporate advertising and subsequent “adbusting” illustrates the contrast between guilt and shame in climate change communication. Guilt has tended to align with the individualization of responsibility for climate change and has been primarily deployed over issues of climate-related consumption rather than other forms of behavior, such as failure to engage politically. 
Shame has been used, largely by civil society groups, as a primary tactic against fossil fuel producers, peddlers of climate denial, and industry-backed politicians.
Hail has been identified as the largest contributor to insured losses from thunderstorms globally, costing the insurance industry billions of dollars each year. Yet, of all precipitation types, hail is probably subject to the largest uncertainties. Some might go so far as to argue that observing and forecasting hail is as difficult as, if not more difficult than, forecasting tornadoes. The reasons why hail is so challenging are many and varied, and they are reflected in the wide variety of shapes, sizes, and internal structures that hailstones display. There is also an important clue in this diversity: nature is telling us that hail can grow by following a wide variety of trajectories within thunderstorms, each having a unique set of conditions. It is because of this complexity that modeling hail growth and forecasting hail size is so challenging. Consequently, it is understandable that predicting the occurrence and size of hail might seem an impossible task.
Through persistence, ingenuity and technology, scientists have made progress in understanding the key ingredients and processes at play. Technological advances mean that we can now, with some confidence, identify those storms that very likely contain hail and even estimate the maximum expected hail size on the ground hours in advance. Even so, there is still much we need to learn about the many intriguing aspects of hail growth.
Charles A. Doswell III
Convective storms are the result of a disequilibrium created by solar heating in the presence of abundant low-level moisture, resulting in the development of buoyancy in ascending air. Buoyancy typically is measured by the Convective Available Potential Energy (CAPE) associated with air parcels. When CAPE is present in an environment with strong vertical wind shear (winds changing speed and/or direction with height), convective storms become increasingly organized and more likely to produce hazardous weather: strong winds, large hail, heavy precipitation, and tornadoes.
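CAPE is conventionally defined as the vertical integral of parcel buoyancy, g(Tp − Te)/Te, over the layer in which the lifted parcel is warmer than its environment. As a rough illustrative sketch (not part of the article; the sounding values below are made-up, idealized numbers chosen only for demonstration), it can be estimated numerically from a temperature profile:

```python
# Illustrative sketch: estimating CAPE by trapezoidal integration of
# parcel buoyancy. The height/temperature profiles are hypothetical
# idealized values, not observed data.

g = 9.81  # gravitational acceleration, m/s^2

# Heights (m) and temperatures (K) for the environment and a lifted parcel.
heights  = [1000, 2000, 3000, 4000, 5000, 6000]
t_env    = [285.0, 279.0, 273.0, 267.0, 261.0, 255.0]
t_parcel = [287.0, 282.0, 277.0, 271.0, 264.0, 256.0]

def cape(heights, t_parcel, t_env):
    """Integrate g * (Tp - Te) / Te over levels where the parcel is
    positively buoyant (Tp > Te), using the trapezoidal rule."""
    total = 0.0
    for i in range(len(heights) - 1):
        dz = heights[i + 1] - heights[i]
        b_lo = g * max(t_parcel[i] - t_env[i], 0.0) / t_env[i]
        b_hi = g * max(t_parcel[i + 1] - t_env[i + 1], 0.0) / t_env[i + 1]
        total += 0.5 * (b_lo + b_hi) * dz
    return total  # J/kg

print(f"CAPE ~ {cape(heights, t_parcel, t_env):.0f} J/kg")
```

For this idealized profile the integral comes out to a few hundred J/kg, i.e., modest instability; severe-storm environments can exceed several thousand J/kg. Operational calculations use virtual temperature and full soundings, which this toy profile omits.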
Because of their associated hazards and their impact on society, a need arose in some nations (notably, the United States) for forecasts of convective storms. Pre-20th-century efforts to forecast the weather were hampered by a lack of timely weather observations and by the mathematical impossibility of directly solving the equations governing the weather. The first severe convective storm forecaster was J. P. Finley, an Army officer, who was ordered to cease his forecasting efforts in 1887. Some Europeans, such as Alfred Wegener, studied tornadoes as a research topic, but there was no comparable effort to develop convective storm forecasting.
World War II aircraft observations revealed how limited the scientific understanding of convective storms was, leading to a research program called the Thunderstorm Project, which concentrated diverse observing systems to learn more about the structure and evolution of convective storms. Two Air Force officers, E. J. Fawbush and R. C. Miller, issued the first tornado forecasts in the modern era, and by 1953 the U.S. Weather Bureau had formed a Severe Local Storms forecasting unit (SELS, now designated the Storm Prediction Center of the National Weather Service). From the outset of these forecasting efforts, it was evident that more convective storm research was needed. SELS had an affiliated research unit called the National Severe Storms Project, which became the National Severe Storms Laboratory in 1963. Thus, research and operational forecasting have been partners from the outset of the forecasting efforts in the United States, with major scientific contributions from the late T. T. Fujita (originally from Japan), K. A. Browning (from the United Kingdom), R. A. Maddox, J. M. Fritsch, C. F. Chappell, J. B. Klemp, L. R. Lemon, R. B. Wilhelmson, R. Rotunno, M. Weisman, and numerous others. The result has been considerable growth in scientific understanding of convective storms, which has fed back into improvements in convective storm forecasting since it began in the modern era. In Europe, interest in both convective storm forecasting and research has produced a European Severe Storms Laboratory and an experimental severe convective storm forecasting group.
The development of computers during and after World War II made it possible to run numerical simulations of convective storms and numerical weather forecast models. These have been major elements in the growth of both understanding and forecast accuracy, and that growth can be expected to continue.