

Printed from Oxford Research Encyclopedias, Climate Science. Under the terms of the licence agreement, an individual user may print out a single article for personal use (for details see Privacy Policy and Legal Notice).

date: 25 September 2022

Countering Climate Science Denial and Communicating Scientific Consensus

  • John Cook, Global Change Institute, The University of Queensland; School of Psychology, University of Western Australia


Scientific agreement on climate change has strengthened over the past few decades, with around 97% of publishing climate scientists agreeing that human activity is causing global warming. While scientific understanding has strengthened, a small but persistent proportion of the public actively opposes the mainstream scientific position. A number of factors contribute to this rejection of scientific evidence, with political ideology playing a key role. Conservative think tanks, supported with funding from vested interests, have been and continue to be a prolific source of misinformation about climate change. A major strategy by opponents of climate mitigation policies has been to cast doubt on the level of scientific agreement on climate change, contributing to the gap between public perception of scientific agreement and the 97% expert consensus. This “consensus gap” decreases public support for mitigation policies, demonstrating that misconceptions can have significant societal consequences. While scientists need to communicate the consensus, they also need to be aware of the fact that misinformation can interfere with the communication of accurate scientific information. As a consequence, neutralizing the influence of misinformation is necessary. Two approaches to neutralize misinformation involve refuting myths after they have been received by recipients (debunking) or preemptively inoculating people before they receive misinformation (prebunking). Research indicates preemptive refutation or “prebunking” is more effective than debunking in reducing the influence of misinformation. Guidelines to practically implement responses (both preemptive and reactive) can be found in educational research, cognitive psychology, and a branch of psychological research known as inoculation theory. Synthesizing these separate lines of research yields a coherent set of recommendations for educators and communicators. Clearly communicating scientific concepts, such as the scientific consensus, is important, but scientific explanations should be coupled with inoculating explanations of how that science can be distorted.


  • Policy, Politics, and Governance
  • Future Climate Change Scenarios
  • Communication

The Strengthening Scientific Consensus on Human-Caused Global Warming

A number of studies have attempted to quantify the level of agreement among scientific experts about anthropogenic global warming (AGW), defined as the position that human activities are responsible for the rise in average global temperature since the mid-20th century. These include surveys of the scientific community (Carlton, Perry-Hill, Huber, & Prokopy, 2015; Doran & Zimmerman, 2009; Verheggen et al., 2014), analyses of public statements by scientists (Anderegg, Prall, Harold, & Schneider, 2010), and analyses of peer-reviewed papers about global climate change (Cook et al., 2013; Oreskes, 2004; Shwed & Bearman, 2010). Surveys that categorize different levels of expertise in climate science consistently find that higher expertise in climate science corresponds to higher consensus on AGW, as visualized in Figure 1. A number of studies find that for the group with the highest level of expertise, namely scientists who publish peer-reviewed climate research, 97% agree that humans are causing global warming.

Figure 1. Level of scientific agreement that humans are causing global warming among scientific groups of varying expertise in climate science.

Source: Cook et al. (2016).

In addition, scientific agreement on AGW has been observed to strengthen over time. Mathematical analysis of citation networks found that consensus in climate papers had formed in the early 1990s (Shwed & Bearman, 2010). Similarly, analysis of the abstracts of climate papers from 1991 to 2011 found that a strong consensus on AGW had already formed in the scientific literature by 1991 and strengthened over the next two decades (Cook et al., 2013).

The strengthening consensus is reflected in the statements of the Intergovernmental Panel on Climate Change (IPCC). Figure 2 shows how the IPCC has issued progressively stronger statements regarding the role of human activity in recent global warming. The Second Assessment Report stated, “[t]he balance of evidence suggests that there is a discernible human influence on the global climate” (Houghton et al., 1996). The Third Assessment Report in 2001 strengthened this position, reporting over 66% probability that “most of the warming observed over the last 50 years is attributable to human activities” (Houghton et al., 2001). The Fourth Assessment Report raised this to more than 90% confidence (Solomon et al., 2007). The strongest IPCC statement on attribution comes in the most recent Fifth Assessment Report, reporting over 95% confidence that human activity caused most of the global warming since the mid-20th century (Qin et al., 2014).

Figure 2. Strengthening IPCC statements on attribution of human activity to recent global warming.

Source: Houghton et al. (1996, 2001), Qin et al. (2014), and Solomon et al. (2007).

Despite the strengthening consensus in the scientific community and scientific literature, a small group consisting mostly of non-climate scientists persistently rejects mainstream climate science (Oreskes & Conway, 2011). In order to effectively address the impact of misinformation on climate literacy levels, one needs to understand the nature and drivers of climate science denial.

Attributes and Drivers of Climate Science Denial

In this article, climate science denial is defined as the rejection of the scientific consensus on the existence of global warming, the role of humanity in causing global warming, or the impacts of climate change on society and the environment. These three aspects of denial are labeled by Rahmstorf (2004) as trend, attribution, or impact skepticism (although in this article, the term “skepticism” is deemed inaccurate when used to characterize the process of science denial). Poortinga, Spence, Whitmarsh, Capstick, and Pidgeon (2011) found that the different stages of denial are strongly interrelated, with rejection of one aspect of climate science (e.g., attribution denial) associated with rejection of other aspects of climate science (e.g., trend denial). This interrelated denial results in an incoherent scientific understanding, with contradictory arguments simultaneously espoused by people who deny climate science (Cook, 2014; Lewandowsky, 2015).

The overarching categories of trend, attribution, and impact denial expand into a comprehensive array of arguments against the realities of climate change. An expanded version of the taxonomy (Figure 3) was adopted by Elsasser and Dunlap (2012), who analyzed climate misinformation published by syndicated conservative columnists. They found that the most popular argument adopted by conservative columnists from 2007 to 2010 was “there is no consensus.” As shall be examined later in this article, perceived consensus has been observed to be a strong predictor of perceptions about climate trends, attribution, and impacts.

Figure 3. A taxonomy of climate myths refuted in the online course Denial101x.

Source: Cook et al. (2015b).

Rejection of climate science is not uniform across the planet. One global survey found that climate science acceptance varied across countries, being lowest in Australia, New Zealand, Norway, and the United States (Tranter & Booth, 2015). Another survey found that acceptance of global warming was much lower in the United States and United Kingdom compared to countries such as Japan, Argentina, Italy, Sweden, Canada, and Germany (Pelham, 2009). A striking aspect of these findings is that self-reported knowledge about climate change did not always correlate with acceptance of AGW. For example, 97% of Americans report some knowledge about global warming, while only 49% agree that rising temperatures are a result of human activities. This implies that lack of knowledge is not the only factor driving rejection of AGW (Kahan et al., 2012b).

What are the other factors driving climate science denial? Gifford (2011) coined the term “dragons of inaction” to describe psychological barriers preventing people from being concerned about climate change. Gifford lists many dragons of inaction, including optimism bias (underestimating risk), pessimism bias (underestimating personal efficacy), and psychological distance (discounting events that are perceived to be far away).

A number of studies have found that political ideology is one of the dominant drivers of climate beliefs (Heath & Gifford, 2006; Kahan, Jenkins-Smith, & Braman, 2011; Lewandowsky, Oberauer, & Gignac, 2013b; Stenhouse et al., 2014). Political ideology has been measured in a variety of ways, whether it be party affiliation (Hardisty, Johnson, & Weber, 2010), the degree of support for free, unregulated markets (Heath & Gifford, 2006), or a score on a liberal–conservative scale (McCright, Dunlap, & Xiao, 2013) or on a two-dimensional scale of “hierarchical-individualist” versus “egalitarian-communitarian” (Kahan, Jenkins-Smith, & Braman, 2011).

While climate belief varies across countries, an affiliation with conservative political parties is a consistent predictor of skepticism (Tranter & Booth, 2015). Fundamentally, the psychological mechanism involved is not aversion to the problem of climate change but aversion to the proposed solutions to mitigate climate change. Accepting the scientific evidence that human activity is causing global warming is commonly framed as requiring behavioral and societal changes, such as increased regulation of industry. These types of changes are perceived to be inconsistent with conservative values, such as liberty or small government. This causal link was teased out in an experiment that presented regulation of polluting industries or nuclear energy as two possible solutions to climate change. Among political conservatives, the nuclear energy message had a positive effect on climate attitudes, while the regulation message caused a backfire effect, lowering acceptance of climate change (Campbell & Kay, 2014).


Political ideology plays a strong role in attitudes toward climate change, with cultural values influencing the formation of climate beliefs (Kahan, Jenkins-Smith, & Braman, 2011) as well as the selection of media and information sources (Feldman, Myers, Hmielowski, & Leiserowitz, 2014). Nevertheless, the positive effect of climate information (or conversely, the negative effect of misinformation) still plays a significant role in influencing climate literacy levels (Bedford, 2015). The next section offers a brief history of misinformation about climate change and the psychological impact of misinformation.

The Impact of Misinformation

Although climate change has become a highly polarized issue in countries such as the United States (McCright & Dunlap, 2011), this has not always been the case. President George H. W. Bush once pledged to “fight the greenhouse effect with the White House effect” (Peterson, 1989, p. A1). What transformed a bipartisan issue into a highly charged, polarized public debate? A major contributor to this transformation has been the strategic use of misinformation by various political groups and actors.

Conservative think tanks started producing climate change misinformation at prolific levels in the early 1990s (Jacques, Dunlap, & Freeman, 2008). A sharp increase in the number of misleading publications in the 1990s coincided with international efforts to reduce carbon emissions (McCright & Dunlap, 2000). At the same time, public skepticism about global warming increased, suggesting that the misinformation campaign had been effective (Nisbet & Myers, 2007). Allied with conservative groups was the fossil fuel industry, which campaigned to sow confusion about the environmental impact of fossil fuels (Jacques, Dunlap, & Freeman, 2008; Farrell, 2015a, 2015b). An analysis of 91 organizations that disseminate climate misinformation found that from 2003 to 2010, these groups received an average total income of over $900 million per year (Brulle, 2014), though this funding was provided for activities related to a broad range of policy issues, rather than exclusively climate change.

The scientific consensus has been a focal point for the misinformation campaign. Opponents of climate action have manipulated public perception of the expert consensus for more than two decades through active campaigns to manufacture doubt. In the early 1990s, a fossil fuel organization spent half a million dollars on a campaign to cast doubt on the consensus (Oreskes, 2010). An analysis of conservative op-eds, which are a prolific source of climate misinformation, found that the most frequently repeated myth was “there is no consensus” (Elsasser & Dunlap, 2012). Even as the scientific consensus continues to strengthen, conservative think tanks persist in denying the high level of agreement (Boussalis & Coan, 2016).

The public are further misinformed by the very nature of media coverage of climate change. The tendency of some media outlets historically to provide “balanced” coverage of issues even in the absence of a balance of evidence (rather than opinion) has resulted in disproportionate weight being given to a small minority of contrarian voices who dispute the scientific consensus on AGW (Boykoff & Boykoff, 2004; Painter & Ashe, 2012; Verheggen et al., 2014). Subsequent studies suggest that the mainstream U.S. press has in recent years overwhelmingly emphasized consensus views on climate science (Boykoff, 2007; Nisbet, 2011), yet a strong emphasis on false balance remains at Fox News and News Corp–owned newspapers worldwide (Feldman, Maibach, Roser-Renouf, & Leiserowitz, 2011; McKnight, 2010). Such falsely balanced articles and news presentations have been observed to lower the perceived risk of climate change and the perceived scientific consensus (Kortenkamp & Basten, 2015; Malka, Krosnick, Debell, Pasek, & Schneider, 2009; McCright, Charters, Dentzman, & Dietz, 2016).

Perceived Consensus as a Gateway Belief

Why have opponents of climate action expended so much effort on casting doubt on the scientific consensus? The deliberation behind this strategy is articulated in a 2002 memo from a political strategist, Frank Luntz, who advised Republican politicians that the way to lower public support for climate policies was to cast doubt on the consensus (Luntz, 2002):

Voters believe that there is no consensus about global warming in the scientific community. Should the public come to believe that the scientific issues are settled, their views about global warming will change accordingly. Therefore, you need to continue to make the lack of scientific certainty a primary issue in the debate.

Luntz’s market research has been borne out by subsequent psychological research. A number of studies have found that perceived consensus about AGW is an important “gateway belief,” which in turn influences a number of other beliefs and attitudes about climate change (Aklin & Urpelainen, 2014; Ding, Maibach, Zhao, Roser-Renouf, & Leiserowitz, 2011; Lewandowsky, Oberauer, & Gignac, 2013b; McCright, Dunlap, & Xiao, 2013; Stenhouse et al., 2014; van der Linden, Leiserowitz, Feinberg, & Maibach, 2015a; van der Linden, Leiserowitz, & Maibach, 2016). A survey of American Meteorological Society members found that perceived consensus was the strongest predictor of global warming views, followed by political ideology (Stenhouse et al., 2014). Among Republicans, perceived consensus is the strongest predictor of belief in global warming (Rolfe-Redding, Maibach, Feldman, & Leiserowitz, 2011). When people understand that climate scientists agree on AGW, they are more likely to accept that global warming is happening, that humans are causing global warming, and that the impacts are serious, and, importantly, they are more likely to support policies to mitigate climate change.

Thus, casting doubt on consensus has the effect of decreasing acceptance of climate change and reducing support for climate policy. Numerous surveys indicate the misinformation campaign targeting scientific consensus has been effective, with the public in many countries believing that there is significant disagreement among climate scientists about whether humans are causing global warming (Kohut, Keeter, Doherty, & Dimock, 2009). Only around one in 10 Americans correctly estimate that more than 90% of climate scientists agree that humans are causing global warming (Leiserowitz, Maibach, Roser-Renouf, Feinberg, & Rosenthal, 2015). Similarly, only 11% of the public in the United Kingdom are aware that nearly all scientists agree with the consensus (Comres, 2014), and a survey of 15 countries found the perceived consensus to be lower than the actual consensus across the board (University of Maryland, 2009).

The gap between the public perception of the consensus and the actual 97% consensus is a barrier delaying support for mitigation policies or, borrowing the metaphor of Gifford (2011), a significant “dragon of inaction.” Closing the consensus gap will remove a pivotal roadblock delaying climate action.

The Efficacy of Consensus Messaging

The role of perceived consensus as a gateway belief underscores the importance that scientists set the record straight by communicating the high level of agreement among climate scientists (Maibach, Myers, & Leiserowitz, 2014). Communicating the 97% consensus has been observed to significantly increase perceived consensus (Cook & Lewandowsky, 2016; Kotcher, Meyers, Maibach, & Leiserowitz, 2014) and increase acceptance of AGW (Bolsen, Leeper, & Shapiro, 2014; Lewandowsky et al., 2013a). In another domain, communicating the consensus about the safety of vaccination increases public support for vaccines (van der Linden, Clarke, & Maibach, 2015). Consensus messaging about climate change also has a neutralizing effect on worldview, causing a stronger increase in climate acceptance among conservatives (Kotcher et al., 2014; Lewandowsky et al., 2013a), although there are mixed results in this area, with one study finding polarization in response to consensus messaging among U.S. (but not Australian) participants (Cook & Lewandowsky, 2016).

Different methods of communicating scientific consensus have been tested experimentally. Among a range of textual variations (e.g., “97%,” “9 out of 10,” or “97 out of 100”), the most effective articulation of consensus was the phrase “[b]ased on the evidence, 97% of climate scientists have concluded that human-caused climate change is happening” (Maibach, Leiserowitz, & Gould, 2013). The pie chart form of communication shown in Figure 4 has been found to be one of the most effective visual communication methods in influencing perceptions that climate change is human-caused, will be harmful, and should be addressed, especially among conservatives (van der Linden, Leiserowitz, Feinberg, & Maibach, 2014).

Figure 4. Communicating the 97% consensus using a pie chart is an effective method of increasing acceptance of AGW. This infographic was created by SJI Associates. Note that while this pie chart referred to the 97% consensus among climate papers, the pie charts used in Maibach, Leiserowitz, and Gould (2013) referred to the 97% consensus among climate scientists.
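A pie chart of this kind is straightforward to reproduce. The sketch below, which assumes the matplotlib library is available, draws a two-segment chart from the 97%/3% split described above; the labels, colors, and file name are illustrative choices, not taken from the original SJI Associates infographic.

```python
# Minimal sketch of a 97% consensus pie chart (illustrative, not the
# original infographic). Assumes matplotlib is installed.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Percentages from the consensus studies discussed in the article
consensus = {
    "Agree humans are causing global warming": 97,
    "No position / reject AGW": 3,
}

fig, ax = plt.subplots(figsize=(5, 5))
ax.pie(
    consensus.values(),
    labels=consensus.keys(),
    autopct="%d%%",      # print the percentage on each wedge
    startangle=90,       # start the 97% wedge at the top
    colors=["#2c7fb8", "#d95f0e"],  # arbitrary illustrative colors
)
ax.set_title("97% of publishing climate scientists agree on AGW")
fig.savefig("consensus_pie.png", bbox_inches="tight")
```

The design choice matching the research finding is the single dominant wedge: the visual contrast between the 97% and 3% segments conveys the scale of agreement at a glance, which is the feature van der Linden et al. (2014) found effective.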

Objections to Consensus Messaging

The publication of, and subsequent public interest in, the 97% consensus found in Cook et al. (2013) has provoked an ongoing scholarly debate over the efficacy of consensus messaging. Such discourse is a valuable part of the scientific process, potentially leading to improved understanding of the psychology of consensus and an increased emphasis on evidence-based science communication.

One objection is that consensus messaging is an argument from authority, where “the credibility and authority of climate science is invoked as a means of persuasion” (Pearce, Brown, Nerlich, & Koteyko, 2015, p. 6). This argument highlights one potential limitation of appealing to expert opinion, which may come at the expense of educational interventions that increase critical thought and climate literacy. For example, an alternative approach of explaining the mechanism of the greenhouse effect has been observed to increase acceptance of climate change (Ranney & Clark, 2016). Similarly, increased climate literacy has been associated with increased levels of concern about climate change (Bedford, 2015).

However, the fallacy of argument from authority is bypassed in Maibach, Leiserowitz, and Gould (2013), which found that an effective version of consensus messaging emphasized the evidential foundation on which the consensus is based. In this context, it is important that communicators understand the purpose of communicating the scientific consensus, which is not put forward as “proof” of human-caused global warming. Rather, the case for consensus messaging is based on psychological research into how people think about complex scientific issues such as climate change (van der Linden et al., 2015a). In these situations, people rely on expert opinion as a heuristic, or mental shortcut, to inform their views (Petty, 1999). For example, van der Linden, Leiserowitz, Feinberg, and Maibach (2014) found that using a familiar metaphor for consensus (e.g., “If 97% of doctors concluded that your child is sick, would you believe them? 97% of climate scientists have concluded that human-caused climate change is happening.”) was effective in increasing understanding of the scientific consensus. Communication of the state of expert opinion is a reflection of the psychological reality that the lay public do not necessarily process evidence in the same manner or to the same depth as scientists.

In a second critique, Pearce et al. (2015, p. 6) argue that “attempts to substitute climate science for climate politics merely prolong the debate over whether or not the science is ‘sound.’” This argues that when policy is based on scientific evidence, science becomes a target for policy opponents. For example, the impact of the 2009 “Climategate” incident, where climate scientists’ emails were stolen and published online, proved a temporary distraction from efforts to communicate climate science prior to the international climate negotiations in Copenhagen (Anderegg & Goldsmith, 2014).

A counterargument is that the purpose of consensus messaging is precisely to defend against attempts by opponents of climate policy to cast doubt on the science, which has the purpose of distracting public discourse away from a focus on climate solutions (as was recommended in Luntz, 2002). Consensus messaging is one response to this tactic, with the aim of refocusing public discourse onto the topic of appropriate solutions to AGW. Were scientists to cease communicating the consensus, thus allowing the misinformation campaign targeting perceived consensus to continue unopposed, psychological research into the impact of misinformation (McCright, Charters, Dentzman, & Dietz, 2016) indicates that public confusion about the scientific understanding of AGW would deepen and delay further discussion of solutions.

Further, Pearce et al. (2015) argue that consensus messaging restricts the scope of public discussion to topics of settled science, instead suggesting celebration of areas of disagreement in climate science, using “dialogue which is inclusive of human values” (p. 6). Similarly, Hulme (2015) argues that because of uncertainties in future impacts, “[t]he scientific consensus on climate change thus becomes unhelpfully limiting” (p. 895). Focusing on the topic of the human role in influencing the climate system runs the “[danger] of elevating climate as a predictor of future social and ecological change without appreciating the deep contingency of these changes,” Hulme argues (2015, p. 895). These concerns over what is considered “climate reductionism” reflect the philosophy that overcoming denial is achieved by exploring a more diverse and inclusive range of policy options and by employing messengers representing a wider range of social backgrounds (Nisbet, 2014).

A counterargument is that consensus messaging does not preclude communicating broader policy discussion, such as risk-management frames, which emphasize consideration of future uncertainties. On the contrary, the two frames (consensus and risk management) are complementary. A potentially fruitful approach is to use the scientific consensus as a pivot to issues of legitimate disagreement regarding risk assessment or policy discussion. As argued by Corner, Lewandowsky, Phillips, and Roberts (2015, p. 6), “uncertainty at the frontiers of science should not prevent focusing on the ‘knowns,’ in order to establish a common understanding with your audience.” Scientific uncertainty can be exploited to inhibit policy discussion (Freudenburg, Gramling, & Davidson, 2008), necessitating that science communicators strike a balance between communicating uncertainty and consensus.

Another objection to consensus messaging is the assertion that the “public understanding of the climate issue has moved on” since the “pre-2009 world of climate change discourse” (Hulme, 2013). Along these lines, Kahan (2016) argues that “people with opposing cultural outlooks overwhelmingly agree about what ‘climate scientists think’ on numerous specific propositions relating to the causes and consequences of human-caused climate change.” In other words, the objection is that consensus messaging is unnecessary because the public (both conservatives and liberals) are already aware of the scientific consensus.

However, nationally representative surveys have found that public understanding of the scientific consensus is low (Cook & Lewandowsky, 2016; Leiserowitz et al., 2015; van der Linden, Leiserowitz, & Maibach, 2016). Perceived consensus also varies significantly across political affiliation, with only 5% of conservatives correctly understanding that the scientific consensus is above 90%, compared to 25% of liberals (Leiserowitz et al., 2015). Low perceived consensus is even found among U.S. science teachers, with the consequence that minority contrarian views are taught to students (Plutzer et al., 2016).

Conversely, Kahan (2015) argues that the lack of a dramatic shift in public perception of consensus over a period when a number of consensus studies have been published (e.g., Anderegg et al., 2010; Cook et al., 2013; Doran & Zimmerman, 2009; Oreskes, 2004; Verheggen et al., 2014) implies that consensus messaging is ineffective. Similarly, Anderegg (2010) argues that quantifying scientific agreement falls short of spurring political action. To explain this stasis hypothesis, Kahan cites research finding that people process evidence in a biased fashion, according to cultural values (Kahan, Jenkins-Smith, & Braman, 2011). Consequently, Kahan argues that consensus messaging results in increased acceptance of climate change among liberals as well as decreased acceptance of climate change among conservatives, with no significant net change in acceptance.

However, in relation to Kahan’s claims of polarization, there are contradictory research findings. Lewandowsky et al. (2013a), Kotcher et al. (2014), and van der Linden et al. (2016) find that consensus messaging has a neutralizing effect, with conservatives showing a greater increase in acceptance of climate change relative to liberals. In particular, van der Linden et al. (2016) comprehensively rules out the polarization hypothesis using a variety of measures of cultural values and social identification, such as a conservative–liberal scale, Fox News viewing habits, prior attitudes toward climate change, and social norm indicators. Cook and Lewandowsky (2016) also find that consensus messaging is neutralizing for Australian participants, although it has a polarizing effect for U.S. participants. But even in this case, negative effects only occurred for a small proportion of the population, with the overall effect on perceived consensus being positive.

Other research indicates that climate information need not be polarizing. Ranney et al. (2016) found that explaining the mechanism causing global warming (the greenhouse effect) or communicating seven climate statistics (e.g., the 97% consensus or 40% reduction in Arctic sea ice) increased acceptance of global warming across the political spectrum, with no observed polarization. Fernbach, Rogers, Fox, and Sloman (2013) found that asking people to provide a mechanistic explanation for global warming resulted in more moderated attitudes, indicating that deeper engagement with the climate issue can reduce polarization. Similarly, climate literacy measured by correctly identifying activities that cause an increase in greenhouse gases (Guy, Kashima, Walker, & O’Neill, 2014) or by true/false questions regarding the greenhouse effect, sea level rise, and climate/weather (Bedford, 2015) has been associated with a weaker relationship between individualistic ideology and acceptance of climate change.

While cultural cognition plays a significant role in informing climate attitudes, it is not the only factor influencing climate perceptions and attitudes. Cook and Lewandowsky (2016) measured perceived consensus as a function of free-market support, a belief that is a key dimension of political ideology. Figure 5 shows the strong influence of ideology, but even for participants with low free-market support, who possess no cultural reason to reject climate change, there is still a significant gap between perceived consensus and the actual 97% consensus. This indicates that a significant contribution to the consensus gap is a deficit of accurate information, a surplus of misinformation, or both.

Figure 5. Perception of scientific consensus about AGW versus free-market support (plotted from data in Cook & Lewandowsky, 2015).

Further, there is an apparent conflict between Kahan’s claim of the impotence of consensus messaging and the replicated experimental findings of the efficacy of consensus messaging. To reconcile the two sets of findings, Kahan (2015) and Pearce et al. (2015) argue that consensus messaging studies lack “external validity”; that is, they fail to simulate real-world conditions. There is a degree of merit to this argument. In addition to accurate information about the scientific consensus, the public are also exposed to misinformation casting doubt on the consensus. An “externally valid” experiment should therefore simulate real-world conditions in which accurate information and misinformation coexist.

The finding that the positive effect of accurate information can be undone by misinformation has been observed by McCright et al. (2016), who found that otherwise promising message frames about climate change were partially neutralized by misinformation. Given the persistent generation of misinformation about the consensus over the past few decades (Boussalis & Coan, 2016; Elsasser & Dunlap, 2012; Oreskes & Conway, 2011), this offers a cogent explanation of why public perception of consensus has not shifted appreciably over the last decade.

The issue of consensus messaging therefore cannot be understood adequately without including the misinformation campaign that seeks to confuse the public about the level of scientific agreement on AGW. Scientists and climate communicators need to address the influence of climate science denial in a manner informed by the social science research investigating how to neutralize the influence of misinformation.

Effective Refutation of Misinformation

While the generation of misinformation is a persistent problem, compounding the issue is the fact that misconceptions are also psychologically difficult to dislodge (for a review, see Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). Misconceptions continue to influence people’s reasoning after being retracted or corrected, even when people demonstrably understand, believe, and later remember the retraction (Ecker, Lewandowsky, & Tang, 2010). This persistence of retracted misinformation in people’s reasoning is known as the continued influence effect (Johnson & Seifert, 1994). For example, if an initial assumption about a person (e.g., that they committed a crime) later turns out to be incorrect, the initial invalid assumption will still affect people’s judgments about the person and their evaluation of the criminal incident (Ecker, Lewandowsky, Chang, & Pillai, 2014; Ecker, Lewandowsky, Fenton, & Martin, 2014).

Why does misinformation continue to influence people even after it has been retracted? People build mental models of how the world works and if an important part of that model is removed (e.g., by a retraction), the correction leaves behind a gap in that mental model. People prefer a complete model to an incomplete model, even when the complete model may contain some invalid elements. Consequently, when queried, people continue to rely on the misinformation rather than tolerate a gap in their understanding (Ecker, Lewandowsky, & Tang, 2010; Lewandowsky et al., 2012).

It follows that an effective way of reducing the continued influence effect is to fill the gap created by a retracted myth with a factual alternative (Johnson & Seifert, 1994). An instructive example is a court case where a suspect is exonerated by providing an alternative suspect. A factual alternative needs to explain the causal qualities of the retracted myth (Seifert, 2002). Ideally, the factual alternative should be less complicated and more fluent than the misinformation it dislodges (Chater & Vitanyi, 2003). Lombrozo (2007) found that simple explanations are judged more likely to be true than more complex explanations. Schwarz, Sanna, Skurnik, & Yoon (2007) also found that providing too many counterarguments can potentially backfire, strengthening initial conceptions. The tension between satisfying causal requirements and the need for simplicity is perhaps encapsulated in Einstein’s famous advice on scientific explanations: “Everything should be made as simple as possible but not simpler.”

The cognitive research into the qualities and implementation of refutations is succinctly summarized by Heath and Heath (2007), who recommend that communicators should “fight sticky ideas with stickier ideas” (p. 284). Sticky ideas are messages that are simple, compelling, and memorable. One example of a sticky message is a narrative such as a murder mystery that arouses curiosity and then satisfies it. The way to achieve this is by opening a gap in a person’s knowledge, then filling that gap with new information (Loewenstein, 1994). This approach lends itself to refutations which create a gap in a person’s mental model, then fill that gap with a factual alternative. The implication is that refutation of misinformation need not be seen merely as a necessary evil. If implemented properly, a refutation offers science communicators the opportunity to communicate the science in a compelling, sticky manner. Figure 6 shows how a sticky factual alternative fits into the structure of an effective refutation.

Figure 6. Recommended structure for a refutation.

An example of sticky messaging in the context of climate communication can be found at a website created to refute the myth that global warming stopped in 1998. Since that year, the planet has continued to accumulate heat at a rate of over 250 trillion joules per second (Nuccitelli, Way, Painting, Church, & Cook, 2012). To communicate this statistical summary of the planetary energy imbalance in a simpler and more concrete manner, it was expressed as the equivalent of four atomic bombs’ worth of heat every second. This information was made available as an animated widget for embedding in other blogs and websites.
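The arithmetic behind the metaphor can be checked directly. A minimal sketch, assuming the Hiroshima atomic bomb released roughly 6.3 × 10^13 joules (about 15 kilotons of TNT); this yield figure is an assumption, not stated in the text:

```python
# Back-of-envelope check of the "four atomic bombs per second" metaphor.
# Assumed value: the Hiroshima bomb released roughly 6.3e13 joules
# (~15 kilotons of TNT); this figure is not from the article itself.

heat_uptake_j_per_s = 250e12   # planetary heat accumulation, ~250 trillion J/s
hiroshima_j = 6.3e13           # assumed energy released by the Hiroshima bomb

bombs_per_second = heat_uptake_j_per_s / hiroshima_j
print(round(bombs_per_second))  # → 4
```

Expressing an abstract rate in familiar physical units is exactly the kind of “concrete” translation the sticky-messaging literature recommends.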

While the most important element of a debunking is a strong emphasis on a “sticky” factual alternative, it is still often necessary to explicitly refute the misinformation. One risk in mentioning the myth is that it makes people more familiar with the misinformation, and the more familiar people are with a piece of information, the more likely they are to think it is true (Schwarz et al., 2007). However, this risk can be mitigated by explicitly warning people that you are about to mention the myth (Ecker et al., 2010; Jou & Foreman, 2007; Schul, 1993). A preemptive warning puts the recipient “cognitively on guard,” reducing the chance that they will be influenced by the misinformation. Figure 6 shows how the explicit mention of misinformation should come only after the factual alternative and an explicit warning about the myth.

Presenting both the factual alternative and the myth creates a conflict, raising the question of how two contradictory pieces of information can co-exist. Another quality of an effective retraction is therefore an explanation of how or why the misinformation was generated in the first place and/or the motivations behind it (Lewandowsky, Stritzke, Oberauer, & Morales, 2005). Explaining how misinformation came about enables recipients to reconcile the contradiction between the misinformation and the correction (Seifert, 2002). A refutation answers this question, filling the “gap” created by the conflict, by explaining how the misinformation arose or the techniques the misinformer uses to distort the facts. As illustrated in Figure 6, a useful framework for explaining the techniques of denial is the set of five characteristics of science denial: fake experts, logical fallacies, impossible expectations, cherry picking, and conspiracy theories (Diethelm & McKee, 2009).

Additionally, graphics can play a powerful role in refutations. When a refutation conflicts with a person’s pre-existing beliefs, they will seize on ambiguities in the text to construct an alternative explanation. Clear, unambiguous graphics that specify and/or quantify the communicated evidence provide less opportunity for misinterpretation and counterarguing, as well as adding fluency to a rebuttal. For example, Republicans showed greater acceptance of global warming when shown a graph of temperature trends than when shown a content-equivalent textual description of global warming (Nyhan & Reifler, 2012).
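The recommended ordering (sticky fact first, then a warning, then a single statement of the myth, then the denial technique it employs) can be sketched as a simple template. The following Python sketch is illustrative only; the function and field names are hypothetical, not from the article:

```python
# Illustrative sketch (not from the article) of the refutation structure in
# Figure 6: lead with a sticky fact, warn before mentioning the myth, state
# the myth once, then explain the denial technique that makes it misleading.

# The five characteristics of science denial (Diethelm & McKee, 2009).
DENIAL_TECHNIQUES = ["fake experts", "logical fallacies",
                     "impossible expectations", "cherry picking",
                     "conspiracy theories"]

def build_refutation(fact, myth, technique, explanation):
    """Assemble a debunking in the recommended order."""
    if technique not in DENIAL_TECHNIQUES:
        raise ValueError(f"unknown denial technique: {technique}")
    return "\n".join([
        f"FACT: {fact}",
        "WARNING: a common myth misrepresents this evidence.",
        f"MYTH: {myth}",
        f"TECHNIQUE ({technique}): {explanation}",
    ])

print(build_refutation(
    fact="97% of publishing climate scientists agree humans cause global warming.",
    myth="There is no scientific consensus on climate change.",
    technique="fake experts",
    explanation="Petitions signed by unqualified signatories manufacture "
                "the appearance of expert disagreement.",
))
```

The point of the ordering is structural: the factual alternative and the warning both precede the myth, so the myth is never the first or most fluent element in the message.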


These recommended best practices for debunking can help reduce the influence of misinformation that has already been received. However, dislodging misinformation once it is lodged in people’s minds remains a difficult exercise (Lewandowsky et al., 2012). Another promising approach that circumvents this difficulty is to refute misinformation preemptively, before it is received (known as “prebunking”), which has been observed to be more effective in reducing the influence of misinformation (Bolsen & Druckman, 2015).

Inoculation: Prebunking Is the New Debunking

Research indicates that it is more efficient to prevent misinformation from taking root in the first place, rather than trying to undo the damage retroactively. For example, people who were suspicious of the U.S. government’s motives during the Iraq war were less vulnerable to misinformation about the war (Lewandowsky et al., 2005). Similarly, people’s pre-existing attitudes toward a company influenced how they interpreted charitable behavior by that company, with charity by a company with a bad reputation being seen as motivated by self-interest (Bae & Cameron, 2006).

Consequently, an alternative response to retroactively refuting misinformation is to preemptively neutralize the misinformation (prebunking). This approach is informed by inoculation theory (Compton, 2013; McGuire & Papageorgis, 1961), which applies the metaphor of vaccination to knowledge. Just as exposing people to a weak form of a virus builds resistance to a future encounter with that virus, exposing people to a refuted form of a myth conveys resistance to persuasive misinformation. This occurs by equipping people with counterarguments that expose the logical fallacies contained in the misinforming arguments. As a result, they are better able to recognize and dismiss flawed or misleading arguments. Inoculating messages have been observed to convey resistance to misinformation more effectively than “straight science” messages that don’t explicitly address misinformation (Banas & Rains, 2010).

To illustrate, Bolsen and Druckman (2015) found that preemptive warnings about politicizing science can counteract the effects of politicization. By politicization, they mean “emphasizing the inherent uncertainty of science to cast doubt on the existence of scientific consensus” (p. 747), which is to be distinguished from misinformation, which is false information. Subtle distinctions aside, this research was of particular note as it compared the relative efficacy of prebunking versus debunking (refuting the myth after the misinformation has been received) and found that prebunking was more effective in reducing the influence of the misinformation.

An inoculating message requires two elements. First, it should explicitly warn of the threat of misinformation. Second, it should contain refutations of the arguments adopted by the misinformation. Using misinformation about the scientific consensus as an example, an inoculating message could warn of the existence of arguments casting doubt on the scientific consensus on human-caused global warming, then explain the techniques used by these arguments (such as the fallacy of “fake experts”). Armed with the counterarguments enabling one to perceive the misleading nature of misinformation, people acquire resistance and are less vulnerable to being persuaded by the misinformation.

The research into inoculation offers promising avenues for science communicators. Inoculation interventions seem to shift people from a shallow, heuristic mode of thinking to a more considered approach to information processing (Kahneman, 2003). This idea is consistent with the suggestion that science communicators should not just address the information deficit—they must also address the “wisdom deficit,” where “cognitively sophisticated educators can provide the tools that help the public better evaluate the evidence” (Clark, Ranney, & Felipe, 2013, p. 2071). Clark, Ranney, and Felipe (2013) experimentally tested mechanistic explanations of the greenhouse effect to demonstrate the efficacy of promoting a richer understanding of the concept, while also referencing the communication tools and techniques listed in Lewandowsky et al. (2012) for correcting misinformation. Examples of communication techniques include providing factual alternatives to displace refuted myths, fostering healthy skepticism about misinformation sources, and framing evidence in a worldview-affirming manner.

Misconception-Based Learning: Inoculation in an Educational Context

The notion that inoculation stimulates people to engage at a deeper level with scientific information also resonates with a line of educational research known as misconception-based learning. This research finds that teaching science by refuting misconceptions about the science stimulates more cognitive effort and higher engagement with the content, resulting in greater learning gains compared to lessons that do not address misconceptions (Muller, Bewes, Sharma, & Reimann, 2007; Muller, Sharma, & Reimann, 2008).

Correcting scientific misconceptions is an important part of science education. As Osborne (2010) aptly put it, “[c]omprehending why ideas are wrong matters as much as understanding why other ideas might be right.” The approach of addressing misconceptions in an educational context has been referred to in various ways, such as misconception-based learning (McCuin, Hayhoe, & Hayhoe, 2014), agnotology-based learning (Bedford, 2010), or refutational text (Tippett, 2010).

Misconception-based learning involves lessons that directly address and refute misconceptions as well as explain factual information, in contrast to standard lessons that teach the facts without explicitly addressing misconceptions. For example, one myth regarding the carbon cycle is that anthropogenic carbon dioxide (CO2) emissions are inconsequential because they are small in magnitude compared to natural CO2 emissions. A misconception-based learning approach might explain the natural balance inherent in the carbon cycle, with natural CO2 emissions roughly balanced by natural CO2 absorptions, and how anthropogenic CO2 emissions have upset the natural balance. Thus the technique employed by the myth is “cherry picking,” failing to consider the role of natural CO2 absorptions in the carbon cycle. Misconception-based learning has been shown in a number of studies to be one of the most effective means of reducing misconceptions (Kowalski & Taylor, 2009; Muller et al., 2008; Tippett, 2010). This educational approach also achieves long-term conceptual change, lasting from weeks to several months (Guzzetti, Snyder, Glass, & Gamas, 1993).
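The carbon-cycle example above reduces to simple flux arithmetic. The round figures below are illustrative assumptions (in gigatonnes of carbon per year), not numbers from the article, but they show why the cherry-picked comparison misleads:

```python
# Illustrative round numbers (assumptions, not figures from the article)
# showing why comparing human emissions only to natural emissions is
# cherry picking: natural fluxes are large but roughly balanced, so the
# small human flux is what accumulates.

natural_emissions   = 210.0  # GtC/yr, illustrative
natural_absorptions = 215.0  # GtC/yr, illustrative (sinks also take up
                             # part of the human contribution)
human_emissions     = 9.0    # GtC/yr, illustrative

# Cherry-picked comparison: the human flux looks negligible (~4% of natural).
fraction_of_natural = human_emissions / natural_emissions

# Full budget: the net atmospheric gain is entirely attributable to the
# human contribution upsetting the natural balance.
net_accumulation = natural_emissions + human_emissions - natural_absorptions

print(f"{fraction_of_natural:.0%} of natural emissions")   # ~4%
print(f"net atmospheric gain: {net_accumulation} GtC/yr")  # 4.0 GtC/yr
```

A misconception-based lesson can walk students through exactly this budget: the myth quotes only the first comparison and omits the absorptions line.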

Part of the power of misconception-based learning is that it not only imparts content concepts, it also addresses epistemological concepts, exploring how knowledge is produced. While both content and epistemology are necessary to bring about lasting conceptual change, education has tended to focus on the former due to the difficult challenge of teaching the latter (Posner, Strike, Hewson, & Gertzog, 1982). Misconception-based learning increases students’ argumentative skills (Kuhn & Crowell, 2011) and encourages students to assess evidence, thus raising critical thinking (Berland & Reiser, 2009; Ecker, Swire, & Lewandowsky, 2014; Kuhn & Crowell, 2011). Students are more interested in refutational texts compared to traditional textbooks (Mason, Gava, & Boldrin, 2008).

Just as the structure of debunking lends itself to compelling, “sticky” science communication, misconception-based learning offers a powerful method of science education. One might thus argue (taking a glass-half-full perspective) that the existence of misinformation about climate change presents an educational opportunity.

The opportunities inherent in misconception-based learning are already being applied in the classroom. One negative influence on climate literacy levels is the “teach the controversy” approach, in which both sides of a scientific debate are presented on issues such as climate change and evolution. A survey of U.S. science teachers found that, among those who taught climate change, 31% emphasized both the scientific consensus on human-caused global warming and the view that many scientists believe global warming is due to natural causes (Plutzer et al., 2016). However, teachers have also repurposed the “teach the controversy” framing in order to educate middle and high school students on climate change (Colston & Vadjunec, 2015). Misconception-based learning is also being applied at the tertiary level, with Bedford (2010) and Cook, Bedford, and Mandia (2014) describing classroom-based case studies. The case study described in Bedford (2010) had students at a university in northern Utah assess the veracity of Michael Crichton’s novel State of Fear (Crichton, 2004), which features a group of ecoterrorists fabricating a series of disasters to be blamed on global warming, with Crichton seamlessly weaving misinformation that casts doubt on climate science into the book’s narrative. Students were instructed to engage with the arguments in the book and critically argue their own position. Another case study, documented in Cook et al. (2014) and based in a New York community college, involved a research paper assignment requiring students to refute a climate myth of their choosing, taken from a website that refutes climate misinformation with peer-reviewed scientific research. Students were instructed to conform to the structure of an effective debunking according to the psychological research outlined above, also summarized in The Debunking Handbook (Cook & Lewandowsky, 2011).

Finally, a University of Queensland Massive Open Online Course (MOOC), Making Sense of Climate Science Denial (Denial101x), implemented the misconception-based learning approach, reaching over 21,000 students from over 160 countries (Cook et al., 2015b). MOOCs are particularly powerful tools, allowing educators to reach potentially hundreds of thousands of students through interactive online systems and community-based forums. The MOOC platform also allows comprehensive collection of data on student behavior and learning gains as students navigate through a course. These data enable instructors to identify strengths and weaknesses in online material, supporting iterative development that increases the efficacy of their courses.


Conclusion

Climate science denial and misinformation have a damaging impact on public perceptions of climate change and on climate literacy levels, with a consequent decrease in support for mitigation policies. It is therefore important that scientists, communicators, and educators adopt an evidence-based response to science denial. Psychological research offers a number of guidelines for developing refutations that effectively reduce the influence of misinformation.

Nevertheless, there remain many challenges in further exploring the psychology of misinformation and refining practical interventions (see Cook, Ecker, & Lewandowsky, 2015a for an overview of anticipated future lines of research). Developing a better understanding of the confounding role of worldview in influencing climate attitudes and amplifying the impact of misinformation is one of the greatest challenges for researchers.

While a growing body of experimental evidence supports the efficacy of consensus messaging, scholarly debate over consensus messaging is expected to continue. One possible area of investigation is the effectiveness of combining consensus messaging with policy-related information or with information about different mitigation technologies. Another is the relative efficacy of consensus messaging versus other forms of scientific explanation (e.g., presentation of empirical evidence for AGW), its performance when paired with competing climate denial messages (e.g., McCright et al., 2016), and possible interactions between the various types of messaging.

A relatively neglected area of climate communication research is the impact of misinformation and ways to neutralize its influence. Machine learning techniques are now being used to analyze large bodies of text, gleaning insights into misinformation content and networks (see, e.g., Boussalis & Coan, 2016). Further investigation into practical refutation techniques is required, particularly testing the interaction of different climate messages delivered by a range of messengers to a variety of audiences. Initial research, building on decades of work on inoculation theory, has found that inoculation against climate misinformation is an effective intervention; this intervention type warrants further study, with the aim of developing specific recommendations for communicators.

Similarly, while decades of research have established the efficacy of misconception-based learning, there is little empirical research into this teaching approach specific to climate change at either the secondary or tertiary level. Tippett (2010) laments the rarity of misconception-based learning material in textbooks. While such resources do exist in textbook form (Bedford & Cook, 2016) as well as in online video form (Cook et al., 2015b), practical application, empirical testing, and iterative development of such educational resources are required.

Psychological research points to promising interventions for closing the consensus gap and reducing the influence of misinformation. Particularly effective is prebunking, which takes the form of inoculation against misinformation. A practical and powerful way to implement inoculation is misconception-based learning, which teaches scientific concepts by directly addressing and refuting misconceptions. Future research and practical application should further test and refine these communication techniques.


Acknowledgments

Thanks to Ullrich Ecker and Stephan Lewandowsky for their helpful comments on this document.

Further Reading

  • Cook, J., & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.
  • Marshall, G. (2014). Don’t even think about it: Why our brains are wired to ignore climate change. New York: Bloomsbury Publishing USA.
  • Oreskes, N., & Conway, E. M. (2011). Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming. New York: Bloomsbury Publishing USA.

References


  • Aklin, M., & Urpelainen, J. (2014). Perceptions of scientific dissent undermine public support for environmental policy. Environmental Science & Policy, 38, 173–177.
  • Anderegg, W. R. (2010). Moving beyond scientific agreement. Climatic Change, 101(3), 331–337.
  • Anderegg, W. R. L., & Goldsmith, G. R. (2014). Public interest in climate change over the past decade and the effects of the “climategate” media event. Environmental Research Letters, 9(5), 1–8, 054005.
  • Anderegg, W. R. L., Prall, J. W., Harold, J., & Schneider, S. H. (2010). Expert credibility in climate change. Proceedings of the National Academy of Sciences of the United States of America, 107, 12107–12109.
  • Bae, J., & Cameron, G. T. (2006). Conditioning effect of prior reputation on perception of corporate giving. Public Relations Review, 32(2), 144–150.
  • Banas, J. A., & Rains, S. A. (2010). A meta-analysis of research on inoculation theory. Communication Monographs, 77(3), 281–311.
  • Bedford, D. (2010). Agnotology as a teaching tool: Learning climate science by studying misinformation. Journal of Geography, 109(4), 159–165.
  • Bedford, D. (2015). Does climate literacy matter? A case study of US students’ level of concern about anthropogenic global warming. Journal of Geography, 115(4), 1–11.
  • Bedford, D., & Cook, J. (2016). Climate change: Myths and realities. Santa Barbara, CA: ABC-CLIO.
  • Berland, L. K., & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93(1), 26–55.
  • Bolsen, T., & Druckman, J. N. (2015). Counteracting the politicization of science. Journal of Communication, 65(5), 745–769.
  • Bolsen, T., Leeper, T. J., & Shapiro, M. A. (2014). Doing what others do: Norms, science, and collective action on global warming. American Politics Research, 42(1), 65–89.
  • Boussalis, C., & Coan, T. G. (2016). Text-mining the signals of climate change doubt. Global Environmental Change, 36, 89–100.
  • Boykoff, M. T. (2007). Flogging a dead norm? Newspaper coverage of anthropogenic climate change in the United States and United Kingdom from 2003 to 2006. Area, 39(4), 470–481.
  • Boykoff, M. T. (2008). Lost in translation? United States television news coverage of anthropogenic climate change, 1995–2004. Climatic Change, 86(1), 1–11.
  • Boykoff, M. T., & Boykoff, J. M. (2004). Balance as bias: Global warming and the US prestige press. Global Environmental Change, 14(2), 125–136.
  • Boykoff, M. T., & Mansfield, M. (2008). “Ye Olde Hot Aire”: Reporting on human contributions to climate change in the UK tabloid press. Environmental Research Letters, 3, 1–8.
  • Brulle, R. J. (2014). Institutionalizing delay: Foundation funding and the creation of US climate change counter-movement organizations. Climatic Change, 122(4), 681–694.
  • Campbell, T. H., & Kay, A. C. (2014). Solution aversion: On the relation between ideology and motivated disbelief. Journal of Personality and Social Psychology, 107(5), 809.
  • Carlton, J. S., Perry-Hill, R., Huber, M., & Prokopy, L. S. (2015). The climate change consensus extends beyond climate scientists. Environmental Research Letters, 10(9), 1–12, 094025.
  • Chater, N., & Vitanyi, P. (2003). Simplicity: a unifying principle in cognitive science. Trends in Cognitive Science, 7, 19–22.
  • Clark, D., Ranney, M. A., & Felipe, J. (2013). Knowledge helps: Mechanistic information and numeric evidence as cognitive levers to overcome stasis and build public consensus on climate change. In Proceedings of the 35th Annual Meeting of the Cognitive Science Society (pp. 2070–2075). Austin, TX: Cognitive Science Society.
  • Colston, N. M., & Vadjunec, J. M. (2015). A critical political ecology of consensus: On “Teaching Both Sides” of climate change controversies. Geoforum, 65, 255–265.
  • Compton, J. (2013). Inoculation theory. In The SAGE handbook of persuasion: Developments in theory and practice (pp. 220–236). Thousand Oaks, CA: SAGE.
  • ComRes. (2014). ECIU climate change poll, August 2014. ComRes. Retrieved from
  • Cook, J. (2014). The quantum theory of climate denial. Huffington Post. Retrieved from
  • Cook, J., Bedford, D., & Mandia, S. (2014). Raising climate literacy through addressing misinformation: Case studies in agnotology-based learning. Journal of Geoscience Education, 62(3), 296–306.
  • Cook, J., Ecker, U., & Lewandowsky, S. (2015a). Misinformation and how to correct it. In R. Scott & S. Kosslyn (Eds.), Emerging trends in the social and behavioral sciences (pp. 1–17). Hoboken, NJ: John Wiley & Sons.
  • Cook, J., & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.
  • Cook, J., & Lewandowsky, S. (2016). Rational irrationality: Modeling climate change belief polarization using Bayesian networks. Topics in Cognitive Science, 8(1), 160–179.
  • Cook, J., Nuccitelli, D., Green, S. A., Richardson, M., Winkler, B., Painting, R., et al. (2013). Quantifying the consensus on anthropogenic global warming in the scientific literature. Environmental Research Letters, 8(2), 1–7, 024024+.
  • Cook, J., Oreskes, N., Doran, P. T., Anderegg, W. R. L., Verheggen, B., Maibach, E. W., et al. (2016). Consensus on consensus: A synthesis of consensus estimates on human-caused global warming. Environmental Research Letters, 11(4), 1–7, 048002.
  • Cook, J., Schuennemann, K., Nuccitelli, D., Jacobs, P., Cowtan, K., Green, S., et al. (2015b, April). Denial101x: Making sense of climate science denial. edX. Retrieved from
  • Corner, A., Lewandowsky, S., Phillips, M., & Roberts, O. (2015). The uncertainty handbook. Bristol, U.K.: University of Bristol.
  • Crichton, M. (2004). State of fear. New York: HarperCollins.
  • Diethelm, P., & McKee, M. (2009). Denialism: What is it and how should scientists respond? European Journal of Public Health, 19, 2–4.
  • Ding, D., Maibach, E. W., Zhao, X., Roser-Renouf, C., & Leiserowitz, A. (2011). Support for climate policy and societal action are linked to perceptions about scientific agreement. Nature Climate Change, 1(9), 462–466.
  • Doran, P. T., & Zimmerman, M. K. (2009). Examining the scientific consensus on climate change. Eos, Transactions American Geophysical Union, 90(3), 22–23.
  • Ecker, U. K., Lewandowsky, S., Chang, E. P., & Pillai, R. (2014). The effects of subtle misinformation in news headlines. Journal of Experimental Psychology: Applied, 20(4), 323.
  • Ecker, U. K., Lewandowsky, S., Fenton, O., & Martin, K. (2014). Do people keep believing because they want to? Preexisting attitudes and the continued influence of misinformation. Memory & Cognition, 42(2), 292–304.
  • Ecker, U. K. H., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570–578.
  • Ecker, U. K. H., Lewandowsky, S., & Tang, D. T. W. (2010). Explicit warnings reduce but do not eliminate the continued influence of misinformation. Memory & Cognition, 38, 1087–1100.
  • Ecker, U. K., Swire, B., & Lewandowsky, S. (2014). Correcting misinformation—a challenge for education and cognitive science. In D. N. Rapp & J. L. G. Braasch (Eds.), Processing Inaccurate Information: Theoretical and Applied Perspectives from Cognitive Science and the Educational Sciences (pp. 13–38). Cambridge, MA: MIT Press.
  • Elsasser, S. W., & Dunlap, R. E. (2012). Leading voices in the denier choir: Conservative columnists’ dismissal of global warming and denigration of climate science. American Behavioral Scientist, 57(6), 754–776.
  • Farrell, J. (2015a). Corporate funding and ideological polarization about climate change. Proceedings of the National Academy of Sciences of the United States of America, 113(1), 92–97.
  • Feinberg, M., & Willer, R. (2013). The moral roots of environmental attitudes. Psychological Science, 24(1), 56–62.
  • Feldman, L., Maibach, E. W., Roser-Renouf, C., & Leiserowitz, A. (2011). Climate on cable: The nature and impact of global warming coverage on Fox News, CNN, and MSNBC. The International Journal of Press/Politics, 17(1), 3–31.
  • Feldman, L., Myers, T. A., Hmielowski, J. D., & Leiserowitz, A. (2014). The mutual reinforcement of media selectivity and effects: Testing the reinforcing spirals framework in the context of global warming. Journal of Communication, 64(4), 590–611.
  • Fernbach, P. M., Rogers, T., Fox, C. R., & Sloman S. A. (2013). Political extremism is supported by an illusion of understanding. Psychological Science, 24, 939–946.
  • Feygina, I., Jost, J. T., & Goldsmith, R. E. (2010). System justification, the denial of global warming, and the possibility of “system-sanctioned change.” Personality and Social Psychology Bulletin, 36, 326–338.
  • Freudenburg, W. R., Gramling, R., & Davidson, D. J. (2008). Scientific certainty argumentation methods (SCAMs): Science and the politics of doubt. Sociological Inquiry, 78(1), 2–38.
  • Gifford, R. (2011). The dragons of inaction: Psychological barriers that limit climate change mitigation and adaptation. American Psychologist, 66(4), 290.
  • Guy, S., Kashima, Y., Walker, I., & O’Neill, S. (2014). Investigating the effects of knowledge and ideology on climate change beliefs. European Journal of Social Psychology, 44(5), 421–429.
  • Guzzetti, B. J., Snyder, T. E., Glass, G. V., & Gamas, W. S. (1993). Promoting conceptual change in science: A comparative meta-analysis of instructional interventions from reading education and science education. Reading Research Quarterly, 28(2), 117–159.
  • Hardisty, D. J., Johnson, E. J., & Weber, E. U. (2010). A dirty word or a dirty world? Attribute framing, political affiliation, and query theory. Psychological Science, 21, 86–92.
  • Hart, P. S., & Nisbet, E. C. (2011). Boomerang effects in science communication: How motivated reasoning and identity cues amplify opinion polarization about climate mitigation policies. Communication Research, 39, 701–723.
  • Heath, C., & Heath, D. (2007). Made to stick: Why some ideas survive and others die. New York: Random House.
  • Heath, Y., & Gifford, R. (2006). Free-market ideology and environmental degradation—the case of belief in global climate change. Environment and Behavior, 38, 48–71.
  • Houghton, J. T., Ding, Y., Griggs, D. J., Noguer, M., van der Linden, P. J., Dai, X., et al. (Eds.). (2001). Climate change 2001: The scientific basis—contribution of working group I to the third assessment report of the intergovernmental panel on climate change. Cambridge, U.K.: Cambridge University Press.
  • Houghton, J. T., Meira Filho, L. G., Callander, B. A., Harris, N., Kattenberg, A., & Maskell, K. (Eds.). (1996). Climate change 1995: The science of climate change—contribution of working group I to the second assessment report of the intergovernmental panel on climate change. Cambridge, U.K.: Cambridge University Press.
  • Hulme, M. (2013, July 25). What’s behind the battle of received wisdoms? (Web blog comment). Retrieved from
  • Hulme, M. (2015). (Still) disagreeing about climate change: Which way forward? Zygon, 50(4), 893–905.
  • Jacques, P. J., Dunlap, R. E., & Freeman, M. (2008). The organisation of denial: Conservative think tanks and environmental scepticism. Environmental Politics, 17, 349–385.
  • Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When misinformation in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory and Cognition, 20, 1420–1436.
  • Jou, J., & Foreman, J. (2007). Transfer of learning in avoiding false memory: The roles of warning, immediate feedback, and incentive. Quarterly Journal of Experimental Psychology, 60, 877–896.
  • Kahan, D., Jenkins-Smith, H., & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk Research, 14, 147–174.
  • Kahan, D. M. (2015). Climate‐science communication and the measurement problem. Political Psychology, 36(S1), 1–43.
  • Kahan, D. M. (2016). Will people who are culturally predisposed to reject human-caused climate change *believe* “97% consensus” social marketing campaign messages? Nope. The Cultural Cognition Project at Yale Law School. Retrieved from
  • Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., et al. (2012b). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732–735.
  • Kahneman, D. (2003). Maps of bounded rationality: Psychology for behavioral economics. American Economic Review, 93(5), 1449–1475.
  • Kohut, A., Keeter, S., Doherty, C., & Dimock, M. (2009). Scientific achievements less prominent than a decade ago: Public praises science; scientists fault public, media. Pew Research Center for People & the Press.
  • Kortenkamp, K. V., & Basten, B. (2015). Environmental science in the media: Effects of opposing viewpoints on risk and uncertainty perceptions. Science Communication, 37(3), 287–313.
  • Kotcher, J., Meyers, T., Maibach, E., & Leiserowitz, A. (2014, May). Correcting misperceptions about the scientific consensus on climate change: Exploring the role of providing an explanation for the erroneous belief. Communication and the Good Life. Paper presented at 2014 annual conference of the International Communication Association, Seattle, WA.
  • Kowalski, P., & Taylor, A. K. (2009). The effect of refuting misconceptions in the introductory psychology class. Teaching of Psychology, 36, 153–159.
  • Kudrna, J., Shore, M., & Wassenberg, D. (2015). Considering the role of “need for cognition” in students’ acceptance of climate change & evolution. The American Biology Teacher, 77(4), 250–257.
  • Kuhn, D., & Crowell, A. (2011). Dialogic argumentation as a vehicle for developing young adolescents’ thinking. Psychological Science, 22(4), 545–552.
  • Leiserowitz, A., Maibach, E., Roser-Renouf, C., Feinberg, G., & Rosenthal, S. (2015). Climate change in the American mind: October 2015. New Haven, CT: Yale Project on Climate Change Communication.
  • Lewandowsky, S. (2015). “Alice through the Looking Glass” mechanics: The rejection of (climate) science. openDemocracy. Retrieved from
  • Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13, 106–131.
  • Lewandowsky, S., Gignac, G. E., & Vaughan, S. (2013a). The pivotal role of perceived scientific consensus in acceptance of science. Nature Climate Change, 3(4), 399–404.
  • Lewandowsky, S., Oberauer, K., & Gignac, G. E. (2013b). NASA faked the moon landing—therefore, (climate) science is a hoax: An anatomy of the motivated rejection of science. Psychological Science, 24(5), 622–633.
  • Lewandowsky, S., Stritzke, W. G. K., Oberauer, K., & Morales, M. (2005). Memory for fact, fiction, and misinformation: The Iraq War 2003. Psychological Science, 16, 190–195.
  • Loewenstein, G. (1994). The psychology of curiosity: A review and reinterpretation. Psychological Bulletin, 116(1), 75.
  • Lombrozo, T. (2007). Simplicity and probability in causal explanation. Cognitive Psychology, 55, 232–257.
  • Luntz, F. (2002). The environment: A cleaner, safer, healthier America. Luntz Research, Alexandria. Retrieved from
  • Maibach, E., Leiserowitz, A., & Gould, R. (2013, December). A campaign to convey the scientific consensus about human-caused climate change: Rationale, formative research, and campaign overview. AGU Fall Meeting Abstracts, 1, 1.
  • Maibach, E., Myers, T., & Leiserowitz, A. (2014). Climate scientists need to set the record straight: There is a scientific consensus that human‐caused climate change is happening. Earth’s Future, 2(5), 295–298.
  • Malka, A., Krosnick, J. A., Debell, M., Pasek, J., & Schneider, D. (2009). Featuring skeptics in news media stories about global warming reduces public beliefs in the seriousness of global warming. Woods Institute for the Environment, Stanford University, Technical Paper. Retrieved from
  • Mason, L., Gava, M., & Boldrin, A. (2008). On warm conceptual change: The interplay of text, epistemological beliefs, and topic interest. Journal of Educational Psychology, 100(2), 291.
  • McCright, A. M., Charters, M., Dentzman, K., & Dietz, T. (2016). Examining the effectiveness of climate change frames in the face of a climate change denial counter-frame. Topics in Cognitive Science, 8(1), 76–97.
  • McCright, A. M., & Dunlap, R. E. (2000). Challenging global warming as a social problem: An analysis of the conservative movement’s counter-claims. Social Problems, 47(4), 499–522.
  • McCright, A. M., & Dunlap, R. E. (2011). The politicization of climate change and polarization in the American public’s views of global warming, 2001–2010. The Sociological Quarterly, 52(2), 155–194.
  • McCright, A. M., Dunlap, R. E., & Xiao, C. (2013). Perceived scientific agreement and support for government action on climate change in the United States. Climatic Change, 119(2), 511–518.
  • McCuin, J. L., Hayhoe, K., & Hayhoe, D. (2014). Comparing the effects of traditional vs. misconceptions-based instruction on student understanding of the greenhouse effect. Journal of Geoscience Education, 62(3), 445–459.
  • McGuire, W. J., & Papageorgis, D. (1961). The relative efficacy of various types of prior belief-defense in producing immunity against persuasion. Public Opinion Quarterly, 26, 24–34.
  • McKnight, D. (2010). A change in the climate? The journalism of opinion at News Corporation. Journalism, 11(6), 693–706.
  • Miller, C. H., Ivanov, B., Sims, J., Compton, J., Harrison, K. J., Parker, K. A., et al. (2013). Boosting the potency of resistance: Combining the motivational forces of inoculation and psychological reactance. Human Communication Research, 39(1), 127–155.
  • Muller, D. A., Bewes, J., Sharma, M. D., & Reimann, P. (2007). Saying the wrong thing: Improving learning with multimedia by including misconceptions. Journal of Computer Assisted Learning, 24, 144–155.
  • Muller, D. A., Sharma, M. D., & Reimann, P. (2008). Raising cognitive load with linear multimedia to promote conceptual change. Science Education, 92(2), 278–296.
  • Nisbet, M. C. (2011). Climate shift: Clear vision for the next decade of public debate. American University, Washington, DC: School of Communication.
  • Nisbet, M. C. (2014). Engaging in science policy controversies: Insights from the U.S. debate over climate change. In Handbook of public communication of science and technology (2d ed., pp. 173–185). London: Routledge.
  • Nisbet, M. C., Maibach, E., & Leiserowitz, A. (2011). Framing peak petroleum as a public health problem: Audience research and participatory engagement in the United States. American Journal of Public Health, 101(9), 1620–1626.
  • Nisbet, M. C., & Myers, T. (2007). The polls—trends twenty years of public opinion about global warming. Public Opinion Quarterly, 71(3), 444–470.
  • Nuccitelli, D., Way, R., Painting, R., Church, J., & Cook, J. (2012). Comment on “Ocean heat content and Earth’s radiation imbalance. II. Relation to climate shifts”. Physics Letters A, 376(45), 3466–3468.
  • Nyhan, B., & Reifler, J. (2012). Misinformation and fact-checking: Research findings from social science. Washington, DC: New America Foundation.
  • Oreskes, N. (2004). The scientific consensus on climate change. Science, 306(5702), 1686.
  • Oreskes, N. (2010). My facts are better than your facts: Spreading good news about global warming. In M. S. Morgan & P. Howlett (Eds.), How do facts travel? (pp. 135–166). Cambridge, U.K.: Cambridge University Press.
  • Oreskes, N., & Conway, E. M. (2011). Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming. New York: Bloomsbury Publishing USA.
  • Osborne, J. (2010). Arguing to learn in science: The role of collaborative, critical discourse. Science, 328(5977), 463–466.
  • Painter, J., & Ashe, T. (2012). Cross-national comparison of the presence of climate scepticism in the print media in six countries, 2007–10. Environmental Research Letters, 7(4), 044005.
  • Pearce, W., Brown, B., Nerlich, B., & Koteyko, N. (2015). Communicating climate change: Conduits, content, and consensus. Wiley Interdisciplinary Reviews: Climate Change, 6(6), 613–626.
  • Pelham, B. W. (2009). Awareness, opinions about global warming vary worldwide. Gallup. Retrieved from
  • Peterson, C. (1989, May 9). Experts, OMB spar on global warming: “Greenhouse Effect” may be accelerating, scientists tell hearing. Washington Post, A1.
  • Petty, R. E., & Wegener, D. T. (1999). The elaboration likelihood model: Current status and controversies. In S. Chaiken & Y. Trope (Eds.), Dual-process theories in social psychology (pp. 37–72). New York: Guilford Press.
  • Plutzer, E., McCaffrey, M., Hannah, A. L., Rosenau, J., Berbeco, M., & Reid, A. H. (2016). Climate confusion among U.S. teachers. Science, 351(6274), 664–665.
  • Poortinga, W., Spence, A., Whitmarsh, L., Capstick, S., & Pidgeon, N. F. (2011). Uncertain climate: An investigation into public scepticism about anthropogenic climate change. Global Environmental Change, 21(3), 1015–1024.
  • Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A. (1982). Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education, 66(2), 211–227.
  • Qin, D., Plattner, G. K., Tignor, M., Allen, S. K., Boschung, J., Nauels, A., et al. (2014). Climate change 2013: The physical science basis. T. Stocker (Ed.). Cambridge, U.K.: Cambridge University Press.
  • Rahmstorf, S. (2004). The climate sceptics. Potsdam Institute for Climate Impact Research. Retrieved from
  • Ranney, M. A., & Clark, D. (2016). Climate change conceptual change: Scientific information can transform attitudes. Topics in Cognitive Science, 8(1), 49–75.
  • Rolfe-Redding, J., Maibach, E. W., Feldman, L., & Leiserowitz, A. (2011). Republicans and climate change: An audience analysis of predictors for belief and policy preferences. Social Science Research Network. 2026002. Retrieved from
  • Rowson, J. (2013). A new agenda on climate change: Facing up to stealth denial and winding down on fossil fuels. London: The RSA.
  • Schul, Y. (1993). When warning succeeds: The effect of warning on success in ignoring invalid information. Journal of Experimental Social Psychology, 29, 42–62.
  • Schwarz, N., Sanna, L. J., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127–161.
  • Seifert, C. M. (2002). The continued influence of misinformation in memory: What makes a correction effective? The Psychology of Learning and Motivation, 41, 265–292.
  • Shwed, U., & Bearman, P. S. (2010). The temporal structure of scientific consensus formation. American Sociological Review, 75(6), 817–840.
  • Solomon, S., Qin, D., Manning, M., Chen, Z., Marquis, M., Averyt, K. B., et al. (Eds.). (2007). Climate change 2007: The physical science basis: Working Group I contribution to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (Vol. 4). Cambridge, U.K.: Cambridge University Press.
  • Stenhouse, N., Maibach, E., Cobb, S., Ban, R., Bleistein, A., Croft, P., et al. (2014). Meteorologists’ views about global warming: A survey of American Meteorological Society professional members. Bulletin of the American Meteorological Society, 95(7), 1029–1040.
  • Tippett, C. D. (2010). Refutation text in science education: A review of two decades of research. International Journal of Science and Mathematics Education, 8(6), 951–970.
  • Tranter, B. K., & Booth, K. I. (2015). Scepticism in a changing climate: A cross-national study. Global Environmental Change: Human and Policy Dimensions, 33, 154–164.
  • University of Maryland. (2009). Public attitudes toward climate change: Findings from a multi-country poll. World Public Opinion Poll. Retrieved from
  • Van der Linden, S., Leiserowitz, A. A., Feinberg, G. D., & Maibach, E. W. (2015a). The scientific consensus on climate change as a gateway belief: Experimental evidence. PLOS ONE, 10(2), e0118489.
  • Van der Linden, S., Leiserowitz, A., & Maibach, E. (2016). Communicating the scientific consensus on human-caused climate change is an effective and depolarizing public engagement strategy: Experimental evidence from a large national replication study. Social Science Research Network. Retrieved from
  • Van der Linden, S. L., Leiserowitz, A. A., Feinberg, G. D., & Maibach, E. W. (2014). How to communicate the scientific consensus on climate change: Plain facts, pie charts, or metaphors? Climatic Change, 126(1–2), 255–262.
  • Van der Linden, S. L., Clarke, C. E., & Maibach, E. W. (2015b). Highlighting consensus among medical scientists increases public support for vaccines: Evidence from a randomized experiment. BMC Public Health, 15(1), 1–5.
  • Verheggen, B., Strengers, B., Cook, J., van Dorland, R., Vringer, K., Peters, J., et al. (2014). Scientists’ views about attribution of global warming. Environmental Science & Technology, 48(16), 8963–8971.