Strategies for Countering False Information and Beliefs about Climate Change

Summary and Keywords

Misperceptions about climate change are widespread, and efforts to correct them must be grounded in an understanding of the factors, both individual and social, that contribute to them. These factors can be organized into four broad categories: motivated reasoning, non-motivated information processing biases, social dynamics, and the information environment. Each type of factor is associated with a host of related strategies for countering false information and beliefs. Motivated biases can be reduced with affirmations, by attempting to depoliticize the issue, and via an evidentiary “tipping point.” Other cognitive biases highlight the importance of clarity, simplicity, and repetition. When correcting errors that contain an inaccurate causal explanation, it is also important to provide an alternative account of the event in question. Message presentation techniques can also facilitate updating beliefs. Beliefs have an important social dimension. Attending to these factors shows the importance of strategies that include: ensuring that lay people consistently have the tools that help them evaluate experts; promoting confidence among those who hold accurate beliefs; facilitating diverse, unsegregated social networks; and providing corrections from unexpected sources. Finally, the prevalence of misinformation in the information environment is highly problematic. Strategies that news organizations can employ include avoiding false balance, adjudicating among contradictory claims, and encouraging accuracy on the part of political elites via fact checking. New technologies may also prove an important tool: search engines that give preferential treatment to accurate information and automated recommendations of accurate information following exposure to inaccuracies both have the potential to change how individuals learn about climate change.

Keywords: misperceptions, beliefs, motivated reasoning, biased processing, metacognitive experiences, social networks, fact checking

Beliefs expressed by members of the public about climate change science frequently diverge from the conclusions reached by climate scientists (Funk & Rainie, 2015; Leiserowitz et al., 2014). The large number of citizens—especially Americans—who explicitly reject the consensus scientific view on climate change poses a serious challenge for scientists and policymakers striving to constrain its global consequences and human costs (Nisbet, 2014). Without a common understanding of the causes of climate change and the threat it poses, agreeing on a coordinated response to this global phenomenon has proven to be nearly impossible.

This article provides an overview of communication strategies for countering public misperceptions about climate change.1 Successfully correcting false information in this context means that individuals come to understand and/or accept the consensus views of climate scientists on the subject of climate change. This can take different forms, such as acknowledging that climate change is real and that human activity contributes to it, or adopting more accurate perceptions of the risks posed by a changing climate. There is no single solution to the problem of climate science misperceptions: inaccurate beliefs have numerous sources, and an effective response requires a correspondingly diverse mix of corrective strategies. For purposes of this review, individuals’ reasons for adopting science-inconsistent beliefs are divided into four broad categories: motivated reasoning, non-motivated information processing biases, social dynamics, and the information environment. Each of these categories is associated with a battery of strategies that can promote belief accuracy.

Motivated Reasoning

The first explanation for persistent, high-profile climate change misperceptions is individuals’ propensity to adopt attitude-congruent beliefs and to defend them against novel evidence and arguments using strategies that include counterargument (Lord, Ross, & Lepper, 1979), reinterpretation of evidence (Gaines, Kuklinski, Quirk, Peyton, & Verkuilen, 2007), source derogation (Byrne & Hart, 2009), priming of other belief-affirming cognitions (Taber & Lodge, 2006), negative affective responses (Lodge & Taber, 2013), and inferring the existence of other, unknown evidence (Prasad et al., 2009). This tendency, collectively referred to as motivated reasoning (Kunda, 1990), has been attributed to several distinct mechanisms.

Some scholars assert that beliefs are a valued source of self-identity and that resisting belief change is fundamentally ego-defensive (Cohen, Aronson, & Steele, 2000; see also Steele, 1988). When considering whether to update a belief based on new evidence, individuals must consider whether the potential change poses a self-identity threat. On this view, values and moral evaluations are expected to be powerful predictors of beliefs (Feinberg & Willer, 2013; Leiserowitz, Maibach, Roser-Renouf, Smith, & Dawson, 2013; Liu & Ditto, 2013). If the new belief calls into question an individual’s positive sense of self, the individual will be strongly motivated to resist it. For example, individuals will resist acknowledging the scientific consensus about the evidence for climate change if doing so would create a sense of incompetence or attitude instability. Similarly, accepting that fossil fuel consumption harms the planet will be difficult if causing such harm is seen as incompatible with being a good person. When the ego is threatened by new information, the individual will employ a wide variety of strategies to justify rejecting that information. Individuals will challenge belief-disconfirming evidence vigorously, finding fault at every turn, while unthinkingly accepting belief-affirming evidence (Ditto & Lopez, 1992; Lord et al., 1979; Munro et al., 2002; Prasad et al., 2009).

Reactance (Brehm & Brehm, 1981) is compatible with the ego protection–based explanation of motivated reasoning (Steele, 1988). Individuals are motivated to see themselves as capable of free choice, and messages that threaten this capability are met with counterargument and anger (Rains, 2013). Evidence of a changing climate limits free choice because it implies that denial is not a legitimate option, and rather than accept this constraint, individuals work to reject the belief-inconsistent evidence. Reactance-induced message rejection strategies, such as counterarguing and source derogation, can even contribute to a boomerang or backfire effect, whereby individuals embrace the original, inaccurate belief more strongly than they did prior to message exposure (e.g., Byrne & Hart, 2009; Nyhan & Reifler, 2010; Nyhan, Reifler, & Ubel, 2013).2 Such backfire effects are not, however, inevitable: in many instances, corrections work equally well regardless of an individual’s prior attitudes (Ecker, Lewandowsky, Fenton, & Martin, 2014).

Identity threats are also an important element of Kahan’s cultural cognition thesis, which asserts that “individuals are psychologically disposed to believe that behavior they (and their peers) find honorable is socially beneficial and behavior they find base [is] socially detrimental” (Kahan, Jenkins-Smith, & Braman, 2011, p. 148). For Kahan and his colleagues the threat is uniquely focused on social standing, not on self-identity. Individuals are motivated to believe claims that reinforce their important interpersonal connections (Bliuc et al., 2015; Kahan, 2010, p. 296), because these relationships have direct influence on their material and psychological welfare (Kahan, 2013, p. 409). Simply put, being ostracized from one’s social network is more costly than being wrong about climate change.

In contrast to identity-based explanations of motivated reasoning, political scientists Lodge and Taber (2013) put the emphasis on automaticity and affect. They argue that information processing is driven by instantaneous, unconscious responses, including emotions, stereotypes, and attractions. These automatic responses function as heuristics, shortcuts that fundamentally shape conscious thought, coloring evidence and arguments that come to mind and turning “reason” into rationalization. The more carefully one thinks about an issue, investing time and energy in the thought process, the more influential these biases tend to become. As a consequence, individuals who are predisposed to engage in systematic, effortful information processing are also more likely to exhibit bias (Kahan, 2013). Belief-consistent and identity-affirming cognitions typically elicit positive reactions, which then lead individuals to process new information in ways that favor their pre-existing attitudinal disposition, including their climate change beliefs.3

Whether it is explained in terms of ego protection, identity threats, or automaticity, the outcome of motivated reasoning is the same. Conservatives who are more familiar with the evidence related to the climate change debate, who feel more strongly about the issue, and who more actively deliberate about it are more likely to express views that are inconsistent with the scientific evidence than their less engaged counterparts (e.g., Joslyn & Haider-Markel, 2014; Kahan, 2013).4 Indeed, the mere mention of climate change can reduce individuals’ trust in science (Nisbet, Cooper, & Garrett, 2015).

Correction Strategies for Motivated Reasoning

There are a variety of approaches that may help counter false beliefs despite the fact that belief change can threaten individuals’ self-identity, their social values, and/or their social standing. The first of these is to precede corrective messages with identity affirmation, buffering the individual against the threat to self by promoting self-worth in another domain. Affirming the self has been shown to facilitate acceptance of threatening health messages (Cohen et al., 2000; Cohen & Sherman, 2014; Harris & Napper, 2005; Sherman, Nelson, & Steele, 2000), and one study suggests that affirmations can improve accuracy of climate change beliefs among Republicans (Nyhan & Reifler, 2011).5 Most of these tests, however, were conducted in lab settings, inducing affirmation by asking participants to recall and describe personal experiences that illustrate how their actions embody their most valued self-attributes.

This manipulation is likely to be too contrived to work outside the lab. Instead, it may be more fruitful to present accurate information in naturally affirming contexts, even if those contexts have nothing to do with climate science. Individuals whose recent accomplishments lead them to perceive themselves as good, competent, influential, etc. should be better able to examine belief-disconfirming information in an unbiased manner. A related strategy entails highlighting value-affirming implications of accurate beliefs. For example, individuals who deny that climate change is real are more accepting of the science when the consequences are framed as opportunities for private enterprise to succeed (e.g., Bain, Hornsey, Bongiorno, & Jeffries, 2012; Feygina, Jost, & Goldsmith, 2010) and when pollution is described as a threat to purity and cleanliness—characteristics that tend to be valued more by conservatives than liberals (Graham, Haidt, & Nosek, 2009)6—than in terms of the harms it causes (Feinberg & Willer, 2013).

The next two strategies stem from work on the cultural cognition thesis. The first of these is to disentangle knowledge from identity. When scientific evidence about climate change is understood as an implicit challenge to the competence of an individual’s social group, the individual has a powerful incentive to reject it. Separating climate change knowledge from social identity is no small task given the extreme ideological polarization evident in the climate change debate today (Leiserowitz et al., 2014). In Kahan’s words, “It would be glib to say ‘that’s all communicators have to do’ to dispel polarization over climate science. It is what they have to do. But how to do this is far from obvious” (2015, p. 30). He recommends that science communicators, educators, and public opinion scholars distinguish between (a) beliefs about climate change and (b) knowledge of what scientists say or what the evidence shows.7 Kahan characterizes this approach as separating who individuals are from what they know. The few studies that make such a distinction find that knowledge of experts’ empirical claims tends not to vary by ideology the way that beliefs do (Garrett, Weeks, & Neo, 2016; Kahan, 2015). This is consistent with findings in non-science contexts, which suggests that partisans differ most in their interpretations of information, not in what information they hold (Gaines et al., 2007). In practice, this might require reconceptualizing the goals of science communication. Rather than trying to persuade those who deny that humans are responsible for climate change, for instance, perhaps the objective should be to raise awareness of scientific evidence and scientists’ conclusions. It may be sufficient for a strong conservative to accept that scientists have concluded that climate change is a product of human activity even if he or she refuses to believe this conclusion. There is, however, a risk that allowing differences in belief to persist could be an obstacle to creating an effective policy response (see Nisbet, 2014). The second approach linked with the cultural cognition thesis is to focus on countering misperceptions among groups for which climate change beliefs are less politically charged, such as U.S. racial and ethnic minorities (Pearson & Schuldt, 2015). Science communicators may find that educational messaging is substantially more effective when the issue is less tied to social identity.

The next strategy derives from the observation that even motivated reasoners have a “tipping point” (Redlawsk, Civettini, & Emmerson, 2010). There is evidence that, for at least some issues, individuals will update their beliefs in the face of sufficient evidence. In the context of climate change, the scientific consensus on the subject may be enough to push many individuals over this tipping point (Anderegg, Prall, Harold, & Schneider, 2010; Benestad et al., 2015; Cook et al., 2013; Doran & Zimmerman, 2009; Myers, Maibach, Peters, & Leiserowitz, 2015).8 In one study, informing individuals that there is 97% agreement among climate scientists regarding anthropogenic climate change promoted more accurate beliefs and neutralized the effect of worldview on belief accuracy (Lewandowsky, Gignac, & Vaughan, 2013). Some have gone so far as to label the acceptance of scientific consensus a “gateway belief,” because of its powerful influence on beliefs about human-induced climate change and on subsequent support for mitigation policies (Cook, 2016; van der Linden, Leiserowitz, Feinberg, & Maibach, 2015).

It is, however, possible that the effectiveness of consensus messaging may be limited to lab-based experiments. Polling data indicate that public views on climate change have not changed significantly despite the numerous consensus-focused messaging campaigns conducted over the past several years (Kahan, 2015, p. 16). This suggests that consensus messaging alone is insufficient. In Redlawsk and colleagues’ (2010) work on the tipping point, the amount of incongruent information was critically important, not just the strength of any one piece of evidence. This suggests that a messaging campaign presenting a more diverse set of accurate information could be more effective and might prove a useful complement to consensus messages.

Some of the lessons drawn from research on motivated reasoning concern strategies that science communicators should avoid. Lodge and Taber (2013) caution against using messages that implore information consumers to be deliberative, to think carefully and thoroughly. The problem lies not in individuals’ resistance to such advice but in how they act on it. Deliberative individuals tend to spend more time weighing options, and this often translates into more extensive motivated reasoning (see also Kahan, 2013). The more thorough an individual’s thought process, the more opportunities there are for unconscious and automatic biases to influence judgment. Thus, strategies focused on encouraging those who reject climate change to engage in careful thought and deliberation are unlikely to work, especially among those with the strongest political identities.

The depoliticization strategy suggested by the cultural cognition thesis stands in stark contrast to the approach advocated by Mooney (2012), which appears to be grounded in significant part on his assertion that “[l]iberals are better at getting at the truth in complex, nuanced situations” (p. 267) than conservatives. The claim that liberals are cognitively superior goes beyond the available data. There is growing evidence that ideological differences reflect different cognitive styles (e.g., Jost & Amodio, 2012) and different information-seeking strategies (e.g., Garrett & Stroud, 2014; Shook & Fazio, 2009). There is even evidence consistent with the idea that conservatives may be more prone to some types of bias than liberals (e.g., Hibbing, Smith, & Alford, 2014; Iyengar, Hahn, Krosnick, & Walker, 2008; Nam, Jost, & Van Bavel, 2013). There is, however, no strong direct evidence that conservatives are uniquely prone to engage in motivated reasoning, even among studies specifically designed to test this proposition (Kahan, 2013; Mooney, 2012; Nisbet et al., 2015). Furthermore, although some of Mooney’s recommendations follow from, or are consistent with, those described here—especially his emphasis on creating more coherent, less ambiguous messaging on contentious topics—his underlying assumptions about innate ideologically driven cognitive differences have already provoked anger among conservatives (Mooney, 2012, p. 262), which may ultimately promote more motivated reasoning (Weeks, 2015). There is considerable risk that it will be hard to persuade Republicans to trust science communicators and update their beliefs based on science communication if those same science communicators label Republicans as incapable of reasoned scientific thought (for related arguments, see Hibbing, Smith, & Alford, 2013).

Conspiracist Ideation

Motivated reasoning also has consequences for climate change conspiracy theory beliefs. It is not uncommon for climate science critics to accuse scientists of participating in a sweeping conspiracy (e.g., Koteyko, Jaspal, & Nerlich, 2013). Furthermore, individuals who are more receptive to conspiracist ideation in other domains (e.g., NASA faked the moon landing) are also more likely to be skeptical of climate scientists’ conclusions (Lewandowsky, Oberauer, & Gignac, 2013). There is a growing body of evidence suggesting that conspiracy theory beliefs are a product of motivated reasoning, including the notable observation that these beliefs often increase with knowledge (Lewandowsky, Gignac, & Oberauer, 2013; Miller, Saunders, & Farhart, 2015) and with attitude strength (Pasek, Stark, Krosnick, & Tompson, 2015). Motivated reasoning, however, is only part of the explanation. For example, conspiracy ideation also increases when individuals lack control over a situation (Whitson & Galinsky, 2008). There is relatively little work examining how best to respond to conspiracy theories specifically, but to the extent that they operate like other expressions of motivated reasoning, the strategies outlined in the section “Correction Strategies for Motivated Reasoning” could be fruitful.

One alternative approach, which should be avoided, also bears mentioning. Sunstein and Vermeule promote what they call “cognitive infiltration” (2009, p. 224). The authors assert that conspiracy theories prosper because those who subscribe to such beliefs have shielded themselves from other viewpoints (i.e., conspiracists suffer from a knowledge deficit). The solution, in the authors’ view, is for government agents to join groups that support conspiracy theories, perhaps covertly, in order to raise doubts among conspiracists. To the extent that motivated reasoning is contributing to conspiracist ideation, however, an approach grounded in interpersonal conflict, and potentially deception, is more likely to solidify false beliefs than to correct them.

Non-Motivated Information-Processing Biases

Motivated reasoning is premised on the idea that updating attitudes and beliefs can be costly, and that in these instances individuals will often prefer to adopt an attitude-defensive strategy. For many people, however, beliefs about politically charged issues, including climate change, have relatively little consequence for their sense of self or their social identity. Politics just are not that important to most people (Delli Carpini & Keeter, 1996; Prior, 2007). There is good evidence that this holds for climate change beliefs, too. One in seven Americans says they “don’t know” if global warming is happening; three in ten say they could easily change their mind on the topic; and four in ten say they need “some” or “a lot” more information to form an opinion about it (Leiserowitz et al., 2014). Among these less engaged citizens, motivated reasoning is expected to be considerably less important.

Individuals who are only modestly invested in holding a specific climate change belief are still prone to other types of processing biases. And those who do engage in motivated reasoning are simultaneously susceptible to these other sources of error, though the effects are likely to be comparatively small. The information environment is ambiguous and complex, and there are many types of information problems that people are not very good at solving (Kahneman, 2011). As a consequence, people must often rely on intuition and cognitive heuristics, mental shortcuts that help them to reach adequate, if not optimal, solutions with reasonable speed and modest effort (Kuklinski & Quirk, 2000). Processing biases unrelated to identity threats work differently than motivated reasoning and therefore need to be countered differently. A review of these biases suggests a number of distinct corrective strategies.

Metacognition: Accessibility and Fluency

One important source of bias is the metacognitive experiences that accompany individuals’ information processing (Schwarz, 2012). For example, thoughts pertinent to a judgment task may come to mind easily or with difficulty; similarly, consideration of the information retrieved may be effortless or demanding. These metacognitive experiences moderate the influence of thought content on judgment: one metacognitive experience may lead to beliefs that are consistent with the information presented or the thoughts generated, while another might lead to an opposing conclusion (Schwarz, Sanna, Skurnik, & Yoon, 2007).

Consider accessibility. The more easily a thought comes to mind, the more accessible it is, and accessibility typically denotes fidelity between recall and the external world (Schwarz et al., 2007; Wänke, 2012). For example: Did you lock the door when you last left home? The more easily you can answer that question, the more confident in your response you are likely to be. This suggests one of the ways in which a public information campaign can be influential. Repeated exposure promotes accessibility, which can lead individuals to be more confident that the information is correct. The inverse is also true: when information is more difficult to recall, individuals are more likely to question its correspondence to the world (Schwarz et al., 2007, p. 135). This has intriguing implications. For example, the more difficulty an individual has generating arguments in support of a favored belief, the less confident the individual will be about that belief. This suggests that challenging individuals who are only modestly skeptical about climate change to list numerous critiques of the science, enough that the task is difficult, should undermine their confidence in their position.

Accessibility is also related to the availability heuristic, which asserts that an event that comes to mind easily is typically assumed to be a more common occurrence than one that is hard to remember (Tversky & Kahneman, 1973). In the case of the climate change debate, this could contribute to erroneous perceptions of public opinion: repeated exposure to even a single source that challenges climate change science can lead recipients to feel that the belief is widely held (Aklin & Urpelainen, 2014). This could also work in science communicators’ favor, though. Individuals who repeatedly encounter accurate messaging about climate change will tend to assume (accurately) higher levels of popular support for the idea.

Fluency refers to the ease with which something is perceived and understood. Like accessibility, fluency affects judgments of truth (Schwarz & Clore, 2007), a phenomenon referred to as the illusory truth effect (DiFonzo, Beckstead, Stupak, & Walders, 2016). There is sense to this heuristic: accurate ideas about the natural world tend to be reinforced more often, which makes them easier to understand than inaccurate ideas. But fluency can also be misleading. Individuals tend to be unaware of the range of factors that influence the experience of fluency, from repetition to the style of presentation (Schwarz & Clore, 2007), and these factors can directly shape beliefs despite having no substantive bearing on their accuracy. Fluency can also influence credibility judgments. For example, using simple language to convey an idea promotes the appearance of source competence (Oppenheimer, 2006).

These observations have several implications for correcting false beliefs about climate change. Messages that use clear, simple presentations tend to be persuasive (Myers et al., 2015; van der Linden, Leiserowitz, Feinberg, & Maibach, 2014), and easily understood visual presentations of evidence may also help (Nyhan & Reifler, 2011; van der Linden et al., 2014). Visuals must be handled carefully, though: photographs having no bearing on a belief have been shown to promote misperceptions in some studies (e.g., Garrett, Nisbet, & Lynch, 2013; Nyhan, Reifler, Richey, & Freed, 2014). The illusory truth effect also means that it is important to avoid repeating false information when making corrections. Rather than repeating an inaccurate statement (e.g., “there’s no evidence that warming has stopped”), science communicators should simply state the truth (e.g., “warming continues”). Repeating a false belief while declaring it to be false is unlikely to be successful and will often backfire (Berinsky, 2015; Schwarz & Clore, 2007).

Continued Influence Effects

Another striking aspect of beliefs is that they can survive outright rejection of the evidence from which they were derived. This is commonly known as the continued influence effect (Johnson & Seifert, 1994; Seifert, 2002), though it shares important similarities with work on belief perseverance (Anderson, 1983, 2007). Individuals rapidly integrate novel information into explanatory frameworks that help to make sense of their environment. These causal explanations often persevere even after an individual accepts that the information on which the belief was formed is false. A key reason appears to be the retained belief’s explanatory power. Lacking another, better alternative for making sense of the evidence at hand, individuals hold to the groundless belief. The solution is straightforward. Although individuals are unlikely to give up a plausible explanation without an alternative, an alternative explanation that fills the “causal gap” without relying on the inaccurate information will often be accepted (Seifert, 2002). In the climate change context, this suggests that critiquing the evidence on which climate change doubts are based will be ineffective unless an equally compelling alternative explanation for why the doubts exist is provided.

Although the explanatory power of misinformation appears to be fundamental to its influence on individuals who have consciously rejected it, there are other factors that moderate this effect. First, preexposure warnings that an information environment includes inaccurate claims can reduce the continued influence effect, making individuals more responsive to subsequent retractions (Ecker, Lewandowsky, & Tang, 2010). Second, individuals are less likely to use retracted information in subsequent decision-making when the retraction was presented repeatedly and when their cognitive load at the time the retraction was presented is low (Ecker, Lewandowsky, Swire, & Chang, 2011).

Message Presentation

How climate change messages are framed also influences their reception. Messages that focus on the benefits of responding to climate change are associated with higher perceived severity of climate change impacts than messages that highlight the costs of inaction (Spence & Pidgeon, 2010). A similar effect is observed for messages that describe the impacts of climate change as distant versus near (Spence & Pidgeon, 2010). Story frames, which present information using narrative structures that feature a setting, plot, characters, and moral, are also influential (Jones, 2014). Individuals respond to climate change information presented in a story frame differently than they respond to a list of facts. Those reading narratives are more likely to express positive affect toward stakeholders portrayed as “heroes,” and these heroes’ attitudes shape climate change risk perceptions.9 However, the influence of story exposure on how individuals think about climate change concepts—more specifically, the cognitive clustering of these ideas—depends on the individuals’ cultural orientation. Individuals are only influenced by culturally congruent stories; they respond to stories that are at odds with their cultural values much as they would to a list of facts (Jones & Song, 2014).

We also know that there are a variety of individual-level differences that influence which messages attract the most attention, and this could have implications for beliefs, especially among low-knowledge individuals. For instance, messages presenting numeric information about a contentious science issue garner more attention among high-numeracy individuals, while exemplars that describe the issue from the perspective of an individual tend to hold the attention of high-empathy individuals longer (Knobloch-Westerwick, Johnson, Silver, & Westerwick, 2015). Discrete emotions can also play an important role in shaping climate change perceptions (Smith & Leiserowitz, 2014). For instance, anxiety increases individuals’ willingness to accept novel information regardless of its political implications (or its accuracy), while anger can promote motivated evaluation (Weeks, 2015). There is some evidence that the effect of anxiety on climate change beliefs is limited to individuals on the political left (Nai, Schemeil, & Marie, 2016).

Social Dynamics

Up to this point, we have focused on attributes of the individual, including motivated biases and flawed heuristics. Now we consider the social dimensions of belief. Rumors, which are beliefs lacking a secure standard of evidence and which help to make sense of ambiguous and/or threatening situations (DiFonzo & Bordia, 2007), have long been understood to be fundamentally social, emerging through interactions between individuals and within communities (Shibutani, 1966). These social dynamics are not limited to rumoring. Indeed, the notion that human knowledge is a collective good is the premise of the interdisciplinary field of social epistemology (Goldman & Whitcomb, 2011).

The literature on social epistemology offers a variety of insights. Consider the observation that individuals must rely on social evidence when forming individual beliefs. Experts are, by definition, individuals possessing both a substantial body of knowledge about a target domain (in this case, climate science) and the capacity to deploy this knowledge to form accurate beliefs on new questions in that domain.

Novices lack these competencies and must therefore rely on experts when forming judgments on the topic. An important consequence of novices’ lack of relevant expertise, however, is that they are not in a position to evaluate experts using their own opinion. Instead, novices must rely on a host of other indicators to assess the credibility of experts’ claims (Goldman, 2011). This is a particularly pressing problem when, as is the case with climate change, different “experts” offer contradictory claims. There are five types of cues that novices can use to decide which experts to trust: (1) the arguments competing experts make for their own positions and against the opposition, (2) scientific consensus, (3) appraisals of the experts’ expertise as evidenced, for example, by credentials, (4) indications of experts’ biases, and (5) experts’ past performance (see Goldman, 2011, p. 116). Although these criteria may appear self-evident, relevant cues are not always available to novices. A social expectation that the information used to evaluate expertise be offered consistently, and that its absence signal that claims should be viewed with suspicion, could help novices sort through the complex and contradictory claims made about climate science. The fourth cue—consideration of experts’ biases—highlights another strategy that has been shown to be useful for countering misperceptions. For reasons already discussed, individuals are almost universally suspicious of belief-contrary claims, and source derogation is a common defensive mechanism. Corrections that come from trusted and/or unanticipated sources, such as a Republican politician or a former climate change denier, can be uniquely persuasive (Berinsky, 2015).

Non-experts can also be information resources, and knowledgeable peers can play an important role in countering falsehoods within their social network (Southwell, 2013). The flow and termination of false rumors on Facebook illustrate the power of peers (Friggeri, Adamic, Eckles, & Cheng, 2014). Analysis of a massive collection of observational data has shown that claims debunked by the fact-checking site Snopes are uniquely likely to receive comments linking to that site and that those comments tend to be made quickly, typically within 10 minutes of the false post. Furthermore, the inaccurate initial posts are more likely to be deleted and are less likely to be reshared than accurate posts. This suggests that individuals do police their information environment and actively work to correct errors and stem the flow of misinformation, at least on Facebook. It may therefore be useful to augment knowledgeable individuals’ influence within their network. Messaging designed to boost the confidence of those who hold accurate information (e.g., from the Snopes website) can help by making it more likely that these individuals will share what they know and believe (Southwell, 2013). In contrast to the patterns observed on Facebook, however, political rumors on Twitter in 2012 were uniquely likely to be shared, were rarely challenged, and were in most cases unaffected by fact-checking efforts (Shin, Jian, Driscoll, & Bar, 2016). Thus, it appears that the flow of misinformation and corrections is sensitive to both network and topic.

The topology of the networks that connect us to our peers is also important. Beliefs of those with whom we interact directly are uniquely influential, especially when those individuals belong to our in-group and when our social networks are highly clustered, with relationships arranged to form small “cliques” (DiFonzo et al., 2014). Thus, the flow of accurate information can also be promoted by developing more diverse, less segregated communication networks. This principle can, however, be carried too far. Although connections among individuals with diverse views can reduce belief polarization, messages from strangers are unlikely to be trusted and will often be seen as offensive (Hannak, Margolin, Keegan, & Weber, 2014), producing emotions that can lead false beliefs to become more entrenched.

Another important insight derived from network scholarship is that individuals with exceptionally high numbers of social ties, often labeled “influentials,” do not necessarily have substantially greater sway over their followers’ judgment (Borge-Holthoefer & Moreno, 2012); instead, their influence derives largely from the size of their network (Watts & Dodds, 2007). The implication of this observation may be surprising. Rather than targeting “opinion leaders” in the climate change denial movement, it may be more effective to target the many individuals whose beliefs are held more weakly (for a contrasting view, see Nisbet & Kotcher, 2009). A critical mass of accurate beliefs may ultimately be more influential than a handful of high-profile conversions (see Watts & Dodds, 2007). This does not, however, mean that network position is irrelevant: the more central individuals are in their network, the more widely their accurate views can travel (Budak, Agrawal, & Abbadi, 2011).
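
The role of network position can be made concrete with a small, purely illustrative calculation. In the sketch below (a hypothetical six-person friendship network, written in Python, with the simplifying assumption that each contact merely relays a message onward), a well-connected individual’s accurate claim reaches more of the network within two steps than a peripheral individual’s does, even though neither is assumed to be especially persuasive.

# Illustrative only: a toy diffusion count over a hypothetical friendship network.
NETWORK = {
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
    "D": ["B", "C", "E"], "E": ["D", "F"], "F": ["E"],
}

def reached_within(network, seed, hops):
    """Count how many others a message starting at `seed` reaches within `hops` steps."""
    frontier, seen = {seed}, {seed}
    for _ in range(hops):
        frontier = {friend for person in frontier for friend in network[person]} - seen
        seen |= frontier
    return len(seen) - 1  # exclude the seed

print(reached_within(NETWORK, "D", 2))  # central node D reaches 5 others
print(reached_within(NETWORK, "F", 2))  # peripheral node F reaches only 2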

Information Environment

A theme cutting across all of these sources of misperception concerns the information environment. The more inaccurate information circulating, the harder it is for individuals to reach accurate conclusions.

There are many who are committed to promoting an accurate understanding of climate science, but there are also those who have a significant interest in manufacturing doubt about the science in hopes of preventing, or at least postponing, climate change mitigation strategies (Lewandowsky, Gignac, & Vaughan, 2013; Oreskes & Conway, 2010). Countering false climate change information also requires engagement with information producers.

One important set of strategies involves the news media. When reporting about climate change, news organizations have historically adhered to norms of balanced reporting, presenting the two sides of the climate change debate as holding equally legitimate positions (Boykoff & Boykoff, 2004; Boykoff & Roberts, 2007). Neutral reporting of contradictory factual claims can be harmful in a number of ways. Most directly, it can reduce public understanding of science (Dixon & Clarke, 2013; Malka, Krosnick, Debell, Pasek, & Schneider, 2009). It can also reduce individuals’ confidence in their ability to determine the truth about politically charged issues (Pingree, Brossard, & McLeod, 2014), which has the potential to make them more vulnerable to future misinformation. An obvious counter to this problem is to avoid false balance (Dixon & Clarke, 2013) and to encourage journalists to adjudicate when stakeholders, policymakers, or experts make contradictory claims (Pingree et al., 2014).

Partisan media are another crucially important part of this story (see “The Effects of Network and Cable TV News Viewing on Climate Change Opinion, Knowledge, and Behavior”). Consuming conservative news media reduces belief certainty about climate change, while nonconservative outlet exposure promotes belief certainty, and this process is self-reinforcing (Feldman, Myers, Hmielowski, & Leiserowitz, 2014): individuals tend to rely more on outlets that affirm their beliefs than on those that do not. Partisan media’s effects are partially mediated by trust in scientists. Conservative media decreases trust in scientists, while nonconservative media increases it (Hmielowski, Feldman, Myers, Leiserowitz, & Maibach, 2014). This is likely a consequence of differences in how outlets’ political slant influences their coverage of climate change news (Feldman, Hart, & Milosevic, 2015) and the opinions expressed about it (Feldman, 2011).

Political elites are a third potentially important source of misperceptions about climate change. Citizens often align their beliefs with the views expressed by high-profile politicians (Darmofal, 2005; Watts, Domke, Shah, & Fan, 1999), and climate change beliefs are no exception (Brulle, Carmichael, & Jenkins, 2012). Fortunately, there is also evidence that political elites adjust their behavior in response to fact checking and the threat of being caught endorsing misinformation (Nyhan & Reifler, 2015). Thus, it may be possible to use science-based fact checking of political elites as a deterrent to the circulation of climate change misinformation. A more extreme option is to attempt to apply a legal penalty to those who knowingly share or promote inaccurate claims (Nyhan, 2010), but the bar for proving intentional deceit must be high lest it have a chilling effect on free speech (Sunstein, 2009).

Finally, non-elites are sometimes responsible for the dissemination of misinformation. Given that individuals’ beliefs are often informed by the beliefs of their peers, some may choose to misrepresent their beliefs out of self-interest. Statements of belief about contested political facts may not always be honest disclosures of judgment; instead, they may be strategic communication intended to advance an image of an allied group (Bullock, Gerber, Hill, & Huber, 2015; Prior, Sood, & Khanna, 2015). In other words, people may not always believe what they say, instead giving answers that are flattering to their in-group regardless of the evidence. Experiments have shown that even modest accuracy incentives can significantly reduce partisan differences in stated beliefs (Prior et al., 2015), due in part to an increase in individuals’ willingness to admit ignorance on the topic (Bullock et al., 2015). A critical limitation of these studies, however, is the presumption that accuracy incentives induce more honest belief disclosures rather than introducing a social desirability bias or encouraging dishonest responses calculated to elicit the reward.

Using New Technology to Confront Misinformation

A number of recent efforts have been made to use information and communication technologies to reduce the flow of false information, including misrepresentations of science. On the largest scale, Google’s Knowledge Vault seeks to extract and validate information found on the web in order to produce a vast repository of machine-readable facts (Dong et al., 2014; Luna Dong et al., 2015). The resource has been used to compute website-trust scores—sites do better the more accurate, and less inaccurate, information they contain. This information could eventually become an integral part of the ranking algorithm used by the company’s search service (Hodson, 2015; Vydiswaran, Zhai, & Roth, 2011), which could have significant influence on the information that individuals encounter given the company’s vast reach.
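
To illustrate the general logic of such content-based trust scoring, consider the following minimal sketch in Python. It is not the Knowledge Vault algorithm, which relies on probabilistic knowledge fusion over web-scale extractions; the fact repository and the extracted claims shown here are hypothetical, and a site is simply scored by the share of its checkable claims that agree with a repository of validated facts.

# Minimal sketch of content-based trust scoring (illustrative only; not the
# Knowledge Vault algorithm). The repository and claims are hypothetical.
FACT_REPOSITORY = {
    ("global surface temperature", "trend since 1950"): "warming",
    ("anthropogenic CO2", "contributes to warming"): "yes",
}

def trust_score(site_claims):
    """Return the fraction of a site's checkable claims that match the repository.

    site_claims: list of ((subject, attribute), value) pairs extracted from a site.
    Claims about facts absent from the repository are ignored rather than penalized.
    """
    checkable = [claim for claim in site_claims if claim[0] in FACT_REPOSITORY]
    if not checkable:
        return None  # no basis for a score
    correct = sum(1 for key, value in checkable if FACT_REPOSITORY[key] == value)
    return correct / len(checkable)

# Example: a site asserting one accurate and one inaccurate claim scores 0.5.
claims = [
    (("global surface temperature", "trend since 1950"), "warming"),
    (("anthropogenic CO2", "contributes to warming"), "no"),
]
print(trust_score(claims))  # 0.5

A real system would also weight claims by extraction confidence and by the reliability of the sources corroborating each fact, but the basic idea is the same: sites earn higher scores as the proportion of accurate, verifiable claims they publish increases.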

On the subject of climate science specifically, the developers of a web-annotation tool called Hypothes.is have partnered with scientists to create the Climate Feedback Project. The annotation service allows users to create and/or view a layer of commentary laid atop existing web content, including news stories. The objective of the project is to promote public understanding of climate science by having multiple experts “peer review” real-world climate change news, providing readers with detailed assessments of the coverage as well as an overall credibility rating (Climate Feedback, 2016). Adding a layer of fact checking to existing web content is not new (see, e.g., Ennals, Byler, Agosta, & Rosario, 2010; Ennals, Trushkowsky, & Agosta, 2010), but the involvement of domain experts in the evaluation process is novel. The effectiveness of the system for promoting accurate beliefs is not yet known; however, tests of the approach in other topic domains raise concern. Corrections embedded in an inaccurate message about electronic health records were only effective among those predisposed to believe them. Among those for whom the false information was attitude affirming, embedded corrections proved less effective than corrections presented at a later time (Garrett & Weeks, 2013).

There may be other strategies that are more effective for correcting misinformation at the point of exposure. For example, a pair of experiments found that when a Facebook post inaccurately characterizing GMO foods was followed by corrections labeled as “related” by Facebook, recipients’ beliefs became more accurate (Bode & Vraga, 2015). However, on the more contentious vaccines-cause-autism misperception, there was no effect. Another approach that has shown some promise is to create tools that facilitate learning about controversial issues by presenting diverse perspectives and providing explicit credibility indicators (Vydiswaran, Zhai, Roth, & Pirolli, 2012). Finally, the MIT Climate CoLab strives to create a community of knowledgeable and committed citizen scientists who contribute to the fight against climate change, and who, perhaps more importantly, can help disseminate climate science knowledge in their own communities.

Conclusion

False beliefs about climate change are rampant, especially in the United States. This article has considered four broad sources that contribute to public misperceptions about climate change: motivated reasoning, information processing biases unrelated to identity, social dynamics, and the information environment. Each area is associated with a host of potential strategies for countering false beliefs. Efforts that focus exclusively on a particular corrective strategy or a specific source of misperceptions are bound to fail. There is no single “solution” to climate change misperceptions; instead, we must use the tools available to us to keep human biases in check.

It is also important to recognize that we will never achieve universal acceptance of climate science. Arriving at consensus about contested facts is notoriously difficult, even when the science is clear. The evidence that the earth revolves around the sun is indisputable, yet one in four Americans does not accept this fact (National Science Board, 2014). Politically important truths are even more challenging and are rarely straightforward or self-evident (Kuklinski & Quirk, 2001). Even individuals making a concerted effort to weigh evidence fairly and carefully can still reach different conclusions about what is true (Kahneman, 2011; Shibutani, 1966). The complexity of the climate science debate perfectly illustrates these challenges.

The vast majority of people on the planet have very limited access to the evidence on which climate change predictions are based; they lack the expertise to assess the evidence available and the knowledge required to form a coherent scientific understanding of the relevant phenomena; there are complex motivations shaping stakeholders’ claims; and the social costs of expressing beliefs that do not conform with group norms are high. In the face of these challenges, the strategies outlined here and elsewhere in this volume can help us to chart a way forward.

Suggested Readings

Cohen, G. L., & Sherman, D. K. (2014). The psychology of change: Self-affirmation and social psychological intervention. Annual Review of Psychology, 65(1), 333–371.

Cook, J., Ecker, U., & Lewandowsky, S. (2015). Misinformation and how to correct it. In R. Scott & S. Kosslyn (Eds.), Emerging trends in the social and behavioral sciences: An interdisciplinary, searchable, and linkable resource (pp. 1–17). Hoboken, NJ: John Wiley & Sons, Inc.

Goldman, A. I., & Whitcomb, D. (2011). Social epistemology: Essential readings. Oxford: Oxford University Press.

Kahan, D. M. (2015). Climate science communication and the measurement problem. Advances in Political Psychology, 36, 1–43.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.

Lodge, M., & Taber, C. S. (2013). The rationalizing voter. New York: Cambridge University Press.

Redlawsk, D. P., Civettini, A. J. W., & Emmerson, K. M. (2010). The affective tipping point: Do motivated reasoners ever “get it”? Political Psychology, 31(4), 563–593.

Schwarz, N., Sanna, L. J., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 39, pp. 127–161). New York: Academic Press.

Seifert, C. M. (2002). The continued influence of misinformation in memory: What makes a correction effective? In B. H. Ross (Ed.), Psychology of learning and motivation (Vol. 41, pp. 265–292). New York: Academic Press.

van der Linden, S. L., Leiserowitz, A. A., Feinberg, G. D., & Maibach, E. W. (2015). The scientific consensus on climate change as a gateway belief: Experimental evidence. PLoS ONE, 10(2), e0118489.

References

Aklin, M., & Urpelainen, J. (2014). Perceptions of scientific dissent undermine public support for environmental policy. Environmental Science & Policy, 38, 173–177.Find this resource:

Anderegg, W. R. L., Prall, J. W., Harold, J., & Schneider, S. H. (2010). Expert credibility in climate change. Proceedings of the National Academy of Sciences, 107(27), 12107–12109.Find this resource:

Anderson, C. A. (1983). Abstract and concrete data in the perseverance of social theories: When weak data lead to unshakeable beliefs. Journal of Experimental Social Psychology, 19(2), 93–108.Find this resource:

Anderson, C. A. (2007). Belief perseverance. In R. Baumeister & K. D. Vohs (Eds.), Encyclopedia of social psychology (pp. 109–110). Thousand Oaks, CA: SAGE.Find this resource:

Bain, P. G., Hornsey, M. J., Bongiorno, R., & Jeffries, C. (2012). Promoting pro-environmental action in climate change deniers. Nature Climate Change, 2(8), 600–603.Find this resource:

Benestad, R. E., Nuccitelli, D., Lewandowsky, S., Hayhoe, K., Hygen, H. O., Dorland, R., et al. (2015). Learning from mistakes in climate research. Theoretical and Applied Climatology, 126(3), 699–703.Find this resource:

Berinsky, A. J. (2015). Rumors and health care reform: Experiments in political misinformation. British Journal of Political Science, First View, (Suppl.), 1–22.Find this resource:

Bliuc, A.-M., McGarty, C., Thomas, E. F., Lala, G., Berndsen, M., & Misajon, R. (2015). Public division about climate change rooted in conflicting socio-political identities. [Letter].Nature Climate Change, 5(3), 226–229.Find this resource:

Bode, L., & Vraga, E. K. (2015). In related news, that was wrong: The correction of misinformation through related stories functionality in social media. Journal of Communication, 65(4), 619–638.Find this resource:

Borge-Holthoefer, J., & Moreno, Y. (2012). Absence of influential spreaders in rumor dynamics. Physical Review E, 85(2), 026116.Find this resource:

Boykoff, M. T., & Boykoff, J. M. (2004). Balance as bias: global warming and the US prestige press. Global Environmental Change, 14(2), 125–136.Find this resource:

Boykoff, M. T., & Roberts, J. T. (2007). Media coverage of climate change: Current trends, strengths, weaknesses. Human Development Report. United Nations Development Program.Find this resource:

Brehm, S. S., & Brehm, J. W. (1981). Psychological reactance: A theory of freedom and control. New York: Academic Press.Find this resource:

Brulle, R., Carmichael, J., & Jenkins, J. C. (2012). Shifting public opinion on climate change: An empirical assessment of factors influencing concern over climate change in the U.S., 2002–2010. Climatic Change, 114(2), 169–188.Find this resource:

Budak, C., Agrawal, D., & Abbadi, A. E. (2011). Limiting the spread of misinformation in social networks. Proceedings of the 20th International Conference on World Wide Web, Hyderabad, India.Find this resource:

Bullock, J. G., Gerber, A. S., Hill, S. J., & Huber, G. A. (2015). Partisan bias in factual beliefs about politics. Quarterly Journal of Political Science, 10(4), 519–578.Find this resource:

Byrne, S., & Hart, P. S. (2009). The “boomerang” effect: A synthesis of findings and a preliminary theoretical framework. Communication Yearbook, 33(1), 3–37.Find this resource:

Climate Feedback. (2016). Scientific reference to reliable information on climate change—climate feedback. Retrieved from http://climatefeedback.org/.

Cohen, G. L., Aronson, J., & Steele, C. M. (2000). When beliefs yield to evidence: Reducing biased evaluation by affirming the self. Personality and Social Psychology Bulletin, 26(9), 1151–1164.

Cohen, G. L., & Sherman, D. K. (2014). The psychology of change: Self-affirmation and social psychological intervention. Annual Review of Psychology, 65(1), 333–371.

Cook, J. (2016). Countering climate science denial and communicating scientific consensus. In H. Von Storch (Ed.), Oxford research encyclopedia: Climate science. Oxford: Oxford University Press.

Cook, J., Ecker, U., & Lewandowsky, S. (2015). Misinformation and how to correct it. In R. Scott & S. Kosslyn (Eds.), Emerging trends in the social and behavioral sciences: An interdisciplinary, searchable, and linkable resource (pp. 1–17). Hoboken, NJ: John Wiley & Sons, Inc.

Cook, J., Nuccitelli, D., Green, S. A., Richardson, M., Winkler, B., Painting, R., et al. (2013). Quantifying the consensus on anthropogenic global warming in the scientific literature. Environmental Research Letters, 8(2), 024024.

Damasio, A. (2005). Descartes’ error: Emotion, reason, and the human brain. New York: Penguin.

Darmofal, D. (2005). Elite cues and citizen disagreement with expert opinion. Political Research Quarterly, 58(3), 381–395.

Delli Carpini, M. X., & Keeter, S. (1996). What Americans know about politics and why it matters. New Haven, CT: Yale University Press.

DiFonzo, N., Beckstead, J. W., Stupak, N., & Walders, K. (2016). Validity judgments of rumors heard multiple times: The shape of the truth effect. Social Influence, 11(1), 22–39.

DiFonzo, N., & Bordia, P. (2007). Rumor psychology: Social and organizational approaches. Washington, DC: American Psychological Association.

DiFonzo, N., Suls, J., Beckstead, J. W., Bourgeois, M. J., Homan, C. M., Brougher, S., et al. (2014). Network structure moderates intergroup differentiation of stereotyped rumors. Social Cognition, 32(5), 409–448.

Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism. Journal of Personality and Social Psychology, 63(4), 568–584.

Dixon, G. N., & Clarke, C. E. (2013). Heightening uncertainty around certain science: Media coverage, false balance, and the autism-vaccine controversy. Science Communication, 35(3), 358–382.

Dong, X., Gabrilovich, E., Heitz, G., Horn, W., Lao, N., Murphy, K., et al. (2014). Knowledge vault: A web-scale approach to probabilistic knowledge fusion. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York.

Doran, P. T., & Zimmerman, M. K. (2009). Examining the scientific consensus on climate change. Eos, Transactions American Geophysical Union, 90(3), 22–23.

Ecker, U. K. H., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18(3), 570–578.

Ecker, U. K. H., Lewandowsky, S., & Tang, D. (2010). Explicit warnings reduce but do not eliminate the continued influence of misinformation. Memory & Cognition, 38(8), 1087–1100.

Ecker, U. K. H., Lewandowsky, S., Fenton, O., & Martin, K. (2014). Do people keep believing because they want to? Preexisting attitudes and the continued influence of misinformation. Memory & Cognition, 42(2), 292–304.

Ennals, R., Byler, D., Agosta, J. M., & Rosario, B. (2010). What is disputed on the web? Proceedings of the 4th Workshop on Information Credibility, Raleigh, North Carolina.

Ennals, R., Trushkowsky, B., & Agosta, J. M. (2010). Highlighting disputed claims on the web. Proceedings of the 19th International Conference on World Wide Web, Raleigh, North Carolina.

Feinberg, M., & Willer, R. (2013). The moral roots of environmental attitudes. Psychological Science, 24(1), 56–62.

Feldman, L. (2011). The opinion factor: The effects of opinionated news on information processing and attitude change. Political Communication, 28(2), 163–181.

Feldman, L., Hart, P. S., & Milosevic, T. (2015). Polarizing news? Representations of threat and efficacy in leading US newspapers’ coverage of climate change. Public Understanding of Science.

Feldman, L., Myers, T. A., Hmielowski, J. D., & Leiserowitz, A. (2014). The mutual reinforcement of media selectivity and effects: Testing the reinforcing spirals framework in the context of global warming. Journal of Communication, 64(4), 590–611.

Feygina, I., Jost, J. T., & Goldsmith, R. E. (2010). System justification, the denial of global warming, and the possibility of “system-sanctioned change”. Personality and Social Psychology Bulletin, 36(3), 326–338.

Friggeri, A., Adamic, L., Eckles, D., & Cheng, J. (2014). Rumor cascades. Paper presented at the Eighth International AAAI Conference on Weblogs and Social Media, Ann Arbor, MI. Retrieved from http://www.aaai.org/ocs/index.php/ICWSM/ICWSM14/paper/view/8122.

Funk, C., & Rainie, L. (2015). Americans, politics and science issues. Washington, DC: Pew Research Center. Retrieved from http://www.pewinternet.org/2015/07/01/americans-politics-and-science-issues/.

Gaines, B. J., Kuklinski, J. H., Quirk, P. J., Peyton, B., & Verkuilen, J. (2007). Same facts, different interpretations: Partisan motivation and opinion on Iraq. Journal of Politics, 69(4), 957–974.

Garrett, R. K., Nisbet, E. C., & Lynch, E. K. (2013). Undermining the corrective effects of media-based political fact checking? The role of contextual cues and naïve theory. Journal of Communication, 63(4), 617–637.

Garrett, R. K., Weeks, B. E., & Neo, R. L. (2016). Driving a wedge between evidence and beliefs: How online ideological news exposure promotes political misperceptions. Journal of Computer-Mediated Communication, 21(5), 331–348.

Garrett, R. K., & Weeks, B. E. (2013, February 23–27). The promise and peril of real-time corrections to political misperceptions. Proceedings of the ACM 2013 Conference on Computer Supported Cooperative Work (CSCW 2013), San Antonio, TX.

Garrett, R. K., & Stroud, N. J. (2014). Partisan paths to exposure diversity: Differences in pro- and counterattitudinal news consumption. Journal of Communication, 64(4), 680–701.

Goldman, A. I. (2011). Experts: Which ones should you trust? In A. I. Goldman & D. Whitcomb (Eds.), Social epistemology: Essential readings (pp. 109–133). Oxford: Oxford University Press.

Goldman, A. I., & Whitcomb, D. (2011). Social epistemology: Essential readings. Oxford: Oxford University Press.

Graham, J., Haidt, J., & Nosek, B. A. (2009). Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology, 96(5), 1029–1046.

Haidt, J., & Joseph, C. (2004). Intuitive ethics: How innately prepared intuitions generate culturally variable virtues. Daedalus, 133(4), 55–66.

Hannak, A., Margolin, D., Keegan, B., & Weber, I. (2014). Get back! You don’t know me like that: The social mediation of fact checking interventions in Twitter conversations. Paper presented at the Eighth International AAAI Conference on Weblogs and Social Media, Ann Arbor, MI.

Harris, P. R., & Napper, L. (2005). Self-affirmation and the biased processing of threatening health-risk information. Personality and Social Psychology Bulletin, 31(9), 1250–1263.

Hibbing, J. R., Smith, K. B., & Alford, J. R. (2013). Predisposed: Liberals, conservatives, and the biology of political differences. New York: Taylor & Francis.

Hibbing, J. R., Smith, K. B., & Alford, J. R. (2014). Differences in negativity bias underlie variations in political ideology. Behavioral and Brain Sciences, 37(3), 297–307.

Hmielowski, J. D., Feldman, L., Myers, T. A., Leiserowitz, A., & Maibach, E. (2014). An attack on science? Media use, trust in scientists, and perceptions of global warming. Public Understanding of Science, 23(7), 866–883.

Hodson, H. (2015, February 25). Google wants to rank websites based on facts not links. New Scientist. Retrieved from http://www.popsci.com/google-researchers-want-judge-websites-accuracy-not-popularity.

Iyengar, S., Hahn, K. S., Krosnick, J. A., & Walker, J. (2008). Selective exposure to campaign communication: The role of anticipated agreement and issue public membership. The Journal of Politics, 70(1), 186–200.

Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20(6), 1420–1436.

Jerit, J., & Barabas, J. (2012). Partisan perceptual bias and the information environment. The Journal of Politics, 74(3), 672–684.

Jones, M. D. (2014). Cultural characters and climate change: How heroes shape our perception of climate science. Social Science Quarterly, 95(1), 1–39.

Jones, M. D., & Song, G. (2014). Making sense of climate change: How story frames shape cognition. Political Psychology, 35(4), 447–476.

Joslyn, M. R., & Haider-Markel, D. P. (2014). Who knows best? Education, partisanship, and contested facts. Politics & Policy, 42(6), 919–947.

Jost, J. T., & Amodio, D. M. (2012). Political ideology as motivated social cognition: Behavioral and neuroscientific evidence. Motivation and Emotion, 36(1), 55–64.

Kahan, D. M. (2010). Fixing the communications failure. Nature, 463(7279), 296–297.

Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection: An experimental study. Judgment and Decision Making, 8, 407–424.

Kahan, D. M. (2015). Climate science communication and the measurement problem. Advances in Political Psychology, 36, 1–43.

Kahan, D. M., Jenkins-Smith, H., & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk Research, 14(2), 147–174.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

Knobloch-Westerwick, S., Johnson, B. K., Silver, N. A., & Westerwick, A. (2015). Science exemplars in the eye of the beholder: How exposure to online science information affects attitudes. Science Communication, 37(5), 575–601.

Koteyko, N., Jaspal, R., & Nerlich, B. (2013). Climate change and “climategate” in online reader comments: A mixed methods study. The Geographical Journal, 179(1), 74–86.

Kuklinski, J. H., & Quirk, P. J. (2000). Reconsidering the rational public: Cognition, heuristics, and mass opinion. In A. Lupia, M. D. McCubbins, & S. L. Popkin (Eds.), Elements of reason: Cognition, choice, and the bounds of rationality (pp. 153–182). Cambridge, U.K.: Cambridge University Press.

Kuklinski, J. H., & Quirk, P. J. (2001). Conceptual foundations of citizen competence. Political Behavior, 23(3), 285–311.

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.

Leiserowitz, A., Maibach, E., Roser-Renouf, C., Feinberg, G., Rosenthal, S., & Marlon, J. (2014). Climate change in the American mind: Americans’ global warming beliefs and attitudes in November 2013. New Haven, CT: Yale Project on Climate Change Communication.

Leiserowitz, A. A., Maibach, E. W., Roser-Renouf, C., Smith, N., & Dawson, E. (2013). Climategate, public opinion, and the loss of trust. American Behavioral Scientist, 57(6), 818–837.

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.

Lewandowsky, S., Gignac, G. E., & Oberauer, K. (2013). The role of conspiracist ideation and worldviews in predicting rejection of science. PLoS ONE, 8(10), e75637.

Lewandowsky, S., Gignac, G. E., & Vaughan, S. (2013). The pivotal role of perceived scientific consensus in acceptance of science. Nature Climate Change, 3(4), 399–404.

Lewandowsky, S., Oberauer, K., & Gignac, G. E. (2013). NASA faked the moon landing—therefore, (climate) science is a hoax: An anatomy of the motivated rejection of science. Psychological Science, 24(5), 622–633.

Liu, B. S., & Ditto, P. H. (2013). What dilemma? Moral evaluation shapes factual belief. Social Psychological and Personality Science, 4(3), 316–323.

Lodge, M., & Taber, C. S. (2013). The rationalizing voter. New York: Cambridge University Press.

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.

Luna Dong, X., Gabrilovich, E., Murphy, K., Dang, V., Horn, W., Lugaresi, C., et al. (2015). Knowledge-based trust: Estimating the trustworthiness of web sources. ArXiv e-prints. Retrieved from http://arxiv.org/abs/1502.03519.

Malka, A., Krosnick, J. A., Debell, M., Pasek, J., & Schneider, D. (2009). Featuring skeptics in news media stories about global warming reduces public beliefs in the seriousness of global warming. Retrieved from https://woods.stanford.edu/sites/default/files/files/Global-Warming-Skeptics-Technical-Detail.pdf.

Miller, J. M., Saunders, K. L., & Farhart, C. E. (2015). Conspiracy endorsement as motivated reasoning: The moderating roles of political knowledge and trust. American Journal of Political Science, 60(4), 824–844.

Möllering, G. (2009). Leaps and lapses of faith: Exploring the relationship between trust and deception. In B. Harrington (Ed.), Deception: From ancient empires to Internet dating (pp. 137–153). Stanford, CA: Stanford University Press.

Mooney, C. (2012). The Republican brain: The science of why they deny science—and reality. Hoboken, NJ: Wiley.

Munro, G. D., Ditto, P. H., Lockhart, L. K., Fagerlin, A., Gready, M., & Peterson, E. (2002). Biased assimilation of sociopolitical arguments: Evaluating the 1996 U.S. presidential debate. Basic and Applied Social Psychology, 24(1), 15–26.

Myers, T. A., Maibach, E., Peters, E., & Leiserowitz, A. (2015). Simple messages help set the record straight about scientific agreement on human-caused climate change: The results of two experiments. PLoS ONE, 10(3), e0120985.

Nai, A., Schemeil, Y., & Marie, J.-L. (2016). Anxiety, sophistication, and resistance to persuasion: Evidence from a quasi-experimental survey on global climate change. Political Psychology, 38(1), 137–156.

Nam, H. H., Jost, J. T., & Van Bavel, J. J. (2013). “Not for all the tea in China!” Political ideology and the avoidance of dissonance-arousing situations. PLoS ONE, 8(4), e59837.

National Science Board. (2014, February). Science and engineering indicators 2014 (NSB 14-01). Arlington, VA: National Science Foundation.

Nisbet, E. C., Cooper, K. E., & Garrett, R. K. (2015). The partisan brain: How dissonant science messages lead conservatives and liberals to (dis)trust science. ANNALS of the American Academy of Political and Social Science, 658(1), 36–66.

Nisbet, M. C. (2014). Engaging in science policy controversies: Insights from the US climate change debate. In M. Bucchi & B. Trench (Eds.), Routledge handbook of public communication of science and technology (pp. 173–185). New York: Routledge.

Nisbet, M. C., & Kotcher, J. E. (2009). A two-step flow of influence?: Opinion-leader campaigns on climate change. Science Communication, 30(3), 328–354.

Nyhan, B. (2010). Why the “death panel” myth wouldn’t die: Misinformation in the health care reform debate. The Forum, 8(1), Article 5.

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.

Nyhan, B., & Reifler, J. (2011, September). Opening the political mind?: The effects of self-affirmation and graphical information on factual misperceptions. Retrieved from http://www.dartmouth.edu/~nyhan/opening-political-mind.pdf.

Nyhan, B., & Reifler, J. (2012). Misinformation and fact-checking: Research findings from social science. Washington, DC: New America Foundation.

Nyhan, B., & Reifler, J. (2015). The effect of fact-checking on elites: A field experiment on U.S. state legislators. American Journal of Political Science, 59(3), 628–640.

Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4), e835–e842.

Nyhan, B., Reifler, J., & Ubel, P. A. (2013). The hazards of correcting myths about health care reform. Medical Care, 51(2), 127–132.

Oppenheimer, D. M. (2006). Consequences of erudite vernacular utilized irrespective of necessity: Problems with using long words needlessly. Applied Cognitive Psychology, 20(2), 139–156.

Oreskes, N., & Conway, E. M. (2010). Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming. New York: Bloomsbury.

Pasek, J., Stark, T. H., Krosnick, J. A., & Tompson, T. (2015). What motivates a conspiracy theory? Birther beliefs, partisanship, liberal-conservative ideology, and anti-black attitudes. Electoral Studies, 40, 482–489.

Pearson, A. R., & Schuldt, J. P. (2015). Bridging climate communication divides: Beyond the partisan gap. Science Communication, 37(6), 805–812.

Pingree, R. J., Brossard, D., & McLeod, D. M. (2014). Effects of journalistic adjudication on factual beliefs, news evaluations, information seeking, and epistemic political efficacy. Mass Communication and Society, 17(5), 615–638.

Prasad, M., Perrin, A. J., Bezila, K., Hoffman, S. G., Kindleberger, K., Manturuk, K., et al. (2009). “There must be a reason”: Osama, Saddam, and inferred justification. Sociological Inquiry, 79(2), 142–162.

Prior, M. (2007). Post-broadcast democracy: How media choice increases inequality in political involvement and polarizes elections. New York: Cambridge University Press.

Prior, M., Sood, G., & Khanna, K. (2015). You cannot be serious: The impact of accuracy incentives on partisan bias in reports of economic perceptions. Quarterly Journal of Political Science, 10(4), 489–518.

Rains, S. A. (2013). The nature of psychological reactance revisited: A meta-analytic review. Human Communication Research, 39(1), 47–73.

Redlawsk, D. P., Civettini, A. J. W., & Emmerson, K. M. (2010). The affective tipping point: Do motivated reasoners ever “get it”? Political Psychology, 31(4), 563–593.

Schwarz, N. (2012). Feelings-as-information theory. In P. A. M. V. Lange, A. W. Kruglanski, & E. T. Higgins (Eds.), Handbook of theories of social psychology (Vol. 1, pp. 289–308). Thousand Oaks, CA: SAGE.

Schwarz, N., & Clore, G. L. (2007). Feelings and phenomenal experiences. In A. W. Kruglanski & E. T. Higgins (Eds.), Social psychology: Handbook of basic principles (pp. 385–407). New York: Guilford Press.

Schwarz, N., Sanna, L. J., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 39, pp. 127–161). New York: Academic Press.

Seifert, C. M. (2002). The continued influence of misinformation in memory: What makes a correction effective? In B. H. Ross (Ed.), Psychology of learning and motivation (Vol. 41, pp. 265–292). New York: Academic Press.

Sherman, D. A. K., Nelson, L. D., & Steele, C. M. (2000). Do messages about health risks threaten the self? Increasing the acceptance of threatening health messages via self-affirmation. Personality and Social Psychology Bulletin, 26(9), 1046–1058.

Shibutani, T. (1966). Improvised news: A sociological study of rumor. Indianapolis: Bobbs-Merrill.

Shin, J., Jian, L., Driscoll, K., & Bar, F. (2016). Political rumoring on Twitter during the 2012 US presidential election: Rumor diffusion and correction. New Media & Society.

Shook, N. J., & Fazio, R. H. (2009). Political ideology, exploration of novel stimuli, and attitude formation. Journal of Experimental Social Psychology, 45(4), 995–998.

Silverman, C. (2015). Lies, damn lies, and viral content: How news websites spread (and debunk) online rumors, unverified claims, and misinformation. Tow Center for Digital Journalism. Retrieved from http://towcenter.org/research/lies-damn-lies-and-viral-content/.

Smith, N., & Leiserowitz, A. (2014). The role of emotion in global warming policy support and opposition. Risk Analysis, 34(5), 937–948.

Southwell, B. G. (2013). Social networks and popular understanding of science and health: Sharing disparities. Baltimore: Johns Hopkins University Press.

Spence, A., & Pidgeon, N. (2010). Framing and communicating climate change: The effects of distance and outcome frame manipulations. Global Environmental Change, 20(4), 656–667.

Steele, C. M. (1988). The psychology of self-affirmation: Sustaining the integrity of the self. Advances in Experimental Social Psychology, 21(2), 261–302.

Sunstein, C. R. (2009). On rumors: How falsehoods spread, why we believe them, what can be done. New York: Farrar, Straus and Giroux.

Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy, 17(2), 202–227.

Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755–769.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232.

van der Linden, S. L., Leiserowitz, A. A., Feinberg, G. D., & Maibach, E. W. (2014). How to communicate the scientific consensus on climate change: Plain facts, pie charts or metaphors? Climatic Change, 126(1), 255–262.

van der Linden, S. L., Leiserowitz, A. A., Feinberg, G. D., & Maibach, E. W. (2015). The scientific consensus on climate change as a gateway belief: Experimental evidence. PLoS ONE, 10(2), e0118489.

Vydiswaran, V. G. V., Zhai, C., & Roth, D. (2011). Content-driven trust propagation framework. Proceedings of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Diego, California.

Vydiswaran, V. G. V., Zhai, C., Roth, D., & Pirolli, P. (2012). BiasTrust: Teaching biased users about controversial topics. Proceedings of the 21st ACM International Conference on Information and Knowledge Management, Maui, Hawaii.

Wänke, M. (2012). Almost everything you always wanted to know about ease of retrieval effects. In C. Unkelbach & R. Greifeneder (Eds.), The experience of thinking: How feelings from mental processes influence cognition and behavior (pp. 151–170). Hove, U.K.: Psychology Press.

Watts, D. J., & Dodds, P. S. (2007). Influentials, networks, and public opinion formation. Journal of Consumer Research, 34(4), 441–458.

Watts, M. D., Domke, D., Shah, D. V., & Fan, D. P. (1999). Elite cues and media bias in presidential campaigns: Explaining public perceptions of a liberal press. Communication Research, 26(2), 144–175.

Weeks, B. E. (2015). Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. Journal of Communication, 65(4), 699–719.

Whitson, J. A., & Galinsky, A. D. (2008). Lacking control increases illusory pattern perception. Science, 322(5898), 115–117.

Notes:

(1.) Interested readers should also see the entry in this volume on “Countering Climate Science Denial and Communicating Scientific Consensus” and may want to consult the numerous recent reviews summarizing scholarship on correcting misperceptions more generally (e.g., Cook, Ecker, & Lewandowsky, 2015; Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012; Nyhan & Reifler, 2012; Silverman, 2015). Note that this article only considers research with direct bearing on beliefs; it does not include research on policy support or other climate change attitudes.

(2.) Backfire effects can also result from cognitive biases that are not motivated by a desire to arrive at a particular conclusion. These are discussed in the section “Non-Motivated Information-Processing Biases.”

(3.) Although these instantaneous and automatic heuristics are imperfect, it is worth noting that these shortcuts are often highly functional. Any belief not based on direct observation is, at some level, a leap of faith (Möllering, 2009); it relies on trust and the necessity of forming conclusions in the face of incomplete information. Among individuals who are unable to do this, who attempt to investigate each claim to its source, decision-making becomes an infinite regress toward an unobtainable end (Damasio, 2005). Although shortcuts are irrational, effective judgment may be impossible without them. Writing of automatic and effortless “System 1” cognition, Kahneman observes that it “is indeed the origin of much that we do wrong, but it is also the origin of most of what we do right—which is most of what we do” (2011, p. 416).

(4.) The tendency for beliefs to polarize among those most involved in the issue has also been observed in a host of other politically contentious issues (e.g., Jerit & Barabas, 2012; Lodge & Taber, 2013).

(5.) Increasing belief accuracy among Republicans was not associated with exposure to accurate information; it was solely the product of affirmation.

(6.) According to Moral Foundations Theory, purity/sanctity is a foundational principle on which moral systems develop, and its importance is evident across a diverse range of societies (Haidt & Joseph, 2004).

(7.) Although treated as a corrective strategy here, Kahan (2015) persuasively argues that the polarization of science knowledge evident in polling data is more accurately conceived of as a measurement problem. He asserts that the questions commonly used to assess knowledge about climate change in surveys do not elicit answers that correspond to what individuals know about the topic. Instead, the answers are an identity expression: conservatives deny established science as a means of asserting their political identity, regardless of their exposure to and recall of relevant science.

(8.) This approach will be less effective when scientific agreement is lower (e.g., 60% or 80%), but this is not an issue in the context of climate change (Aklin & Urpelainen, 2014).

(9.) The indirect path implied by this relationship is not tested.