
Methods for Assessing Journalistic Decisions, Advocacy Strategies, and Climate Change Communication Practices

Summary and Keywords

Research in the field of journalistic decisions, advocacy strategies, and communication practices is very heterogeneous, comprising diverse groups of actors and research questions. Not surprisingly, various methods have been applied to assess actors’ motives, strategies, intentions, and communication behaviors. This article provides an overview of the most common methods applied—i.e., qualitative and quantitative approaches to textual analyses, interviewing techniques, observational and experimental research. After discussing the major strengths and weaknesses of each method, an outlook on future research is given. One challenge of the future study of climate change communication will be to account for its dynamics, with various actors reacting to one another in their public communication. To better approximate such dynamics in the future, more longitudinal research will be needed.

Keywords: advocates, stakeholders, climate scientists, strategic communication, qualitative content analysis, quantitative content analysis, quantitative surveys, qualitative interviews, sampling techniques, research methods

Contours of the Research Field

The actors publicly communicating about climate change engage in a highly politicized debate. Whether actors intend this or not, claims about climate change have a political dimension. The dynamics of the public debate about climate change and related policies can be outlined in the following assumptions, each raising particular research problems (Figure 1).

  1. The public debate about climate change and related policies involves actors with various interests from different social domains—not only scientists, but actors from, for example, politics, the economy, civil society, organized interests, the church, corporations, the intelligentsia, and others. Authors (e.g., Matthews, 2015) have investigated who participates in the public debate and whose arguments are heard in the media. Others have studied actors’ goals and motives to engage in public communication (Ivanova, Schäfer, Schlichting, & Schmidt, 2013; Post, 2016; Tøsse, 2013), their attitudes toward media coverage (Bell, 1994; Ivanova et al., 2013; Post, 2008; Sharman & Howarth, 2016), or their lines of argument (e.g., McCright & Dunlap, 2000; Nisbet, 2014; Schlichting, 2013).

  2. Public communication about climate science and related policies occurs online and via the classical news media. Researchers investigating the online debate have asked, among other questions, who communicates online (Nisbet & Kotcher, 2009), what arguments are exchanged, and how online users—for example, bloggers—interconnect (Adam, Häussler, Schmid-Petri, & Reber, 2016; Elgesem, Steskal, & Diakopoulos, 2015; Elgin, 2015; Sharman, 2014; for an overview, see Schäfer, 2012).

  3. In news coverage about climate change, journalists decide on the actors, arguments, and interpretations being heard in the news. Researchers investigating journalists’ roles in climate change communication have looked at their selections of news sources and information (Boykoff & Boykoff, 2004; Schmid-Petri, Häussler, & Adam, 2016), their assessments of anthropogenic climate change (Brüggemann & Engesser, 2014), or their mental frames of it (Engesser & Brüggemann, 2015).

  4. News coverage and online debates influence their audiences. There is a vast amount of research on the effects of climate change communication on the general audience or the public. The methods to assess such effects are dealt with in other articles (see, e.g., “Survey Research Approaches for Assessing Public Opinion about Climate Change and the Effects of Messages and Media Portrayals” and “Experimental Approaches to Assessing Media and Message Effects on Public Opinion about Climate Change”).

  5. More relevant to this survey, the public communication about climate change affects the individuals or members of groups of people involved in climate science or climate change policies (reciprocal effects; Kepplinger, 2007)—e.g., the climate scientists, politicians, interest groups, or intellectuals engaged with climate science, climate change policy, or the public debate about it. For example, researchers have investigated how actors adapt to the requirements of the media (e.g., formatting, scheduling in the news cycle) to attract media attention (Hassler, Maurer, & Oschatz, 2016; Kepplinger & Post, 2008)—a process called “medialization” (Weingart, Engels, & Pansegrau, 2000; see also Peters, in this volume). Some researchers have also studied how the public controversy about climate change has influenced climate scientists’ presentations of scientific results (Lewandowsky, Oreskes, Risbey, Newell, & Smithson, 2015) and their public communication or strategic considerations (Post, 2016; Tøsse, 2013).

  6. The different groups of actors do not act in isolation but in co-orientation. For instance, students of climate change communication have looked at the interfaces between science, politics, and the media (Weingart et al., 2000), at scientists’ attempts to demarcate their messages from those of industrial and environmental interest groups (Tøsse, 2013), or at actors’ motivations to meet journalists’ selection criteria in order to elicit news coverage of their work or causes (Hassler et al., 2016; Ivanova et al., 2013).

Due to the variety of actors and modes of communication, it is no surprise that diverse research methods have been applied to assess the communication practices of journalists and the social actors involved in the public debate. The goal of the following is to provide an overview of the research methods applied to work on the research problems outlined above. As most studies in the field are based on one of two distinct methodological approaches—i.e., qualitative or quantitative methodology—a brief outline of both methodologies will be given.


Figure 1. The dynamics of a mediated public debate.

Qualitative and Quantitative Approaches

Originating from different epistemological traditions (cf. Anderson, 2012; Lindlof & Taylor, 2010), quantitative and qualitative approaches to the study of climate change communication follow different methodological principles (Post, 2013; Table 1). The goal of qualitative studies is to capture an open number of meaningful properties (qualities) of their research objects (e.g., media texts, actors’ views, or behaviors) and “to describe [them] in rich detail” (Gibson, Craig, Harper, & Alpert, 2015). To accomplish this, researchers consider single cases or small numbers of cases and view them from a holistic, object-centered perspective, often applying flexible methods and adjusting them to their research needs as their understanding of their research objects evolves (Anderson, 2012; Maxwell, 2005). The goal of quantitative studies, by contrast, is to capture the distributions of particular, predefined sets of characteristics (variables) in a relevant population, as well as their relationships. To accomplish this, researchers consider large numbers of cases and view them from a highly selective, variable-centered perspective, applying predefined, standardized research methods (Anderson, 2012). Owing to their goals and methodological traditions, qualitative and quantitative approaches follow fundamentally different logics of explanation. Qualitative research rests upon Max Weber’s approach to the study of people’s actions: interpreting them in their social context and making them understandable. Qualitative researchers accomplish this by describing in detail how events or actions come about. Quantitative research, by contrast, ultimately seeks to explain social phenomena by discovering regular relationships between variables and developing theories “enabling the researcher to predict from certain variables to certain other variables” (Kerlinger & Lee, 2000, p. 11).

Table 1. Core differences between qualitative and quantitative methodologies.

| Approach | Qualitative | Quantitative |
| --- | --- | --- |
| Goal | capture an open number of meaningful properties (qualities) of the research objects | capture the distributions of particular, predefined sets of characteristics and their relationships |
| Basis of studies | small number of cases | large number of cases |
| Perspective | holistic, object-centered | selective, variable-centered |
| Logic of explanation | understanding social phenomena and social actors in context, developing taxonomies representing them | discovering relationships between variables, thereby developing theories and making predictions |
| Strengths | richness in information | generalizability |
| Limitations | context-dependence | selective accounts |

With their different goals and approaches, qualitative and quantitative research methods possess particular strengths and weaknesses. One of the major strengths of the qualitative approach is its richness in information: it provides detailed accounts of social phenomena that enable researchers to understand important underlying processes of social behavior. This richness of information is a property that quantitative research, with its focus on predetermined variables, often lacks, running the risk of overlooking information that may be relevant to fully grasp a social phenomenon. The strength of quantitative research, by contrast, is its generalizability. Due to its focus on the invariant structures among the members of relevant populations, quantitative research produces results that apply not only to the few cases under investigation but to all the members of a population. Generalizability, in turn, is a property that qualitative approaches, with their focus on small numbers of research cases, lack (Gibson et al., 2015).

In the following, the most common qualitative and quantitative research methods for assessing journalists’ news decisions in their coverage of climate change, as well as advocates’ communication strategies and practices, will be outlined. Examples from existing research will be given to illustrate each method.

Selecting Cases

To answer their research questions or test their hypotheses, researchers have to select the cases on which to base their investigations. Investigating climate change communication, for example, they have to select the journalists, climate scientists, or other social actors they want to interview, or the content they want to analyze—the news coverage, social media content, corporate publications, and so forth. In qualitative and quantitative communication research, cases are selected with different goals and according to different principles. In qualitative study designs, researchers apply purposeful selection, looking for cases that are typical in the sense that they possess important characteristics in a very marked way. Purposeful selection “is a strategy in which particular settings, persons, or activities are selected deliberatively in order to provide information that can’t be gotten as well from other choices” (Maxwell, 2005, p. 88). It requires that certain sets of selection criteria be defined. For example, in their study of U.S. environmental journalists’ presentations of scientific evidence related to climate change, Hiles and Hinnant (2014) selected 11 journalists according to their occupation within a certain type of medium (print newspapers directed at the general audience), their newsbeat (environmental), and their professional experience (high). The authors reasoned that the most experienced journalists had internalized and were aware of prevailing journalistic norms more than less experienced journalists, and thus could be more informative for the study’s objectives.

In quantitative study designs, selection procedures are aimed at drawing representative samples. In a representative sample, the distributions of and relationships between the properties of its elements (e.g., age, weight, gender, climate skepticism) are the same as in the population it represents. When research findings are based on representative samples, they can be generalized to the population. Representativeness can be approximated using random sampling. Random sampling ensures that every possible sample of a population has an equal chance of being drawn (Kerlinger & Lee, 2000). One requirement for “the principle of randomization, or simply randomness” (Kerlinger & Lee, 2000, p. 177) to work is to draw sufficiently large samples. In general, the smaller a sample is, the greater the likelihood that its properties deviate from those of the population (for a detailed account of the requirements and types of sampling techniques, see Kerlinger & Lee, 2000).
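
The relationship between sample size and sampling error is easy to demonstrate by simulation. The following sketch assumes a hypothetical population of 10,000 actors, 30% of whom hold a given attitude, and shows how widely the shares observed in small random samples scatter around that true value:

```python
import random

# Hypothetical population of 10,000 actors; 30% hold a given attitude (coded 1).
random.seed(42)
population = [1] * 3000 + [0] * 7000

for n in (10, 100, 1000):
    # Draw 1,000 random samples of size n and record each sample's observed share.
    shares = [sum(random.sample(population, n)) / n for _ in range(1000)]
    print(f"n={n:4d}: observed shares range from {min(shares):.2f} to {max(shares):.2f}")
# The smaller the sample, the more the observed shares deviate from the true 0.30.
```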

Before drawing a sample, researchers must define and identify the population they want to investigate. Depending on the population under study, this can be quite challenging and involve various systematic steps. For example, in their study of U.S. environmental journalists’ conceptions of their role, Tandoc and Takahashi (2014) aimed at a full survey of all environmental journalists working in the United States. They reasoned that most environmental journalists are members of the Society of Environmental Journalists (SEJ) and invited all its listserv subscribers to participate in the survey. In their email, they made clear that the survey was addressed to environmental journalists, not to other members, such as academics. To be able to identify and eliminate invalid cases—e.g., survey responses from academics or occasional writers—the authors included filter questions in their survey (asking respondents about their occupation, status, etc.). Acknowledging that the subscribers of the SEJ’s listserv might not provide an exhaustive list of U.S. environmental journalists, the authors also identified and contacted all the environmental journalists listed in the News Media Yellow Book, a personnel directory of U.S. news media organizations. In order to avoid duplicate responses, they made clear in their email requests that the same request had been sent out via the SEJ. Although there may be environmental journalists not reached by this method, it appears to be a reasonable approximation of the population the authors set out to study. It is an illustrative example of a solution to a practical research problem that often occurs in the field of journalists’ coverage and advocates’ communication about climate change, where many relevant but more or less loosely defined groups participate, such as climate skeptics, climate scientists, and many more (for the problem of defining and identifying populations of climate scientists, see, e.g., Ivanova et al., 2013; Post, 2008, 2016).

An important factor limiting representativeness in survey research is the response rate. One can assume that there are differences between the participating and refusing members of a population or sample, for example with regard to their workload, engagement, involvement, distrust, and other properties. This is why researchers should do their best to achieve high response rates—e.g., by personalizing their requests, writing reminder messages, illustrating the relevance of the study to their target persons, and so forth (Dillman, 2000). Researchers should also discuss their response rates and how their sample of participants might differ from the population it was drawn from—e.g., by comparing the distributions of certain key properties within their outcome sample and the population.
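
Such a comparison can be formalized, for instance, with a chi-square goodness-of-fit test. The sketch below uses hypothetical sample counts and population shares:

```python
from scipy.stats import chisquare

# Hypothetical population shares for a key property (e.g., newsbeat),
# assumed known from a member directory, and hypothetical sample counts.
population_shares = {"environment": 0.25, "politics": 0.45, "other": 0.30}
sample_counts = {"environment": 60, "politics": 70, "other": 40}

n = sum(sample_counts.values())
observed = [sample_counts[k] for k in population_shares]
expected = [population_shares[k] * n for k in population_shares]

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
# A small p-value indicates that participants differ systematically from
# the population on this property, qualifying generalizations.
```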

Purposeful selection is not exclusive to qualitative research, however, nor representative sampling to quantitative research. Many quantitative studies apply purposive sampling. For example, in a study of visual media framings in the global coverage of two annual climate change conferences (Conferences of the Parties, COPs), Wozniak, Wessler, and Lück (2016, p. 4) purposefully chose “the politically and economically most important democratic country from each of the five major continents, respectively: Brazil, Germany, India, South Africa, and the United States.” The authors then “selected the two most widely circulated daily newspapers from each of these countries based on their functional equivalence as opinion-forming newspapers” (Wozniak et al., 2016, p. 5). From these media, they then identified all the articles covering the respective COP and containing visuals to conduct a manual quantitative content analysis. In this particular case, the researchers were able to code all the articles in their research material covering the respective COP. In many studies, researchers have too much material to conduct a manual content analysis of all the relevant articles. In such cases, they draw random samples of articles published in the media they purposefully selected to keep their fieldwork manageable. Two frequently applied selection principles in internationally comparative research are most different systems and most similar systems designs. Researchers choose the most different systems design to explain an identical outcome among very different countries, and they choose the most similar systems design to explain a different outcome among very similar countries (Wessler & Brüggemann, 2012, pp. 33–34).

Selection and sampling procedures are the first steps in qualitative and quantitative studies of journalists’, climate scientists’, and advocates’ public communication about climate change. Depending on their study goals, researchers apply different methods to investigate communication behaviors or actors’ views, attitudes, goals, or intentions. The following will provide an overview of methods, first, to analyze message contents and, second, to interview relevant actors.

Analysis of Documents and Media Contents

Analyses of documents and media contents are an important means of studying actual communication behavior. They have been used to study journalists’ news decisions and foci in reporting on climate change (e.g., Boykoff & Boykoff, 2004; Konieczna, Mattis, Tsai, Liang, & Dunwoody, 2014), as well as other social actors’ communication about climate change, such as that of politicians (e.g., Hassler et al., 2016; Weingart et al., 2000), industry actors (e.g., Schlichting, 2013), NGOs (cf. Schmidt, 2012), bloggers (Schmid-Petri et al., 2016), or public intellectuals (Nisbet, 2014). Students of climate change communication analyze documents and media contents for various other purposes—to study the media discourse, to link the contents of media coverage with the audience’s views or the political agenda, and many more. These topics are dealt with in the respective chapters of this volume (see, e.g., Metag, in this volume). The following discusses techniques of document and media content analysis as a way to assess the communication behaviors of journalists, climate scientists, and advocates.

Qualitative Content and Discourse Analysis

There are a number of different qualitative approaches used to study message content. One is qualitative content analysis. Although there are different conceptions of qualitative content analysis among several authors (cf. Schreier, 2012), it can probably be said that of all the qualitative approaches to textual analysis, it is the most systematic (Anderson, 2012; Schreier, 2012). Anderson (2012) outlines the following steps for its application. The first, after formulating a research question and selecting the study material, is close reading, i.e., “the reading of, viewing of, or listening to the set of texts multiple times … with the goal of understanding the range and quality of content” involving “the identification of embedded values, internal and external references, and symbolic resonances” (Anderson, 2012, p. 288). The second step is coding—the classification of meaningful text elements into an emerging system of categories. Through coding, concrete textual elements are captured as specific manifestations of a category, and thus the data of the original text are reduced (Schreier, 2012). The final analysis is the “product of multiple cycles through the text,” in which “new codes are added; old codes are refined, divided, and discarded” (Anderson, 2012, p. 290). Scholars of climate change communication have applied qualitative content analysis to study how particular frames or meanings emerge from an unlimited number of textual properties such as visuals, headlines, types of language, figurative speech, and many more. For example, Bourk, Rock, and Davis (2015) studied how journalists, scientists, and politicians co-construct representations of climate change in the news by analyzing 13 New Zealand TV news reports on climate change. To study the different actors’ roles in creating the prevalent news frames, the authors took into account various meaningful elements, such as the actors’ arguments, the journalists’ lines of questioning, their thematic foci, and others.

Another qualitative method of textual analysis is discourse analysis. Discourse analyses are based on the assumption that, by their use of language, actors construct social reality. The goal of discourse analysis is to study the way this happens by analyzing figurative speech (e.g., metaphors), rhetoric, argument structure, or syntax (Schreier, 2012). Most authors have applied this method to analyze media representations of climate change (e.g., Carvalho, 2005; Olausson, 2009; for an overview, see Metag, in this volume). More relevant to this review are the works of researchers who have applied it to various other kinds of sources to study journalists’ news decisions or other social actors’ communication strategies in the discourse on climate change. A prominent example is an analysis of the German discourse on climate change among and between scientists, journalists, and politicians from the 1970s through the 1990s (Weingart et al., 2000). The authors analyzed influential scientific publications and press releases, minutes from political debates in the Bundestag and parliamentary working groups, as well as about 500 articles in opinion-leading German news media. They analyzed how the discourse about climate change developed and was transmitted from science to the media and to politics. As is characteristic of qualitative research, they did not work with a predefined set of variables but considered all the meaningful features of their research material—e.g., magazine covers, figurative speech (such as metaphors and similes), headlines, images, arguments, and so forth.

Quantitative Content Analysis

When large numbers of messages exist, researchers might not be interested in capturing an undefined number of meaningful properties but in learning how particular, predefined properties are distributed or associated in a given universe of texts (e.g., a decade’s news coverage of climate change in a particular news outlet). In such cases, researchers apply quantitative content analysis. An example is a study of the influence of the journalistic norm of balance in news coverage of climate change. Boykoff and Boykoff (2004) assumed that the journalistic norm of balance—i.e., the rule to give a voice to both sides in a conflict (Tuchman, 1972)—would influence journalists’ presentations of scientific evidence of human-induced climate change and tested this assumption in a quantitative content analysis.

A quantitative content analysis requires several preparatory steps. First, the researchers have to clarify what media they want to base their study on and, if the number of news reports is too large for manual analysis, they have to draw a sample. Second, the researchers have to “unitize” (Krippendorff, 2012). That is, they have to define their units of sampling, analysis, and coding. Sampling units are “units that are distinguished for selective inclusion in an analysis” (Krippendorff, 2012, p. 98). For instance, using several databases, Boykoff and Boykoff (2004) chose to draw a sample of news articles published in the New York Times, the Los Angeles Times, the Washington Post, and the Wall Street Journal from 1988 through 2002. Depending on the purpose of a study, other possible sampling units could be, for instance, newspaper issues or images. Units of analysis are “those elements of the study material that will be classified in the coding process” (Rössler, 2010, p. 70).1 For instance, Boykoff and Boykoff (2004) chose to capture specified properties of each article. Among others, they documented what views on human-induced climate change existed in each article—arguments confirming that anthropogenic global warming exists, arguments questioning it, or arguments of both sorts. Depending on the purpose of a study, the unit of analysis could be much smaller than the article. For instance, one could choose to classify each actor in an article according to a given set of categories (one could, for instance, classify each actor’s professional office and country of origin). Coding units are “units that are distinguished for separate … coding” (Krippendorff, 2012, p. 99) within a given unit of analysis (Rössler, 2010). For instance, in their study, Boykoff and Boykoff (2004) captured not only the degree of balance in reporting in their units of analysis (i.e., each article) but also the date of publication and the newspaper it was published in. In this example, the degree of balance, publication date, and news outlet are distinct coding units.
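
In computational terms, such a design maps naturally onto one record per unit of analysis, with each coding unit stored as a separate field. The sketch below uses hypothetical field names loosely modeled on the design just described:

```python
from dataclasses import dataclass

# One record per unit of analysis (the article); each coding unit is a field.
@dataclass
class ArticleCoding:
    article_id: str
    outlet: str     # coding unit: newspaper the article was published in
    pub_date: str   # coding unit: date of publication
    balance: str    # coding unit: "confirming", "questioning", or "both"

codings = [
    ArticleCoding("nyt_0001", "New York Times", "1999-03-14", "both"),
    ArticleCoding("wsj_0002", "Wall Street Journal", "2001-07-02", "questioning"),
]
print(sum(c.balance == "both" for c in codings), "of", len(codings), "articles balanced")
```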

During the fieldwork of a quantitative content analysis, several “coders” (analysts) analyze the predefined units of analysis (e.g., articles, pictures, arguments) according to a standardized procedure. More specifically, they apply a system of categories and codes that the principal investigators have specified in a “codebook”—i.e., a manual that defines a set of rules according to which the coding units are to be classified. Before the actual fieldwork can start, researchers have to train their coders to apply the coding rules reliably—i.e., in a consistent way (for details, see Metag, in this volume). Whereas for researchers conducting qualitative content analyses reliability is a secondary concern (Schreier, 2012), researchers conducting quantitative content analyses want to make sure that their coders’ classifications converge as much as possible. This is to ensure the intersubjectivity of their research results (Krippendorff, 2012; Rössler, 2005). In recent years, communication researchers have increasingly used computer-assisted or automated procedures to supplement or substitute for the manual coding of textual units. These approaches are described in detail elsewhere in this volume (e.g., Metag, in this volume).
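
Coder convergence can be quantified in its simplest form as percent agreement. The sketch below uses hypothetical codings; published studies typically report chance-corrected coefficients such as Krippendorff’s alpha instead:

```python
# Hypothetical classifications of the same five articles by two coders.
coder_a = ["both", "questioning", "confirming", "both", "both"]
coder_b = ["both", "questioning", "both", "both", "confirming"]

matches = sum(a == b for a, b in zip(coder_a, coder_b))
agreement = matches / len(coder_a)
print(f"percent agreement: {agreement:.0%}")  # 60% -> coders need more training
```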

Hyperlink Network Analysis

The growth of the Internet and Web 2.0 has given rise to new methods to analyze stakeholder communication about climate change. Methods for assessing stakeholders’ use of blogs to communicate about climate change have become particularly prominent (cf. Elgesem et al., 2015). Researchers have applied hyperlink network analysis to map networks of bloggers participating in the online discourse about climate change and to depict their interconnectedness (Adam et al., 2016; Elgin, 2015). To accomplish this, the researchers have used crawling software to identify thematically relevant blogs on the Internet and to establish their interconnectedness. Most crawling software needs crawling seeds, i.e., links to blogs functioning as starting points from which the software follows all the hyperlinks in the blogrolls or blog posts via a snowball method (Elgin, 2015; Sharman, 2014). In order to ensure the identification of thematically relevant blogs, researchers have added keywords as a prerequisite for a blog to be picked up by the crawler (Elgesem et al., 2015). In addition, researchers usually go through all the blogs identified manually to exclude irrelevant or inactive blogs (Elgesem et al., 2015; Sharman, 2014). Some researchers have used hyperlink network analysis as a method to identify climate blogs for content analyses (Adam et al., 2016; Elgesem et al., 2015) and to map online communication structures, identifying communities of bloggers and their interconnectedness (Elgesem et al., 2015; Sharman, 2014). Other researchers have analyzed data from Twitter, e.g., to identify the most prevalent sources or the salient contents (e.g., Newman, 2016; O’Neill, Williams, Kurz, Wiersma, & Boykoff, 2015) (for more information on this method, see “Research Methods for Assessing Online Climate Change Communication, Social Media Discussion, and Behavior”).
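
A minimal crawler of this kind can be sketched with standard tools. The sketch below assumes a hypothetical seed URL and keyword list, and omits the politeness delays, robots.txt checks, blogroll parsing, and manual relevance screening a real crawl requires:

```python
import networkx as nx
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SEEDS = ["https://example-climate-blog.org"]  # hypothetical crawling seed
KEYWORDS = ("climate", "warming")             # relevance filter for pages
MAX_DEPTH = 2

graph = nx.DiGraph()
frontier = [(url, 0) for url in SEEDS]
seen = set(SEEDS)

while frontier:
    url, depth = frontier.pop()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    if not any(k in html.lower() for k in KEYWORDS):
        continue  # skip thematically irrelevant pages
    soup = BeautifulSoup(html, "html.parser")
    for link in soup.find_all("a", href=True):
        target = urljoin(url, link["href"])
        if urlparse(target).netloc == urlparse(url).netloc:
            continue  # keep only hyperlinks between different sites
        graph.add_edge(urlparse(url).netloc, urlparse(target).netloc)
        if depth + 1 <= MAX_DEPTH and target not in seen:
            seen.add(target)
            frontier.append((target, depth + 1))

print(graph.number_of_nodes(), "sites,", graph.number_of_edges(), "links")
```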

Interviews and Surveys

Another relevant method for assessing journalists’ or advocates’ climate change communication is interviewing. Students of climate change communication have applied interviewing techniques to a broad range of research problems—e.g., to assess climate journalists’ understandings and interpretations of climate change (e.g., Brüggemann & Engesser, 2014; Engesser & Brüggemann, 2015; Gibson et al., 2015; Hiles & Hinnant, 2014); climate scientists’ attitudes toward media coverage (e.g., Ivanova et al., 2013; Post, 2008, 2016) or their experiences with the media (Sharman & Howarth, 2016; Tøsse, 2013); political advocates’ goals and motives (e.g., Sharman & Howarth, 2016); or the communication strategies of representatives of NGOs or multinational corporations (Hestres, 2015; Lück, Wozniak, & Wessler, 2016; cf. Schlichting, 2013). Broadly speaking, one can distinguish interviewing techniques according to their mode and their degree of standardization. The mode refers to the channel through which interviews are conducted—e.g., in person, by phone, by mail, or via the Internet. The degree of standardization refers to the extent to which interactions between interviewers and interviewees are flexible or follow a set procedure. Researchers conducting qualitative research apply largely unstandardized procedures, while researchers conducting quantitative research apply largely standardized ones. Regardless of the approach, interviewing is a reactive method. It influences subjects, e.g., by making them think about questions they have never been asked before. In a field as politicized as the debate on climate change, participants’ reactivity is a challenge and can be a source of bias. For example, out of fear of criticism, respondents might be particularly prone to giving answers they consider socially desirable or to adapting their responses to their anticipations of the interviewers’ expectations. Many researchers studying journalists’ or stakeholders’ communication strategies have reflected on this challenge in their research (Hiles & Hinnant, 2014, p. 8; Post, 2016; Sharman & Howarth, 2016). The following will outline the different techniques and provide examples of how researchers have applied them to assess journalists’ news decisions, advocacy strategies, and communication practices in the public debate about climate change.

Qualitative Interviewing

Researchers apply qualitative interviews to explore and understand actors’ salient motives, goals, behavioral intentions, views, and so forth. Qualitative interviewing is particularly suitable for studying socially sensitive topics because it enables researchers to go into subjects’ responses in a flexible way—thereby creating a familiar atmosphere. According to Lindlof and Taylor (2010, p. 173), researchers conduct qualitative interviews, among other things, for the purpose of “understanding the social actor’s experience and perspective through stories, accounts, and explanations.” Investigating communication about climate change, many researchers have applied semistructured interviews with 10 to 20 subjects (e.g., Berglez, 2011; Gibson et al., 2015; Hestres, 2014, 2015; Hiles & Hinnant, 2014; Tøsse, 2013). In semistructured interviews, interviewers ask the interviewees mostly open-ended questions and follow a written guideline that specifies the questions to be asked. At the same time, the interviewers attempt to give the conversation a natural flow—e.g., by flexibly adapting the question order, reformulating questions, asking interviewees follow-up questions, etc. Semistructured interviews can be lengthy—taking from about 15 to 120 minutes, or more. Researchers usually record all the interviews, transcribe them verbatim, and then apply a qualitative content analysis to reduce and systematize the data. For example, Hiles and Hinnant (2014) interviewed 11 U.S. journalists, relying on a guide with questions designed to explore the journalists’ news selection and their views on presenting scientific evidence. Based on respondents’ answers and previous research, the authors developed a classification system to analyze interview transcripts, differentiating three approaches to presenting scientific evidence: (1) “Quoting Authoritative Sources, Facticity, Avoiding Opinion in Newswriting,” (2) “Balance, Impartiality, Neutrality, and Fairness,” and (3) “Transparency” (Hiles & Hinnant, 2014, pp. 10–11).

One of the strengths of qualitative interviewing is the flexible, personal interaction between researchers and interviewees. This is why, in the highly politicized context of climate change communication, qualitative interviews have been considered a means to enhance familiarity and build trust between the interviewer and the interviewee—thereby reducing the problem of response bias due to fear of criticism or anticipations of the interviewers’ expectations (Hiles & Hinnant, 2014; Sharman & Howarth, 2016). In the public debate about climate change, the purpose of qualitative interviewing is to gather “rich data” (Hestres, 2014, p. 328) about journalists’, climate scientists’, or advocates’ experiences with and beliefs about the public debate on climate change. The downside of this approach is its lack of generalizability. While researchers may identify particular lines of thinking or argument structures among members of specific communities, they cannot assess how widely these are distributed.

Quantitative Survey Research

To arrive at generalizable findings about populations of journalists’ or advocates’ communication strategies, motives, attitudes, goals, or intentions, researchers use quantitative surveys. Interviews in survey research are usually completely standardized, with a strictly set order of questions and response items. For this purpose, researchers develop questionnaires mostly using closed-ended questions—i.e., questions with explicit options for the respondents to select their answers from. These questions are designed to create data that is easily comparable across many cases and thus quantifiable. Survey questions can be designed to measure particular properties with varying differentiation. For instance, questions can be designed as categorical questions (e.g., as yes-or-no questions) or as rating questions, asking respondents to indicate their degree of acceptance or rejection of an issue on a scale. For example, in their internationally comparative survey of climate journalists, Brüggemann and Engesser (2014) asked their participants, among other things, how to treat sources that question climate change in news coverage. Using a five-point rating scale from “do not agree at all” to “fully agree,” the authors asked the journalists to judge, among other things, if skeptics should be critically assessed in or excluded from news coverage. Ivanova et al. (2013) applied a rating scale to measure the frequency with which climate scientists were in touch with the media. The authors asked the scientists to estimate the number of their contacts with different media (e.g., radio, TV, popular science magazines), providing response options from “none” or “once” to “2 to 5 times” or “more than 25 times.” From this information, the authors calculated an additive index indicating climate scientists’ media contacts.
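
The construction of such an additive index can be illustrated in a few lines. In the sketch below, the category midpoints and the additional response categories are assumptions for illustration, not the published coding of Ivanova et al. (2013):

```python
# Hypothetical recoding of frequency categories to numeric midpoints.
MIDPOINTS = {"none": 0, "once": 1, "2 to 5 times": 3.5,
             "6 to 15 times": 10.5, "16 to 25 times": 20.5,
             "more than 25 times": 30}

# One respondent's reported contacts per media type (hypothetical answers).
responses = {"radio": "once", "TV": "none",
             "popular science magazines": "2 to 5 times"}

# The additive index sums the recoded values across all media types.
media_contact_index = sum(MIDPOINTS[v] for v in responses.values())
print(media_contact_index)  # 4.5
```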

In general, questionnaires must be as comprehensible as possible and comfortable for respondents to answer in order to avoid response bias, e.g., due to respondents’ fatigue. In politicized or controversial fields, researchers have to be wary of their language and avoid terms that are emotionally charged. For example, in the public debate on climate change, though frequently used in the media, the terms “skeptic” and “alarmist” are too controversial and may evoke resentment among respondents. In a questionnaire, researchers should distance themselves from value-laden terms by circumscribing or quoting them (e.g., using “so-called skeptics”). In order to find appropriate language and construct an adequate questionnaire, researchers need to become familiar with their target communities through background research and should pretest their measurement instruments carefully (Noelle-Neumann & Petersen, 2013).

Aside from general surveys, researchers have applied specific research designs based on survey methods. Bell (1994) employed the design of a classical accuracy study (for accuracy studies see, e.g., Blankenburg, 1970; Charnley, 1936; Pulford, 1976) to assess climate scientists’ attitudes toward media coverage. He identified climate scientists cited in New Zealand newspaper reports over six months in 1988. To each of the scientists identified he sent the news report that cited them and asked them to assess its accuracy.

Rice, Henderson-Sellers, and Walkerden (2015) conducted surveys in a comparative design, assessing in what ways scientists’ and journalists’ views of media coverage converge or differ. For example, they asked a sample of journalists and scientists if or how the media had influenced climate change policy, or about journalists’ motives for covering errors of the Intergovernmental Panel on Climate Change (IPCC) and alleged misconduct in climate research during the so-called Climategate affair. Peters and Heinrichs (2005) made an insightful addition to such a study design, investigating the interfaces between science and journalism. They asked scientists and journalists about their views of the two professions to compare their self-conceptions and expectations toward one another (for this approach, see also Peters, 1995).

Post (2016) conducted an experiment embedded in a paper-and-pencil mail survey of German climate scientists holding tenured professorships (a survey experiment). The author randomly presented one of two versions of a fictitious research finding to the participants. In one version, the finding suggested that climate change would proceed faster than expected; in the other, it suggested that climate change would proceed more slowly than expected. The author then asked the climate scientists to assess several objections to publishing the respective research result in the media and compared the assessments across the two versions. The author reasoned that survey experiments might be an appropriate way to detect tendencies of self-censorship—a phenomenon that, due to its normative implications and people’s tendencies to give socially desirable self-reports, is difficult to capture.
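
The logic of such a survey experiment can be sketched in a few lines. The following is a minimal illustration with hypothetical data, not a reconstruction of Post’s (2016) instrument: respondents are randomly assigned one of two versions, and mean agreement with an objection to publication is compared across versions:

```python
import random
from statistics import mean

random.seed(7)
respondents = list(range(200))
# Random assignment to one of the two versions of the fictitious finding.
conditions = {r: random.choice(["faster", "slower"]) for r in respondents}

# Hypothetical 1-5 ratings of an objection to publication, per respondent.
ratings = {r: random.randint(1, 5) for r in respondents}

for version in ("faster", "slower"):
    group = [ratings[r] for r in respondents if conditions[r] == version]
    print(f"{version}: mean objection rating {mean(group):.2f} (n={len(group)})")
# A systematic gap between versions would hint at direction-dependent
# reservations about publishing, i.e., possible self-censorship.
```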

Qualitative and quantitative interviewing are the most frequently applied methods for assessing journalistic news decisions as well as stakeholders’ communication strategies. As with content analyses, both approaches have strengths and weaknesses. In future research, it might be fruitful to combine both approaches—to test the generalizability of qualitative research results on the one hand, and to make the findings of quantitative survey research more plausible on the other, e.g., by giving participants’ verbatim accounts of their own views or experiences.

Case Studies and Mixed-Method Designs

Studying journalistic news decisions, stakeholders’ communication practices, and strategic assumptions, researchers have also applied case studies and mixed-method designs. Case studies are “the preferred strategy when ‘how’ and ‘why’ questions are being posed, when the investigator has little control over the events, and when the focus is on a contemporary phenomenon within some real-life context” (Yin, 2003, p. 1). An example is the study by Henderson-Sellers (1998) of a case of public miscommunication about climate change, in which the Australian media made a dramatic prediction of an increasing number of tropical cyclones as a consequence of climate change. As the source of this message was a modest expert statement expressing much uncertainty about the issue, the author set out to investigate the flow of information in this particular case by interviewing people and analyzing documents.

There are quite a number of studies combining quantitative content analyses of news coverage of climate change with interviews with journalists or news sources. The purpose of these studies is to establish relevant properties of climate change coverage and to assess journalists’ attitudes toward news coverage and their reasons for their news decisions. For example, in a quantitative content analysis of Spanish TV and press reports, González (2014) found, among other things, that Spanish journalists rarely use sources and that their coverage generally lacks depth. The author conducted quantitative surveys of journalists and scientists to assess their perceptions of news coverage of climate change, their attitudes toward it, and their explanations for the nature of news coverage. Bourk et al. (2015) combined a qualitative frame analysis of New Zealand TV news on climate change with semistructured interviews with journalists as well as with policymakers and scientists serving as news sources. The authors asked the participants about their attitudes toward news coverage, their beliefs about its effectiveness, and their perceptions of changes in news coverage over the years. Similarly, León and Erviti (2015) combined a quantitative content analysis of TV news visuals with semistructured interviews with six journalists to assess the reasons behind editorial decisions.

Another frequently applied approach to assessing journalists’ news decisions is to combine content analyses of news coverage with content analyses of official documents. For instance, in the study of climate change communication, researchers have often conducted content analyses of the reports of the IPCC and used them as a standard against which to assess media coverage of climate change (e.g., Hassler et al., 2016). With a similar design, Oreskes (2004) compared media coverage of climate change with a content analysis of relevant scientific publications to assess the accuracy of news reports.

Summary and Future Outlook

One of the characteristics of the public debate about climate change is that actors from various backgrounds engage with it. To study their communication behaviors and related goals, attitudes, views, and cognitions, researchers have applied various methods (see Table 2 for an overview). To measure journalists’ news decisions, researchers have applied qualitative or quantitative content analysis. In order to assess journalists’ intentions, goals, and views of climate change, researchers have conducted qualitative interviews and quantitative surveys. To measure climate scientists’ communication behaviors, quantitative surveys have been used to capture respondents’ self-reports. In addition, Weingart et al. (2000) applied discourse analysis to scientific publications and press releases. In order to assess climate scientists’ behavioral dispositions, communication goals, attitudes toward climate change coverage, and strategic considerations, researchers have applied multiple quantitative surveys and semistructured interviews. To measure advocates’ communication behaviors, such as politicians’ online and offline communication, researchers have applied discourse analyses (Weingart et al., 2000) or quantitative content analyses (Hassler et al., 2016) of parliamentary speeches, party websites, or politicians’ TV statements. In addition, researchers have applied hyperlink network analyses to study bloggers’ online activities. In order to assess advocates’ communication goals and strategic considerations, as well as their attitudes toward climate change and the media, researchers have predominantly relied on qualitative research techniques. One reason for this may be that populations of advocates are loose formations that may be difficult to grasp. Another may be that, in the highly politicized context of the climate debate, many advocates are suspicious of social research and refuse to participate in surveys.

One challenge for future research on journalists’ and advocates’ participation in the public debate about climate change will be to link actors’ goals, motives, attitudes, and preferences to their actual communication decisions and to draw causal inferences. Some researchers have attempted to establish the drivers of actors’ communication decisions by interviewing journalists or advocates about their preceding communication decisions. Although such a study design is surely insightful, one may question the degree to which actors provide valid reasons for their actions in hindsight. With the similar goal of explaining or understanding actors’ communication decisions, a few researchers have conducted participant observations of journalists or advocates communicating about climate change (e.g., Saunders, 2007; Schlembach, 2011). In a participant observation, “a researcher takes part as a responsible agent in the actions of the participants” (Lindlof & Taylor, 2010, p. 135). In doing so, students of climate change communication have sought to gather data about advocates’ communication behaviors and their motives. However, because the data of participant observations are highly context dependent, they cannot be generalized.

Table 2. Research methods applied to study journalists’, climate scientists’, and advocates’ communication behaviors, goals, attitudes, and strategic considerations.

| Actor | Variable | Qualitative | Quantitative |
| --- | --- | --- | --- |
| Journalists | Behavior | discourse analysis (e.g., Weingart et al., 2000; Bourk et al., 2015) | content analysis (e.g., Boykoff & Boykoff, 2004; Schmid-Petri et al., 2016) |
| Journalists | Behavioral intentions/dispositions | semistructured interviews (e.g., Hiles & Hinnant, 2014) | surveys (e.g., Brüggemann & Engesser, 2014) |
| Journalists | Goals and motives | semistructured interviews (e.g., León & Erviti, 2015) | surveys (e.g., Tandoc & Takahashi, 2014; Rice et al., 2015) |
| Journalists | Views, attitudes | semistructured interviews (e.g., González, 2014) | surveys (e.g., Brüggemann & Engesser, 2014; Engesser & Brüggemann, 2015) |
| Journalists | Strategic considerations | — | — |
| Climate scientists | Behavior | discourse analysis (e.g., Weingart et al., 2000); case study (e.g., Lewandowsky et al., 2015) | surveys (self-report measures; e.g., Ivanova et al., 2013; Post, 2016) |
| Climate scientists | Behavioral intentions/dispositions | semistructured interviews (e.g., Tøsse, 2013) | survey experiment (Post, 2016) |
| Climate scientists | Goals and motives | semistructured interviews (Sharman & Howarth, 2016) | surveys (e.g., Post, 2016) |
| Climate scientists | Views, attitudes | semistructured interviews (e.g., Tøsse, 2013) | surveys (e.g., Ivanova et al., 2013) |
| Climate scientists | Strategic considerations | semistructured interviews (e.g., Tøsse, 2013) | survey experiment (Post, 2016) |
| Advocates (e.g., politicians, interest groups) | Behavior | discourse analysis (e.g., Weingart et al., 2000) | content analysis (e.g., Hassler et al., 2016); hyperlink network analysis (e.g., Adam et al., 2016; Elgesem et al., 2015; Sharman, 2014) |
| Advocates | Behavioral intentions/dispositions | — | — |
| Advocates | Goals and motives | semistructured interviews (Sharman & Howarth, 2016; Hestres, 2015) | — |
| Advocates | Views, attitudes | discourse analysis (e.g., Nisbet, 2014) | — |
| Advocates | Strategic considerations | semistructured interviews (Hestres, 2015) | — |

Although they have not yet been applied to the study of journalists’ or advocates’ climate change communication, experimental designs could be used to test for generalizable causal relationships between actors’ motives and their communication decisions. In social scientific experiments, study participants are randomly exposed to one of several conditions that differ in only one variable—for example, the tone of a message (negative, neutral, positive). By holding all other factors constant (e.g., message features, distributions of meaningful variables in the study groups, such as gender, age, education, etc.), researchers can establish whether one of the tested conditions makes a difference. In a study of journalists’ and advocates’ climate change communication, one could test, for instance, whether they find negative news on climate change more newsworthy than positive news (for social scientific experiments, see Weaver, 2008).
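
To make the logic concrete, here is a minimal sketch with simulated (hypothetical) ratings: participants are randomly assigned to a negatively or positively framed story, and mean newsworthiness ratings are compared with a t-test:

```python
import random
from scipy.stats import ttest_ind

random.seed(3)
# Random assignment of 120 participants to one of two framing conditions.
assignments = [random.choice(["negative", "positive"]) for _ in range(120)]
# Hypothetical 1-7 newsworthiness ratings; a real study would collect these.
ratings = [random.randint(1, 7) for _ in assignments]

neg = [r for r, a in zip(ratings, assignments) if a == "negative"]
pos = [r for r, a in zip(ratings, assignments) if a == "positive"]

stat, p = ttest_ind(neg, pos)
print(f"t = {stat:.2f}, p = {p:.3f}")  # does framing shift newsworthiness?
```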

Another way to approximate generalizable causal relationships would be to conduct longitudinal research. One could, for instance, survey journalists or advocates at several points in time and test how their perceptions of media coverage at one point in time predict their communication intentions at another point in time. Longitudinal designs are particularly relevant when capturing the dynamics of the debate in which journalists, climate scientists, and advocates play their part and can be assumed to constantly react to one another over time. Capturing such dynamics more precisely is surely one of the major challenges of the future study of journalists’ and advocates’ communication about climate change.
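
The core of the analysis such a two-wave design permits can be sketched with simulated (hypothetical) data—wave-1 perceptions predicting wave-2 intentions via a simple least-squares regression; a full cross-lagged design would additionally control for prior levels of the outcome:

```python
import numpy as np

# Simulated two-wave panel: wave-1 perceptions of media coverage and
# wave-2 communication intentions for 80 hypothetical respondents.
rng = np.random.default_rng(0)
perceptions_w1 = rng.normal(size=80)
intentions_w2 = 0.4 * perceptions_w1 + rng.normal(scale=0.9, size=80)

# Regress wave-2 intentions on wave-1 perceptions (with an intercept).
X = np.column_stack([np.ones(80), perceptions_w1])
beta, *_ = np.linalg.lstsq(X, intentions_w2, rcond=None)
print(f"intercept = {beta[0]:.2f}, lagged effect = {beta[1]:.2f}")
```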

References

Adam, S., Häussler, T., Schmid-Petri, H., & Reber, U. (2016). Identifying and analyzing hyperlink issue networks. In G. Vowe & P. Henn (Eds.), Political communication in the online world: Theoretical approaches and research designs (pp. 233–247). New York: Routledge.

Anderson, J. A. (2012). Media research methods: Understanding metric and interpretive approaches. Thousand Oaks, CA: SAGE.

Bell, A. (1994). Media (mis)communication on the science of climate change. Public Understanding of Science, 3(3), 259–275.

Berglez, P. (2011). Inside, outside, and beyond media logic: Journalistic creativity in climate reporting. Media, Culture & Society, 33(3), 449–465.

Blankenburg, W. B. (1970). News accuracy: Some findings on the meaning of errors. Journal of Communication, 20(4), 375–386.

Bourk, M., Rock, J., & Davis, L. S. (2015, October 1). Mediating the science: Symbolic and structural influences on communicating climate change through New Zealand’s television news. Environmental Communication.

Boykoff, M. T., & Boykoff, J. M. (2004). Balance as bias: Global warming and the US prestige press. Global Environmental Change, 14(2), 125–136.

Brüggemann, M., & Engesser, S. (2014). Between consensus and denial: Climate journalists as interpretive community. Science Communication, 36(4), 399–427.

Carvalho, A. (2005). Representing the politics of the greenhouse effect: Discursive strategies in the British media. Critical Discourse Studies, 2(1), 1–29.

Charnley, M. V. (1936). A study of newspaper accuracy. Journalism and Mass Communication Quarterly, 13(4), 394–401.

Dillman, D. A. (2000). Mail and internet surveys: The total design method. New York: Wiley.

Dirikx, A., & Gelders, D. (2010). Ideologies overruled? An explorative study of the link between ideology and climate change reporting in Dutch and French newspapers. Environmental Communication, 4(2), 190–205.

Elgesem, D., Steskal, L., & Diakopoulos, N. (2015). Structure and content of the discourse on climate change in the blogosphere: The big picture. Environmental Communication, 9(2), 169–188.

Elgin, D. J. (2015). Utilizing hyperlink network analysis to examine climate change supporters and opponents. Review of Policy Research, 32(2), 226–245.

Engesser, S., & Brüggemann, M. (2015). Mapping the minds of the mediators: The cognitive frames of climate journalists from five countries. Public Understanding of Science, 25(7), 825–841.

Galtung, J., & Ruge, M. H. (1965). The structure of foreign news: The presentation of the Congo, Cuba and Cyprus crises in four Norwegian newspapers. Journal of Peace Research, 2(1), 64–90.

Gibson, T. A., Craig, R. T., Harper, A. C., & Alpert, J. M. (2015). Covering global warming in dubious times: Environmental reporters in the new media ecosystem. Journalism, 17(4), 417–434.

González, A. D. L. (2014). Searching for quality: A debate among journalists, scientists and readers about the coverage of climate change in the Spanish media. Prisma Social: Revista de Ciencias Sociales, 12, 196–231.

Hassler, J., Maurer, M., & Oschatz, C. (2016). So gut wie sicher? Die Darstellung der Ungewissheit klimawissenschaftlicher Erkenntnisse durch Wissenschaft, Massenmedien und Politik [Nearly certain? The presentation of the uncertainty of climate-scientific findings by science, mass media, and politics]. In G. Ruhrmann, S. H. Kessler, & L. Guenther (Eds.), Wissenschaftskommunikation zwischen Risiko und (Un-)Sicherheit (pp. 122–142). Cologne: Halem.

Henderson-Sellers, A. (1998). Climate whispers: Media communication about climate change. Climatic Change, 40(3–4), 421–456.

Hestres, L. E. (2014). Preaching to the choir: Internet-mediated advocacy, issue public mobilization, and climate change. New Media & Society, 16(2), 323–339.

Hestres, L. E. (2015). Climate change advocacy online: Theories of change, target audiences, and online strategy. Environmental Politics, 24(2), 193–211.

Hiles, S. S., & Hinnant, A. (2014). Climate change in the newsroom: Journalists’ evolving standards of objectivity when covering global warming. Science Communication, 36(4), 428–453.

Ivanova, A., Schäfer, M. S., Schlichting, I., & Schmidt, A. (2013). Is there a medialization of climate science? Results from a survey of German climate scientists. Science Communication, 35(5), 626–653.

Kepplinger, H. M. (2007). Reciprocal effects: Toward a theory of mass media effects on decision makers. Harvard International Journal of Press/Politics, 12(2), 3–23.

Kerlinger, F. N., & Lee, H. B. (2000). Foundations of behavioral research. Northridge, CA: Wadsworth, Thomson Learning.

Konieczna, M., Mattis, K., Tsai, J. Y., Liang, X., & Dunwoody, S. (2014). Global journalism in decision-making moments: A case study of Canadian and American television coverage of the 2009 United Nations Framework Convention on Climate Change in Copenhagen. Environmental Communication, 8(4), 489–507.

Krippendorff, K. (2012). Content analysis: An introduction to its methodology. Thousand Oaks, CA: SAGE.

León, B., & Erviti, M. C. (2015). Science in pictures: Visual representation of climate change in Spain’s television news. Public Understanding of Science, 24(2), 183–199.

Lewandowsky, S., Oreskes, N., Risbey, J. S., Newell, B. R., & Smithson, M. (2015). Seepage: Climate change denial and its effect on the scientific community. Global Environmental Change, 33, 1–13.

Lindlof, T. R., & Taylor, B. C. (2010). Qualitative communication research methods. Thousand Oaks, CA: SAGE.

Lück, J., Wozniak, A., & Wessler, H. (2016). Networks of coproduction: How journalists and environmental NGOs create common interpretations of the UN climate change conferences. International Journal of Press/Politics, 21(1), 25–47.

Matthews, J. (2015, September 7). Maintaining a politicised climate of opinion? Examining how political framing and journalistic logic combine to shape speaking opportunities in UK elite newspaper reporting of climate change. Public Understanding of Science.

Maxwell, J. A. (2005). Qualitative research design: An interactive approach. Thousand Oaks, CA: SAGE.

McCright, A. M., & Dunlap, R. E. (2000). Challenging global warming as a social problem: An analysis of the conservative movement’s counter-claims. Social Problems, 47(4), 499–522.

Newman, T. P. (2016, February 11). Tracking the release of IPCC AR5 on Twitter: Users, comments, and sources following the release of the Working Group I Summary for Policymakers. Public Understanding of Science.

Nisbet, M. C. (2014). Disruptive ideas: Public intellectuals and their arguments for action on climate change. Wiley Interdisciplinary Reviews: Climate Change, 5(6), 809–823.

Nisbet, M. C., & Kotcher, J. E. (2009). A two-step flow of influence? Opinion-leader campaigns on climate change. Science Communication, 30(3), 328–354.

Noelle-Neumann, E., & Petersen, T. (2013). Alle, nicht jeder: Einführung in die Methoden der Demoskopie [All, not everyone: An introduction to the methods of public opinion research]. Heidelberg: Springer.

Olausson, U. (2009). Global warming—global responsibility? Media frames of collective action and scientific certainty. Public Understanding of Science, 18(4), 421–436.

O’Neill, S., Williams, H. T., Kurz, T., Wiersma, B., & Boykoff, M. (2015). Dominant frames in legacy and social media coverage of the IPCC Fifth Assessment Report. Nature Climate Change, 5(4), 380–385.

Oreskes, N. (2004). The scientific consensus on climate change. Science, 306(5702), 1686.

Peters, H. P. (1995). The interaction of journalists and scientific experts: Co-operation and conflict between two professional cultures. Media, Culture & Society, 17(1), 31–48.

Peters, H. P., & Heinrichs, H. (2005). Öffentliche Kommunikation über Klimawandel und Sturmflutrisiken: Bedeutungskonstruktion durch Experten, Journalisten und Bürger [Public communication about climate change and storm surge risks: Construction of meaning by experts, journalists, and citizens]. Jülich: Schriften des Forschungszentrums Jülich.

Post, S. (2008). Klimakatastrophe oder Katastrophenklima? Die Berichterstattung über den Klimawandel aus Sicht der Klimaforscher [Climate catastrophe or catastrophic climate? News coverage of climate change from the perspective of climate scientists]. Baden-Baden: Nomos.

Post, S. (2013). Wahrheitskriterien von Journalisten und Wissenschaftlern [Criteria of truth of journalists and scientists]. Baden-Baden: Nomos.

Post, S. (2016). Communicating science in public controversies: Strategic considerations of the German climate scientists. Public Understanding of Science, 25(1), 61–70.

Pulford, D. L. (1976). Follow-up study of science news accuracy. Journalism and Mass Communication Quarterly, 53(1), 119.

Rice, M., Henderson-Sellers, A., & Walkerden, G. (2015). Overcoming a diabolical challenge: Comparing journalists’ and researchers’ views on the performance of the media as a channel of climate change information. International Journal of Science Education, Part B, 5(1), 1–22.

Rössler, P. (2010). Inhaltsanalyse [Content analysis]. Tübingen: UTB.

Saunders, C. (2007). The national and the local: Relationships among environmental movement organisations in London. Environmental Politics, 16(5), 742–764.

Schäfer, M. S. (2012). Online communication on climate change and climate politics: A literature review. Wiley Interdisciplinary Reviews: Climate Change, 3(6), 527–543.

Schlembach, R. (2011). How do radical climate movements negotiate their environmental and their social agendas? A study of debates within the Camp for Climate Action (UK). Critical Social Policy, 31(2), 194–215.

Schlichting, I. (2013). Strategic framing of climate change by industry actors: A meta-analysis. Environmental Communication: A Journal of Nature and Culture, 7(4), 493–511.

Schmidt, A. (2012). Bewegungen, Gegenbewegungen, NGOs: Klimakommunikation zivilgesellschaftlicher Akteure [Movements, countermovements, NGOs: Climate communication of civil society actors]. In M. S. Schäfer (Ed.), Das Medien-Klima (pp. 69–94). Heidelberg: Springer.

Schmid-Petri, H., Häussler, T., & Adam, S. (2016). Different actors, different factors? A comparison of the news factor orientation between newspaper journalists and civil-society actors. Communications: The European Journal of Communication Research, 41(4), 399–419.

Schreier, M. (2012). Qualitative content analysis in practice. Thousand Oaks, CA: SAGE.

Sharman, A. (2014). Mapping the climate sceptical blogosphere. Global Environmental Change, 26, 159–170.

Sharman, A., & Howarth, C. (2016, March 11). Climate stories: Why do climate scientists and sceptical voices participate in the climate debate? Public Understanding of Science.

Tandoc, E. C., & Takahashi, B. (2014). Playing a crusader role or just playing by the rules? Role conceptions and role inconsistencies among environmental journalists. Journalism, 15(7), 889–907.

Tøsse, S. E. (2013). Aiming for social or political robustness? Media strategies among climate scientists. Science Communication, 35(1), 32–55.

Tuchman, G. (1972). Objectivity as strategic ritual: An examination of newsmen’s notions of objectivity. American Journal of Sociology, 77(4), 660–679.

Weaver, J. B. (2008). Experimental design. In W. Donsbach (Ed.), The international encyclopedia of communication (pp. 1652–1658). Malden, MA: Blackwell.

Weingart, P. (2001). Die Stunde der Wahrheit? Zum Verhältnis der Wissenschaft zu Politik, Wirtschaft und Medien in der Wissensgesellschaft [The hour of truth? On the relationship of science to politics, the economy, and the media in the knowledge society]. Weilerswist: Velbrück Wissenschaft.

Weingart, P., Engels, A., & Pansegrau, P. (2000). Risks of communication: Discourses on climate change in science, politics, and the mass media. Public Understanding of Science, 9(3), 261–283.

Wessler, H., & Brüggemann, M. (2012). Transnationale Kommunikation: Eine Einführung [Transnational communication: An introduction]. Wiesbaden: Springer.

Wozniak, A., Wessler, H., & Lück, J. (2016). Who prevails in the visual framing contest about the United Nations climate change conferences? Journalism Studies, 1–20.

Yin, R. K. (2003). Designing case studies. In Case study research: Design and methods (3d ed., pp. 19–56). Thousand Oaks, CA: SAGE.

Yin, R. K. (2013). Case study research: Design and methods. Thousand Oaks, CA: SAGE.

Notes:

(1.) Translated from German by the author.