Transparency in Journalism
Transparency is the most recently established ethical principle for professional journalists, even though its roots stretch back almost a century. Its emergence as a core journalistic ethic and value has been fueled mainly by three distinct yet interdependent developments. First, sociocultural developments have gradually increased both the availability of and the demand for information, including in areas such as politics and business. This instilled an expectation of a “right to know,” which also affected the journalistic institution. Second, the introduction of digital media technologies has provided more means to disclose information, interact with journalists, and witness news production. Third, ethical and normative discussions by journalists and scholars have promoted greater openness about journalism. Transparency has frequently been advocated as an effective way to combat the ongoing decline of trust in the news media and of its credibility. A central rationale for disclosing information and providing direct access to journalists and news organizations is that the audience will be able to ascertain which journalism it can trust to be true, or which journalism is superior. Particularly in times when the news media are labeled as fake or as lying to the public, transparency may indeed be an important mechanism by which the audience can hold journalism accountable. Yet, while the promise of transparency is an enticing prospect for the journalistic institution, empirical research has not quite been able to support the claim that transparency will indeed improve credibility and trust in the news media. However, transparency is a nascent ethic and practice in journalism and has only recently been officially recognized. Journalists and news organizations are still finding new ways to engage openly with the public, to show them the journalistic production process, and to build relationships with their communities. After all, building trust takes time and may only be achieved through a continuous effort to engage in an open, honest, and personal dialogue with the public.
The Political Economy of Financial Markets
Federico Maria Ferrara and Thomas Sattler
The relationship between politics and financial markets is central for many, if not most, political economy arguments. The existing literature focuses on the effect of domestic and international political interests, institutions, and policy decisions on returns and volatility in stock, bond, and foreign exchange markets. This research bears implications for three major debates in political science: the distributive effects of politics, globalization and state autonomy, and the political roots of economic credibility and its tensions with democratic accountability. While the study of politics and financial markets is complicated by several theoretical and empirical challenges, recent methodological innovations in political research provide a window of opportunity for the development of the field.
Reputation in Deterrence and Decision-Making
Vesna Danilovic, Joe Clare, and Colin Tucker
Reputation in the context of international relations is an actor’s attribute as assessed by others from its past behavior. Interest in how reputational concerns can affect the dynamics of conflict arose with the canonical works of Schelling and the subsequent wave of deterrence research. Early work was concerned primarily with Cold War strategic issues, with some exceptions, such as Huth’s 1988 analysis of reputation in a broader historical scope. Scholarly attention to the role of reputation declined in the immediate post-Cold War period, only to resurge in the 21st century. The last two decades witnessed a renaissance that unpacked the notion of reputation into several types and examined its influence in a number of areas, ranging from deterrence, compellence, and other types of militarized conflict to civil wars, alliance choices, and sanctions, as well as issues of compliance with international commitments in institutional and cooperative studies. Amid this richness of research, the role of reputation in deterrence and strategic conflict in general, where the concept originated, still draws the largest share of research as well as controversy. Besides several distinct conceptual and theoretical approaches, especially in the rationalist and psychological literatures, there is methodological diversity as well, encompassing formal-theoretic models, large-N quantitative analyses, and survey experiments.
Credibility and Trust in Journalism
Questions of media trust and credibility are widely discussed; numerous studies over the past 30 years show a decline in trust in the media as well as in institutions and experts. The subject has been discussed, and researched, since the period between World Wars I and II and is often returned to as new forms of technology and news consumption develop. However, trust levels, and what people trust, differ across countries. Part of the reason that trust in the media has received such extensive attention is the widespread view, shared by communication scholars and media development practitioners, that a well-functioning media is essential to democracy. But the discussion of solutions is further complicated because the academic research on media trust, both before and since the advent of online media, is fragmented, contradictory, and inconclusive. Further, it is not clear to what extent digital technology, and the loss of traditional signals of credibility, has confused audiences and damaged trust in media, and to what extent trust in media is related to worries about globalization, job losses, and economic inequality. Nor is it clear whether trust in one journalist or outlet can be generalized. This makes it difficult to know how to rebuild trust in the media, and although there are many efforts to do so, it is not clear which will work, or whether any will.
Fake News
Fake news is not new, but the American presidential election in 2016 placed the phenomenon squarely on the international agenda. Manipulation, disinformation, falsehood, rumors, conspiracy theories: the actions and behaviors frequently associated with the term have existed as long as humans have communicated. Nevertheless, new communication technologies have allowed for new ways to produce, distribute, and consume fake news, making it harder to determine which information to trust. Fake news has typically been studied along four lines: characterization, creation, circulation, and countering. How to characterize fake news has been a major concern in the research literature, as the definition of the term is disputed. By differentiating between intention and facticity, researchers have attempted to study different types of false information. Creation concerns the production of fake news, often driven by financial, political, or social motivations. Circulation refers to the different ways false information is disseminated and amplified, often through communication technologies such as social media and search engines. Lastly, countering addresses the multitude of approaches to detecting and combating fake news on different levels, from legal, financial, and technical measures to individuals’ media and information literacy and new fact-checking services.
Native Advertising
Native advertising has become an increasingly important revenue component for many online journalism publications. Because Web consumers engage in advertising avoidance strategies, advertisers have come to rely increasingly on paid advertising that resembles a website’s non-advertising content in format, appearance, and substance. On news websites, native advertising takes forms including sponsored content, sponsored homepage links, and sponsored article-referral links. The spread of native advertising in news content has led to concern that news consumers fail to recognize it as advertising, and to questions about whether it is unethical or deceptive. Contemporary native advertising is not the first content delivered alongside news that blurs the boundary between editorial and paid promotional content. Print advertorials, which took root in newspapers and magazines in the mid-20th century, are a direct analogue, but host-read ads on radio and television programs, text-based search engine result advertising, and newspaper special advertising sections can all be seen as advertising content designed to feel like non-paid content. However, because contemporary native advertising takes so many different forms, and because practices of disclosure to the user are so varied, there has been a rise in public concern and academic inquiry into its prevalence and effects. Native advertising on online news sites has generated a number of ethical concerns among practitioners, media critics, and consumers. On the production side, scholars and practitioners worry that creating content on behalf of, or in partnership with, advertisers may erode the norms of editorial independence that have governed media organizations’ practices for over half a century. Others are concerned that as consumers become accustomed to seeing articles produced with advertiser input, the credibility of news organizations and trust in their non-advertising content will decrease. Perhaps most prominent have been concerns that native advertising deliberately disables consumers’ ability to recognize advertising elements on a website, rendering advertisers and publishers liable for deceiving consumers. Research on native advertising has focused primarily on understanding how consumers detect and perceive it, with additional streams focused on descriptive analyses of native advertising content and on practitioner perspectives. Empirical studies show that many consumers do not recognize native advertising, and that the content is received and trusted quite differently by those who recognize it and those who do not. Scholars have also identified characteristics of content, disclosure practices, and individual traits that influence the likelihood of advertising recognition.
Facets of Quality in Qualitative Research
How to judge the quality of a piece of research is a question that permeates academic work. A clear conception of quality is the backbone of the design and conduct of research, as well as the basis for supervision, manuscript review, and the examination of papers. Qualitative methodologies justify their usefulness through the development of more sophisticated interpretations of specific study objects than can be achieved within other methodologies. However, it is difficult to arrive at a decision concerning the quality of a piece of research because there are many aspects, or facets, to consider. Academics develop their competence for reaching such decisions through experience, coupled with taking part in ways of reasoning about quality in qualitative research. Within this reasoning process there are a number of quality facets that should be considered. Each facet is grounded in various ways in different research traditions, and each has a foundational meaning but must also be assessed in relation to its limitations. From the perspective of the text as a whole, one must consider whether it shows an awareness of the consequences of the research assumptions and other aspects of internal consistency. Ethical considerations are yet another facet. From a narrower perspective, the interpretation of empirical data must capture rich meaning and present it in a structured way so that the interpretation can be clearly discerned. Another facet is how the interpretation contributes to existing knowledge about the issue or phenomenon that was studied. A classic question concerns the relation between an interpretation and its empirical basis, that is, its credibility or validity. Here there are various facets to consider: Is the interpretation well anchored in data, and how do specific data fit when they are put together? Another facet is whether the interpretation opens up new ways of understanding an issue or process and whether more sophisticated action can result from embracing the interpretation. A proper evaluation of qualitative research should consider all facets; however, the ultimate appraisal rests on the professional wisdom of the judge, developed through experience and participation in deliberations about what constitutes quality in academic texts. Determination of quality in qualitative research is not a mechanical process.
Signaling in Foreign Policy
Erik A. Gartzke, Shannon Carcelli, J Andres Gannon, and Jiakun Jack Zhang
Costly signaling offers a solution to many foreign policy dilemmas. Though most commonly studied in the context of the bargaining theory of war, signaling can also play an important role in nonzero-sum interactions such as those characterized by chicken (e.g., nuclear deterrence) and the prisoner’s dilemma (e.g., tariff reductions). A rich game-theoretic literature explains how actors can signal credibly in these situations. The most prominent strategies are sinking costs (actions that are costly ex ante) and tying hands (actions that are costly ex post). These strategies are theoretically elegant but have generated considerable controversy when studied empirically. One controversy concerns the existence of hand-tying domestic audience costs under different regime types. A second involves the degree to which sinking costs increases or decreases the risk of war. These controversies speak to the inherent tension between theories of strategic interaction and the measurement of their outcomes in the foreign policy process, where some events are off the equilibrium path and thus unobserved. The limited availability of foreign policy data was a major hindrance to earlier empirical efforts. Even as the quality of these data has improved, the focus has been on the outcomes of conflict (crisis onset, escalation to war, victory, defeat) rather than on strategy. This is problematic given that all crises are sequential in nature and that understanding the action–reaction cycle is vital to illuminating patterns of war, capitulation, and settlement. The frontier of research in the signaling literature lies in bridging this gap. The advent of big data and machine learning has enabled more systematic empirical analysis of strategic moves by various foreign policy actors, including signaling. Some researchers, such as Lindsay and Gartzke, are harnessing these new data and methods to explore the means of signaling. Other scholars are beginning to ask questions about the efficacy of public versus private signaling, the role of ambiguity, and dyadic versus multi-actor signaling. This new wave of research seeks to nudge signaling closer to the concerns of foreign policy practitioners.
Climate Change Communication in South Africa
In South Africa, one of the world’s most carbon-intensive economies and a society marked by gross social inequality, climate change is not a popular topic. As of 2018, more than half of the population had never heard of climate change, and only one in five South Africans believed that human activities lead to global warming. The communication of climate change in South Africa is influenced by the notorious inequality that the country still suffers decades after the end of the apartheid regime. Few South Africans are able to live in prosperity and security on par with life in industrialized nations: more than half of the population are considered poor, almost a third are chronically unemployed, and many work in carbon-intensive industries. The country’s prevalent inequality and its economic dependency on coal influence the way climate change is communicated and interpreted. Environmental NGOs, journalists, and scientists frequently set communication cues on climate change. However, their messages are largely circulated in newspapers catering to an urban and educated readership and resonate less with people living in rural areas or those who rely on employment in the coal and mining sector. In South Africa, most people hear about climate change through mass media, but journalists frequently lack the resources and training necessary to investigate climate change stories or to interact with local scientists. Environmental NGOs, in contrast, provide easily comprehensible communication cues for unspecialized journalists and often share similar worldviews and demographic backgrounds with dedicated environmental reporters. However, because Black South Africans are underrepresented among environmental journalists and because many affordable local newspapers cannot afford to hire specialized reporters, climate change is covered mostly in high-quality English-language outlets to which most people have no access. Moreover, environmental NGOs are frequently accused of prioritizing abstract ecological concerns, like climate change, over the interests of South African workers, a sentiment that is informed by the country’s history of racial injustice. Counterintuitively, living in a coal area is associated with higher climate change awareness and belief, likely because coal companies and trade unions conduct awareness-raising programs among their workers and because many residents experience the adverse impacts of coal mining and combustion firsthand.
Leading Through Conflict With Credibility
Thomas R. Hughes and Frank D. Davidson
Even though conflict is increasingly finding its way into school settings, there is evidence that school leaders do not view themselves as adequately equipped to meet the growing challenges. Training on short-term approaches to dealing with immediate issues may be available to practitioners through professional development offerings, but successfully and sustainably dealing with conflict involves more than getting through a tense moment. School leaders need to understand the causes, complexities, and time dynamics associated with ongoing conflict, which can take place at the personal as well as the organizational level. Beyond understanding these concepts, administrators need to build the capacity of their staff and their organizations to manage conflict. In addition to learning how to recognize the patterns and underlying causes that fuel adversity, administrators would do well to invest in long-term, conflict-diminishing approaches, such as building trust and improving interpersonal and organizational capacity, as ways to increase credibility within and outside the school itself. Finding people who can think critically and work adaptively to solve problems could prove a real advantage for educational leaders who strive to reduce workplace stress and create a more collegial climate in the schools they serve. Building trust and the ability to “come through” capably for others, even in tough situations, increases the credibility of leaders. Leading through conflict with this credibility in turn helps to sustain a positive climate in schools.
Experimentation in Physics in the 20th and 21st Centuries
Allan Franklin and Ronald Laymon
What roles do experiments play in physics? What is the epistemology of experiment, how are arguments for the credibility of results made, and what have experimental investigations in the 20th and 21st centuries looked like? The experimental enterprise is a complex and interdependent activity in which experiments yield results, and those results form the basis for answers to the questions the experiments were designed to address. It is worth examining the significance of exploratory experiments as well as the testing of theories, since experimenters often apply several strategies in arguing for the correctness of their results.
Source Credibility, Expertise, and Trust in Health and Risk Messaging
Kristin Page Hocevar, Miriam Metzger, and Andrew J. Flanagin
Our understanding and perceptions of source credibility significantly drive how we process health and risk messages, and may also influence relevant behaviors. Source credibility is believed to be shaped by perceptions of both source trustworthiness and source expertise, and the effect of credibility on changes in attitudes and behavior has been studied for decades in the persuasion literature. However, how we understand and define source credibility, particularly the dimension of expertise, has changed dramatically as social media and other online platforms are increasingly used to design and disseminate health messages. While earlier definitions of source credibility relied heavily on the source’s credentials as indicators of expertise on a given topic, more recent conceptualizations must also account for expertise held by laypeople who have experience with a health concern. This shifting conceptualization of source credibility may then impact both why and when people select, as well as how they perceive, process, and judge, health messaging across both novel and more traditional communication contexts.
Qualitative Methods in Intergroup Communication
Maggie J. Pitts
A researcher’s methodological approach is guided by his or her orientation toward three major philosophical assumptions: epistemological assumptions (i.e., what the nature of truth or knowledge is and how it can be pursued), ontological assumptions (i.e., what the nature of reality is and how it can be understood), and axiological assumptions (i.e., what the researcher’s position in the world is and responsibilities to it). Qualitative inquiry is largely guided by methodological beliefs that hold truth and reality as socially constructed, that value subjectivity over objectivity, that explore questions of “how” or “why” over questions of “what,” and that value participants’ voices and experiences. Broadly, qualitative inquiry seeks to describe the world as it is experienced and lived in by the participants under study. With respect to intergroup communication, qualitative inquiry takes an in-depth approach to understanding how members of a community or culture enact the behaviors of everyday life relevant to their group. Qualitative inquiry comprises several methodologies or methodological approaches including ethnography, autoethnography, and ethnography of communication; narrative paradigm and narrative theory; grounded theory; phenomenology; and case studies. Each methodology employs one method or a combination of methods to collect qualitative data. Methods refer to the tools used to collect data for the purposes of informing research and answering research questions. Qualitative methods include tools for the collection of descriptive, largely non-numeric data, including several types of interviews, observations, and interactions, and the collection of meaningful texts, documents, and objects. The collection of qualitative data often requires the researcher to establish a trusting relationship (rapport) with participants and gain an insider’s (emic) perspective of the context for study. 
In many cases, this is established through prolonged engagement in the field and carefully crafting interview questions that encourage detailed disclosures. Qualitative data are analyzed through a process of dissection, up-close examination, contrast, and comparison between units of data and then putting pieces back together in a synergetic way that represents data holistically. Most qualitative data analysis involves some form of coding: a process of identifying units of data that are relevant to the research questions, assigning them a short label or code, then clustering similar codes into increasingly abstract thematic categories. Researchers establish trustworthiness in qualitative reports through descriptive writing that preserves the voices of the participants, that reflects the social realities of the participants, and that contextualizes results within broader scholarly discourse by tying findings to previous theory or research. Qualitative research reports can take many forms that range from creative forms of writing and representation including poetry and photographs to more conventional forms of writing that fit expectations of social scientific academic journals. When applied to intergroup contexts, qualitative inquiry can make evident the language and communication patterns and social behaviors that distinguish one group from another. Field observations can reveal identity performance and group behavior. Interviews can solicit information from participants about in-group or out-group perceptions and experiences. And the collection and analysis of texts and documents can establish the means through which group identity is preserved and transferred.
Nuclear Strategy
Patrick M. Morgan
Nuclear strategy involves the production of nuclear weapons for political ends as well as the goals, means, and ways in which they are used or are planned to be used. The roots of nuclear strategy can be traced to World War II, when nuclear scientists, as well as American and British high-level officials, began thinking about how nuclear weapons could be harnessed. Several ideas emerged then that became central to nuclear strategy but were largely ignored in early postwar American military planning. Aside from war-fighting, the United States’s grand strategy and national security policy soon focused on containment as the way to deal with communism around the world. Containment was politically and intellectually well suited to emphasizing nuclear deterrence as a means of preventing the Cold War from escalating into war. The resulting theory and strategy were dominated by two problems: the stability problem and the credibility problem. As for actually fighting a nuclear war, strategies have included demonstration explosions to curb enemy military actions, preventive and preemptive attacks, and retaliation after being attacked. The design and implementation of nuclear postures and strategies have been beset by numerous deficiencies, such as accidents with nuclear weapons and delivery systems. Fortunately, nuclear strategy did not give rise to what many feared: a self-sustaining security dilemma that made insecurity overwhelming and impossible to dispel.
Perfect Deterrence Theory
Frank C. Zagare
Perfect deterrence theory and classical deterrence theory are two theoretical frameworks that have divergent empirical implications and dissimilar policy recommendations. In perfect deterrence theory, threat credibility plays a central role in the operation of both direct and extended deterrence relationships. But credible threats are neither necessary nor sufficient for deterrence to prevail, and under certain conditions, the presence of a credible threat may actually undermine deterrence. In perfect deterrence theory, the cost of conflict and status quo evaluations are also important strategic variables. Classical deterrence theorists tend to fixate on the former and ignore the latter. This theoretical oversight precludes a nuanced understanding of the dynamics of deterrence.
The Concept of Deterrence and Deterrence Theory
Patrick M. Morgan
Deterrence is an old practice, readily defined and described, widely employed but unevenly effective and of questionable reliability. Elevated to prominence after World War II and the arrival of nuclear weapons, deterrence became the central recourse for sustaining international and internal security and stability among and within states in an era of serious conflict. With regard to nuclear weapons in particular, but also to deal with non-nuclear violent conflict, deterrence has been employed to prevent (or at least limit) the destruction of states, societies, and ultimately humanity. The greatest success has been that no nuclear weapons have been used for destructive purposes since the end of World War II in 1945. Deterrence has been widely used below the nuclear level, but with very uneven results. It has been intensively studied and tested with respect to strategy in international relations, the maintenance of international stability, the conduct of violence and warfare in both international and domestic contexts, and political affairs. Since deterrence is the use of threats to block or reduce the infliction of serious harm, capacities for inflicting harm are readily maintained and periodically applied; the continued availability of such capabilities is therefore a source of ongoing concern and of a regular desire to at least do away with nuclear weapons and threats. A brief period at the end of the Cold War saw a serious effort to reduce reliance on deterrence, particularly nuclear deterrence, in international politics, but it was soon replaced by serious movement in the opposite direction. Yet efforts to reduce the need for and use of deterrence continue. Extensive efforts have been applied to the development of theories of deterrence, particularly to generating empirical theory in order to better understand and apply deterrence, but without arriving at widely accepted results. This is the result of the considerable complexity of the subject, of the activity involved, and of the behavior of practitioners. The conduct of deterrence is now broader and deeper than before. It is under greater pressure due to technological, political, and cultural developments, and it operates in a much more elaborate overall environment that includes space, cyberspace, and the oceans. Thus the goal of developing effective empirical theory on deterrence remains, at various levels, incompletely attained. The same is true of mastering deterrence in practice. Nevertheless, deterrence remains important and fascinating.