Article

Philippe R. Girard

Haiti (known as Saint-Domingue until it gained its independence from France in 1804) had a noted economic and political impact on the United States during the era of the Haitian Revolution, when it forced U.S. statesmen to confront issues they had generally avoided, most prominently racism and slavery. But the impact of the Haitian Revolution was most tangible in areas like commerce, territorial expansion, and diplomacy. Saint-Domingue served as a staging ground for the French military and navy during the American Revolution and provided troops for the siege of Savannah in 1779. It became the United States’ second-largest commercial partner during the 1780s and 1790s. After Saint-Domingue’s slaves revolted in 1791, many of its inhabitants found refuge in the United States, most notably in Philadelphia, Charleston, and New Orleans. Fears (or hopes) that the slave revolt would spread to the United States were prevalent in public opinion. As Saint-Domingue achieved quasi-autonomous status under the leadership of Toussaint Louverture, it occupied a central place in the diplomacy of John Adams and Thomas Jefferson. The Louisiana Purchase was made possible in part by the failure of a French expedition to Saint-Domingue in 1802–1803. Bilateral trade declined after Haiti acquired its independence from France in 1804, but the new nation continued to loom large in the African-American imagination, and there were several attempts to use Haiti as a haven for U.S. freedmen. The U.S. diplomatic recognition of Haiti also served as a reference point for antebellum debates on slavery, the slave trade, and the status of free people of color in the United States.

Article

Sworn in as the 33rd President of the United States following Franklin D. Roosevelt’s death in April 1945, Harry S. Truman faced the daunting tasks of winning the war and ensuring future peace and stability. Chided by critics for his lack of foreign policy experience but championed by supporters for his straightforward decision-making, Truman guided the United States from World War to Cold War. The Truman presidency marked a new era in American foreign relations, with the United States emerging from World War II unmatched in economic strength and military power. The country assumed a leadership position in a postwar world primarily shaped by growing antagonism with the Soviet Union. Truman pursued an interventionist foreign policy that took measures to contain Soviet influence in Europe and stem the spread of communism in Asia. Under his leadership, the United States witnessed the dawn of the atomic age, approved billions of dollars in economic aid to rebuild Europe, supported the creation of multilateral organizations such as the United Nations and North Atlantic Treaty Organization, recognized the state of Israel, and intervened in the Korean peninsula. The challenges Truman confronted and the policies he implemented laid the foundation for 20th-century US foreign relations throughout the Cold War and beyond.

Article

Henry Kissinger was the most famous and most controversial American diplomat of the second half of the 20th century. Escaping Nazi persecution in the 1930s, serving in the American Army of occupation in Germany after 1945, and then pursuing a successful academic career at Harvard University, Kissinger had already achieved national prominence as a foreign policy analyst and defense intellectual when he was appointed national security adviser by President Richard Nixon in January 1969. Kissinger quickly became the president’s closest adviser on foreign affairs and worked with Nixon to change American foreign policy in response to the domestic upheaval caused by the Vietnam War in the late 1960s and early 1970s. Nixon and Kissinger’s initiatives, primarily détente with the Soviet Union, the opening to the People’s Republic of China, and ending American involvement in the Vietnam War, received strong domestic support and helped to bring about Nixon’s re-election landslide in 1972. In the wake of the Watergate scandal, Nixon appointed Kissinger secretary of state in August 1973. As Nixon’s capacity to govern deteriorated, Kissinger assumed all but presidential powers, even putting American forces on alert during the Yom Kippur War and then engaging in “shuttle diplomacy” in the Middle East, achieving the first-ever agreements between Israel and Egypt and between Israel and Syria. Kissinger retained a dominating influence over foreign affairs during the presidency of Gerald Ford, even as he became a lightning rod for critics on both the left and right of the political spectrum. Although out of public office after 1977, Kissinger remained in the public eye as a foreign policy commentator, wrote three volumes of memoirs as well as other substantial books on diplomacy, and created a successful international business-consulting firm. His only governmental positions were as chair of the Commission on Central America in 1983–1984 and a brief stint on the 9/11 Commission in 2002.

Article

In its formulation of foreign policy, the United States takes account of many priorities and factors, including national security concerns, economic interests, and alliance relationships. An additional factor, whose significance has risen and fallen over time, is human rights, or more specifically violations of human rights. The extent to which the United States should consider such abuses, or seek to moderate them, has been and continues to be the subject of considerable debate.

Article

A fear of foreignness shaped the immigration and foreign policies of the United States up to the end of World War II. US leaders perceived nonwhite peoples of Latin America, Asia, and Europe as racially inferior, and feared that contact with them, even annexation of their territories, would invite their foreign mores, customs, and ideologies into US society. This belief in nonwhite peoples’ foreignness also influenced US immigration policy, as Washington codified laws that prohibited the immigration of nonwhite peoples to the United States, even as immigration was deemed a net gain for a US economy that was rapidly industrializing from the late 19th century to the first half of the 20th century. Ironically, this fear of foreignness fostered an aggressive US foreign policy for many of the years under study, as US leaders feared that European intervention into Latin America, for example, would undermine the United States’ regional hegemony. The fear of foreignness that seemed to oblige the United States to shore up its national security interests vis-à-vis European empires also demanded US intervention into the internal affairs of nonwhite nations. For US leaders, fear of foreignness was a two-sided coin: European aggression was encouraged by the internal instability of nonwhite nations, and nonwhite nations were unstable, and hence easy pickings for Europe’s empires, because their citizens were racially inferior. To forestall both of these simultaneous foreign threats, the United States increasingly embedded itself into the political and economic affairs of foreign nations. This interplay of opportunity (territorial acquisitions and the immigrants who fed US labor markets) and fear (European encroachment and the supposed racial inferiority of nonwhite peoples) lay at the root of the immigration and foreign policies of the United States up to 1945.

Article

Post-1945 immigration to the United States differed dramatically from America’s 19th- and earlier 20th-century immigration patterns, most notably in the sharp rise in the number of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for eastern and central Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the western hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world, its less restrictive immigration policies increasing the fluidity of the American population, with a substantial impact on American identity and domestic policy.

Article

Tanvi Madan

Policymakers and analysts have traditionally described US relations with India as moving from estrangement during the Cold War and immediate post–Cold War period to engagement after 1999. The reality has been more complex, interspersing periods of estrangement, indifference, and engagement, with the latter dominating the first two decades of the 21st century. The nature of the relationship has been determined by a variety of factors and actors, with American perceptions of India shaped by strategic and economic considerations as well as the exchange of ideas and people. The overall state of the US relationship with India after 1947 has been determined by where that country has fit into Washington’s strategic framework, and Delhi’s ability and willingness to play the role envisioned for it. When American and Indian policymakers have seen the other country as important and useful, they have sought to strengthen US-India relations. In those periods, they have also been more willing to manage the differences that have always existed between the two countries at the global, regional, and bilateral levels. But when strategic convergence between the two countries is missing, differences have taken center stage.

Article

The United States has engaged with Indigenous nations on a government-to-government basis via federal treaties representing substantial international commitments since the origins of the republic. The first treaties sent to the Senate for ratification under the Constitution of 1789 were treaties with Indigenous nations. Treaties with Indigenous nations provided the means by which approximately one billion acres of land entered the national domain of the United States prior to 1900, at an average price of seventy-five cents per acre; the United States confiscated or claimed another billion acres of Indigenous land without compensation. Despite subsequent efforts of American federal authorities to alter these arrangements, the weight of evidence indicates that the relationship remains primarily a nation-to-nation association. Integration of the history of federal relations with Indigenous nations with American foreign relations history sheds important new light on the fundamental linkages between these seemingly distinct state practices from the beginnings of the American republic.

Article

The US relationship with the Republic of Indonesia has gone through three distinct phases. From 1945 until 1966 Indonesia’s politics and foreign policy were driven by the imperatives of decolonization and nation building, dominated by its founding President Sukarno and cleaved by bitter rivalry between secular political forces, regional movements, Islamic parties and organizations, the Indonesian Communist Party (PKI), and the armed forces. In the aftermath of the September 30th Movement, an alleged coup attempt by the PKI, the armed forces under the leadership of General Suharto launched a campaign of mass murder in which hundreds of thousands of alleged Communists were killed and Sukarno was ousted. Suharto would rule Indonesia for the next thirty-two years (1966 to 1998). With the Cold War inside Indonesia effectively over and a staunchly anti-Communist and pro-US regime in power, US-Indonesian relations entered a long period of what one might call authoritarian development, in which US officials focused on political stability, supported the military’s heavy involvement in politics, encouraged pro-Western investment and development policies, and sought to downplay growing criticism of Suharto’s abysmal record on human rights, democracy, corruption, and the environment. The end of the Cold War reduced the strategic imperative of backing authoritarian rule in Indonesia, and over the course of the 1990s domestic opposition to Suharto steadily built among moderate Islamic forces, human rights and women’s activists, environmental campaigners, and a burgeoning pro-democracy movement. The Asian financial crisis, which began in the summer of 1997, accelerated the forces undermining Suharto’s rule, forcing his resignation in May 1998 and inaugurating a third phase of formally democratic politics, which continues into the 21st century. Since 1998 US policy has focused on regional economic and security cooperation, counterterrorism, trade relations, and countering the growing regional power of China.

Article

Thomas A. Reinstein

The United States has a rich history of intelligence in the conduct of foreign relations. Since the Revolutionary War, intelligence has been most relevant to U.S. foreign policy in two ways. Intelligence analysis helps to inform policy. Intelligence agencies also have carried out covert action—secret operations—to influence political, military, or economic conditions in foreign states. The American intelligence community has developed over a long period, and major changes to that community have often occurred because of contingent events rather than long-range planning. Throughout their history, American intelligence agencies have used intelligence gained from both human and technological sources to great effect. Often, U.S. intelligence agencies have been forced to rely on technological means of intelligence gathering for lack of human sources. Recent advances in cyberwarfare have made technology even more important to the American intelligence community. At the same time, the relationship between intelligence and national-security–related policymaking has often been dysfunctional. Indeed, though some American policymakers have used intelligence avidly, many others have used it haphazardly or not at all. Bureaucratic fights also have crippled the American intelligence community. Several high-profile intelligence failures tend to dominate the recent history of intelligence and U.S. foreign relations. Some of these failures were due to lack of intelligence or poor analytic tradecraft. Others came because policymakers failed to use the intelligence they had. In some cases, policymakers have also pressured intelligence officers to change their findings to better suit those policymakers’ goals. And presidents have often preferred to use covert action to carry out their preferred policies without paying attention to intelligence analysis. The result has been constant debate about the appropriate role of intelligence in U.S. foreign relations.

Article

International law is the set of rules, formally agreed by treaty or understood as customary, by which nation-states interact with each other in a form of international society. Across the history of U.S. foreign relations, international law has provided both an animating vision, or ideology, for various American projects of world order, and a practical tool for the advancement of U.S. power and interests. As the American role in the world changed from the late 18th century onward, so too did the role of international law in U.S. foreign policy. Initially, international law was a source of authority to which the weak American government could appeal on questions of independence, sovereignty, and neutrality. As U.S. power grew in the 19th and early 20th centuries, international law became variously a liberal project for the advancement of peace, a civilizational discourse for justifying violence and dispossession, and a bureaucratic and commercial tool for the expansion of empire. With the advent of formal inter-governmental organizations in the 20th century, the traditional American focus on neutrality faded, to be replaced by an emphasis on collective security. But as the process of decolonization diluted the strength of the United States and its allies in the parliamentary chambers of the world’s international organizations, Washington increasingly advanced its own interpretations of international law, and opted out of a number of international legal regimes. At the same time, Americans increasingly came to perceive international law as a vehicle to advance the human rights of individuals over the sovereign rights of states.

Article

Mary S. Barton and David M. Wight

The US government’s perception of and response to international terrorism has undergone momentous shifts since first focusing on the issue in the early 20th century. The global rise of anarchist and communist violence provided the impetus for the first major US government programs aimed at combating international terrorism: restrictive immigration policies targeting perceived radicals. By the 1920s, the State Department emerged as the primary government agency crafting US responses to international terrorism, generally combating communist terrorism through diplomacy and information-sharing partnerships with foreign governments. The 1979 Iranian hostage crisis marked the beginning of two key shifts in US antiterrorism policy: a heightened focus on combating Islamist terrorism and a willingness to deploy military force to this end. The terrorist attacks of September 11, 2001, led US officials to conceptualize international terrorism as a high-level national security problem, leading to US military invasions and occupations of Afghanistan and Iraq, a broader use of special forces, and unprecedented intelligence-gathering operations.

Article

Malcolm Byrne

Iran-Contra was a major political scandal in the late 1980s that nearly derailed a popular president and left American society deeply divided about its significance. Although the affair was initially portrayed as a rogue operation run by overzealous White House aides, subsequent evidence showed that the president himself was its driving force with the knowledge of his most senior advisers. Iran-Contra was a foreign policy scandal, but it also gave rise to a significant confrontation between the executive and legislative branches with constitutional implications for their respective roles, especially in foreign policy. The affair exposed significant limits on the ability of all three branches to ferret out and redress official wrongdoing. And the entire episode, a major congressional investigation concluded, was characterized by a remarkable degree of dishonesty and deception, reaching to the highest levels of government. For all these reasons, and in the absence of a clear legal or ethical conclusion (in contrast to Watergate), Iran-Contra left a scar on the American body politic that further eroded the public’s faith in government.

Article

Kelly J. Shannon

Historian James A. Bill famously described America’s relationship with Iran as a tragedy. “Few international relationships,” he wrote, “have had a more positive beginning than that which characterized Iranian-American contacts for more than a century.” The nations’ first diplomatic dealings in the 1850s resulted in a treaty of friendship, and although the U.S. government remained largely aloof from Iranian affairs until World War II, many Iranians saw Americans and the United States positively by the early 20th century. The United States became more deeply involved with Iran during the Second World War, and the two nations were close allies during the Cold War. Yet they became enemies following the 1979 Iranian Revolution. How did this happen? The events that led to the Islamic Republic of Iran dubbing the United States the “Great Satan” in 1979 do indeed contain elements of tragedy. By the late 19th century, Iran—known to Americans as “Persia” until the 1930s—was caught in the middle of the imperial “Great Game” between Great Britain and Russia. Although no European power formally colonized Iran, Britain and Russia developed “spheres of influence” in the country and meddled constantly in Iran’s affairs. As Iranians struggled to create a modern, independent nation-state, they looked to disinterested third parties for help in their struggle to break free from British and Russian control. Consequently, many Iranians came to see the United States as a desirable ally. Activities of individual Americans in Iran from the mid-19th century onward, ranging from Presbyterian missionaries who built hospitals and schools to economic experts who advised Iran’s government, as well as the United States’ own revolutionary and democratic history, fostered a positive view of the United States among Iranians. The two world wars drew the United States into more active involvement in the Middle East, and following both conflicts, the U.S. government defended Iran’s sovereignty against British and Soviet manipulation. The event that caused the United States to lose the admiration of many Iranians occurred in 1953, when the U.S. Central Intelligence Agency and the British Secret Intelligence Service staged a coup, which overthrew Iran’s democratically elected prime minister, Mohammad Mossadegh, because he nationalized Iran’s oil industry. The coup allowed Iran’s shah, Mohammad Reza Shah Pahlavi, to transform himself from a constitutional monarch into an absolute ruler. The 1953 coup, coupled with the subsequent decades of U.S. support for the Shah’s politically repressive regime, resulted in anti-American resentment that burst forth during the 1979 Iranian Revolution. The two nations have been enemies ever since. This article traces the origins and evolution of the U.S. relationship with Iran from the 19th through the early 21st centuries.

Article

Sophie Cooper

Irish and American histories are intertwined as a result of migration, mercantile and economic connections, and diplomatic pressures from governments and nonstate actors. The two fledgling nations were brought together by their shared histories of British colonialism, but America’s growth as an imperial power complicated any natural allegiances that were invoked across the centuries. Since the beginnings of that relationship in 1607 with the arrival of Irish migrants in America (both voluntary and forced) and the building of a transatlantic linen trade, the meaning of “Irish” has fluctuated in America, mirroring changes in both migrant patterns and international politics. The 19th century saw Ireland enter into Anglo-American diplomacy on both sides of the Atlantic, while the 20th century saw Ireland emerge from Britain’s shadow with the establishment of separate diplomatic connections between the United States and Ireland. American recognition of the newly independent Irish Free State was vital for Irish politicians on the world stage; however, the Free State’s increasingly isolationist policies during the 1930s to 1950s alienated its American allies. The final decade of the century, however, brought America and Ireland (including both Northern Ireland and the Republic of Ireland) closer than ever before. Throughout their histories, the Irish diasporas—both Protestant and Catholic—in America have played vital roles as pressure groups and fundraisers. The history of American–Irish relations therefore brings together governmental and nonstate organizations and unites political, diplomatic, social, cultural, and economic histories which are still relevant today.

Article

Justus D. Doenecke

For the United States, isolationism is best defined as avoidance of wars outside the Western Hemisphere, particularly in Europe; opposition to binding military alliances; and the unilateral freedom to act politically and commercially unrestrained by mandatory commitments to other nations. Until the controversy over American entry into the League of Nations, isolationism was never subject to debate. The United States could expand its territory, protect its commerce, and even fight foreign powers without violating its traditional tenets. Once President Woodrow Wilson sought membership in the League, however, Americans saw isolationism as a foreign policy option, not simply something taken for granted. A fundamental foreign policy tenet now became a faction, limited to a group of people branded as “isolationists.” Its high point came during the years 1934–1937, when Congress, noting the challenge of the totalitarian nations to the international status quo, passed the neutrality acts to insulate the country from global entanglements. Once World War II broke out in Europe, President Franklin D. Roosevelt increasingly sought American participation on the side of the Allies. Isolationists unsuccessfully fought FDR’s legislative proposals, beginning with repeal of the arms embargo and ending with the convoying of supplies to Britain. The America First Committee (1940–1941), however, so effectively mobilized anti-interventionist opinion as to make the president more cautious in his diplomacy. If the Japanese attack on Pearl Harbor permanently ended classic isolationism, by 1945 a “new isolationism” voiced suspicion of the United Nations, the Truman Doctrine, aid to Greece and Turkey, the Marshall Plan, the North Atlantic Treaty Organization, and U.S. participation in the Korean War. Yet, because the “new isolationists” increasingly advocated militant unilateral measures to confront Communist Russia and China, often doing so to advance the fortunes of the Republican party, they exposed themselves to charges of inconsistency and generally faded away in the 1950s. Since the 1950s, many Americans have opposed various military involvements—including the ones in Vietnam, Iraq, and Afghanistan—but few envision returning to an era when the United States avoids all commitments.

Article

Olivia L. Sohns

Moral, political, and strategic factors have contributed to the emergence and durability of the U.S.-Israel alliance. It took decades for American support for Israel to evolve from “a moral stance” to treating Israel as a “strategic asset” to adopting a policy of “strategic cooperation.” The United States supported Israel’s creation in 1948 not only because of the lobbying efforts of American Jews but also due to humanitarian considerations stemming from the Holocaust. Beginning in the 1950s, Israel sought to portray itself as an ally of the United States on the grounds that America and Israel were fellow liberal democracies and shared a common Judeo-Christian cultural heritage. By the mid-1960s, Israel was considered a strategic proxy of American power in the Middle East in the Cold War, while the Soviet Union armed the radical Arab nationalist states and endorsed Palestinian “people’s wars of national liberation” against Israel. Over the subsequent decades, Israel repeatedly sought to demonstrate that it was allied with the United States in opposing instability in the region that might threaten U.S. interests. Israel also sought to portray itself as a liberal democracy despite its continued occupation of territories that it conquered in the Arab-Israeli War of 1967. After the terrorist attacks of September 11, 2001, and the rise of regional instability and radicalism in the Middle East following the 2003 U.S. invasion of Iraq and the Arab Spring of 2011, Israel’s expertise in the realms of counterterrorism and homeland security provided a further basis for U.S.-Israel military-strategic cooperation. Although American and Israeli interests are not identical, and there have been disagreements between the two countries regarding the best means to secure comprehensive Arab-Israeli and Israeli-Palestinian peace, the foundations of the relationship are strong enough to overcome crises that would imperil a less robust alliance.

Article

Jennifer M. Miller

Over the past 150 years, the United States and Japan have developed one of the United States’ most significant international relationships, marked by a potent mix of cooperation and rivalry. After a devastating war, these two states built a lasting alliance that stands at the center of US diplomacy, security, and economic policy in the Pacific and beyond. Yet this relationship is not simply the product of economic or strategic calculations. Japan has repeatedly shaped American understandings of empire, hegemony, race, democracy, and globalization, because these two states have often developed in remarkable parallel with one another. From the edges of the international order in the 1850s and 1860s, both entered a period of intense state-building at home and imperial expansion abroad in the late 19th and early 20th centuries. These imperial ambitions violently collided in the 1940s in an epic contest to determine the Pacific geopolitical order. After its victory in World War II, the United States embarked on an unprecedented occupation designed to transform Japan into a stable and internationally cooperative democracy. The two countries also forged a diplomatic and security alliance that offered crucial logistical, political, and economic support to the United States’ Cold War quest to prevent the spread of communism. In the 1970s and 1980s, Japan’s rise as the globe’s second-largest economy caused significant tension in this relationship and forced Americans to confront the changing nature of national power and economic growth in a globalizing world. However, in recent decades, rising tensions in the Asia-Pacific have served to focus this alliance on the construction of a stable trans-Pacific economic and geopolitical order.

Article

John Quincy Adams was one of the most significant statesmen-intellectuals of the Early American Republic. Highly intelligent, well-traveled, and massively educated, Adams was a Christian nationalist who believed that the American Republic was destined to be a shining example of democracy and liberty to the rest of the world. He was profoundly influenced by his parents, John and Abigail, and embraced his father’s political philosophy, which was rooted in a written constitution and a strong three-branch government constrained by checks and balances. Adams served as US minister to several European nations before becoming secretary of state in 1817 and then the sixth president of the United States in 1824. He began life as a Federalist but strongly supported the foreign policies of the Jefferson and Madison administrations. The three pillars of his foreign policy were neutrality toward Europe, continental expansion, and hemispheric hegemony. Adams chaired the US delegation that negotiated the Treaty of Ghent in 1814 and was the driving force behind the Convention of 1818 and the Transcontinental Treaty of 1819. Adams partnered with President James Monroe in formulating the Monroe Doctrine in 1823, which canonized the principles of the two hemispheres, including European non-colonization in the Western hemisphere and US non-interference in European affairs. Domestically, Adams was a relentless exponent of the American System, in which the federal government would fund a system of internal improvements—turnpikes, canals, ports—that would create a national market and bind the various regions together by means of a national economy. In this, he was disappointed in part because he had the misfortune to be president when Jacksonian democracy was taking hold in America and distrust of federal power was growing. Defeated for re-election by Andrew Jackson in 1828, Adams briefly retired from public life but then accepted election to the House of Representatives in 1830, where he served until his death in 1848. In the House, he proved to be an avid opponent of the further extension of slavery into the territories, and ironically, of further continental expansion. He became convinced that a civil war was inevitable but held abolitionists at arm’s length because of their rejection of the Constitution as a means to achieve racial justice in America. Adams died with a deep sense of failure, believing that his earlier career as an expansionist had produced not an empire of liberty but an empire of slavery.

Article

Since the late 19th century, the relationship between journalists and the makers of US foreign policy has been both cooperative and contentious. Reporters depend on government officials for information about policy decisions and their implementation. The White House, the State Department, and the Pentagon use the news media to build support for their policies and, at times, to communicate directly with allies and adversaries. Since World War I, presidential administrations have developed increasingly sophisticated methods to manage the news and influence public understanding of international affairs. Wartime censorship has been one tool of news management. Self-censorship, however, has also affected coverage of international affairs, as journalists have voluntarily refrained from publishing information for fear of impairing national security or undermining support for US wartime or Cold War policies. Allegations of bias and sensationalism became acrimonious during the Vietnam War and have continued to shape the debate about accurate, critical, and legitimate reporting. Arguments over “fake news,” which became commonplace during the presidency of Donald J. Trump, have many precursors, as both journalists and government officials have been responsible for misleading or distorted news coverage of international affairs since the Spanish–American War.