
Article

Henry Kissinger was the most famous and most controversial American diplomat of the second half of the 20th century. Escaping Nazi persecution in the 1930s, serving in the American Army of occupation in Germany after 1945, and then pursuing a successful academic career at Harvard University, Kissinger had already achieved national prominence as a foreign policy analyst and defense intellectual when he was appointed national security adviser by President Richard Nixon in January 1969. Kissinger quickly became the president’s closest adviser on foreign affairs and worked with Nixon to change American foreign policy in response to the domestic upheaval caused by the Vietnam War in the late 1960s and early 1970s. Nixon and Kissinger’s initiatives, primarily détente with the Soviet Union, the opening to the People’s Republic of China, and ending American involvement in the Vietnam War, received strong domestic support and helped to bring about Nixon’s re-election landslide in 1972. In the wake of the Watergate scandal, Nixon appointed Kissinger secretary of state in August 1973. As Nixon’s capacity to govern deteriorated, Kissinger assumed all but presidential powers, even putting American forces on alert during the Yom Kippur War and then engaging in “shuttle diplomacy” in the Middle East, achieving the first-ever agreements between Israel and Egypt and between Israel and Syria. Kissinger retained a dominating influence over foreign affairs during the presidency of Gerald Ford, even as he became a lightning rod for critics on both the left and right of the political spectrum. Although out of public office after 1977, Kissinger remained in the public eye as a foreign policy commentator, wrote three volumes of memoirs as well as other substantial books on diplomacy, and created a successful international business-consulting firm. His only subsequent governmental positions were as chair of the Commission on Central America in 1983–1984 and a brief appointment to chair the 9/11 Commission in 2002.

Article

Throughout US history, Americans have used ideas about gender to understand power, international relations, military behavior, and the conduct of war. Since Joan Wallach Scott called on scholars in 1986 to consider gender a “useful category of analysis,” historians have looked beyond traditional diplomatic and military sources and approaches to examine cultural sources, the media, and other evidence to try to understand the ideas that Americans have relied on to make sense of US involvement in the world. From casting weak nations as female to assuming that all soldiers are heterosexual males, Americans have deployed mainstream assumptions about men’s and women’s proper behavior to justify US diplomatic and military interventions in the world. State Department pamphlets describing newly independent countries in the 1950s and 1960s featured gendered imagery, such as the picture of a young Vietnamese woman on a bicycle that was meant to symbolize South Vietnam, a young nation in need of American guidance. Language in news reports and government cables, as well as film representations of international affairs and war, expressed gendered dichotomies such as protector and protected, home front and battlefront, strong and weak leadership, and stable and rogue states. These and other episodes illustrate how thoroughly gender has shaped important dimensions of the character and making of US foreign policy, as well as historians’ examinations of diplomatic and military history.

Article

Daniel Sargent

Foreign economic policy involves the mediation and management of economic flows across borders. Over two and a half centuries, the context for U.S. foreign economic policy has transformed. Once a fledgling republic on the periphery of the world economy, the United States has become the world’s largest economy, the arbiter of international economic order, and a predominant influence on the global economy. Throughout this transformation, the making of foreign economic policy has entailed delicate tradeoffs between diverse interests—political and material, foreign and domestic, sectional and sectoral, and so on. Ideas and beliefs have also shaped U.S. foreign economic policy—from Enlightenment-era convictions about the pacifying effects of international commerce to late 20th-century convictions about the efficacy of free markets.

Article

From the revolutionary era to the post-9/11 years, public and private actors have attempted to shape U.S. foreign relations by persuading mass audiences to embrace particular policies, people, and ways of life. Although the U.S. government conducted wartime propaganda activities prior to the 20th century, it had no official propaganda agency until the Committee on Public Information (CPI) was formed in 1917. For the next two years, CPI aimed to generate popular support for the United States and its allies in World War I. In 1938, as part of its Good Neighbor Policy, the Franklin Roosevelt administration launched official informational and cultural exchanges with Latin America. Following American entry into World War II, the U.S. government created a new propaganda agency, the Office of War Information (OWI). Like CPI, OWI was disbanded once hostilities ended. But in the fall of 1945, to combat the threats of anti-Americanism and communism, President Harry S. Truman broke with precedent and ordered the continuation of U.S. propaganda activities in peacetime. After several reorganizations within the Department of State, all U.S. cultural and information activities came under the purview of the newly created U.S. Information Agency (USIA) in 1953. Following the dissolution of USIA in 1999, the State Department reassumed authority over America’s international information and cultural programs through its Office of International Information Programs.

Article

America’s Civil War became part of a much larger international crisis as European powers, happy to see the experiment in self-government fail in America’s “Great Republic,” took advantage of the situation to reclaim former colonies in the Caribbean and establish a European monarchy in Mexico. Overseas, in addition to their formal diplomatic appeals to European governments, both sides also experimented with public diplomacy campaigns to influence public opinion. Confederate foreign policy sought to win recognition and aid from Europe by offering free trade in cotton and aligning its cause with that of the aristocratic, anti-democratic governing classes of Europe. The Union, by contrast, appealed to liberal, republican sentiment abroad by depicting the war as a trial of democratic government and embracing emancipation of the slaves. The Union victory led to the withdrawal of European empires from the New World: Spain from Santo Domingo, France from Mexico, Russia from Alaska, and Britain from Canada. The destruction of slavery in the United States, meanwhile, hastened its end in Puerto Rico, Cuba, and Brazil.

Article

Public opinion has been part of US foreign relations in two key ways. As one would expect in a democracy, the American public has shaped the foreign policy of its government. No less significantly, the United States has sought to influence foreign public opinion as a tool of its diplomacy, a practice now known as public diplomacy. The US public has also been a target of foreign attempts at influence, with varying degrees of success. While analysis across the span of US history reveals a continuity of issues and approaches, public opinion gained unprecedented salience in the second decade of the 21st century, a salience not yet matched by scholarship.

Article

Jennifer M. Miller

Over the past 150 years, the United States and Japan have developed one of the United States’ most significant international relationships, marked by a potent mix of cooperation and rivalry. After a devastating war, these two states built a lasting alliance that stands at the center of US diplomacy, security, and economic policy in the Pacific and beyond. Yet this relationship is not simply the product of economic or strategic calculations. Japan has repeatedly shaped American understandings of empire, hegemony, race, democracy, and globalization, because these two states have often developed in remarkable parallel with one another. From the edges of the international order in the 1850s and 1860s, both entered a period of intense state-building at home and imperial expansion abroad in the late 19th and early 20th centuries. These imperial ambitions violently collided in the 1940s in an epic contest to determine the Pacific geopolitical order. After its victory in World War II, the United States embarked on an unprecedented occupation designed to transform Japan into a stable and internationally cooperative democracy. The two countries also forged a diplomatic and security alliance that offered crucial logistical, political, and economic support to the United States’ Cold War quest to prevent the spread of communism. In the 1970s and 1980s, Japan’s rise as the globe’s second-largest economy caused significant tension in this relationship and forced Americans to confront the changing nature of national power and economic growth in a globalizing world. However, in recent decades, rising tensions in the Asia-Pacific have served to focus this alliance on the construction of a stable trans-Pacific economic and geopolitical order.

Article

Tyson Reeder

The United States has shared an intricate and turbulent history with Caribbean islands and nations since its inception. In its relations with the Caribbean, the United States has displayed the dueling tendencies of imperialism and anticolonialism that have characterized its foreign policy toward South America and the rest of the world. For nearly two and a half centuries, the Caribbean has stood at the epicenter of some of the US government’s most controversial and divisive foreign policies. After the American Revolution severed political ties between the United States and the British West Indies, US officials and traders hoped to expand their political and economic influence in the Caribbean. US trade in the Caribbean played an influential role in the events that led to the War of 1812. The Monroe Doctrine provided a blueprint for reconciling imperial ambitions in the Caribbean with anti-imperial sentiment. During the mid-19th century, Americans debated the propriety of annexing Caribbean islands, especially Cuba. After the Spanish-American War of 1898, the US government took an increasingly imperialist approach to its relations with the Caribbean, acquiring some islands as federal territories and augmenting its political, military, and economic influence in others. Contingents of the US population and government disapproved of such imperialistic measures, and beginning in the 1930s the US government softened, but did not relinquish, its influence in the Caribbean. Between the 1950s and the end of the Cold War, US officials wrestled with how to exert influence in the Caribbean in a postcolonial world. Since the end of the Cold War, the United States has intervened in Caribbean domestic politics to promote democracy, continuing its oscillation between democratic and imperial impulses.

Article

From its inception as a nation in 1789, the United States has engaged in an environmental diplomacy that has included attempts to gain control of resources, as well as formal diplomatic efforts to regulate the use of resources shared with other nations and peoples. American environmental diplomacy has sought to gain control of natural resources, to conserve those resources for the future, and to protect environmental amenities from destruction. As an acquirer of natural resources, the United States has focused on arable land as well as on ocean fisheries, although around 1900, the focus on ocean fisheries turned into a desire to conserve marine resources from unregulated harvesting. The main 20th-century U.S. goal was to extend beyond its borders its Progressive-era desire to utilize resources efficiently, meaning the greatest good for the greatest number for the longest time. For most of the 20th century, the United States was the leader in promoting global environmental protection through the best science, especially emphasizing wildlife. Near the end of the century, U.S. government science policy was increasingly out of step with global environmental thinking, and the United States often found itself on the outside. Most notably, the attempts to address climate change moved ahead with almost every country in the world except the United States. While a few monographs focus squarely on environmental diplomacy, it is safe to say that historians have not come close to tapping the potential of the intersection of the environmental and diplomatic history of the United States.

Article

Best known as Abraham Lincoln’s secretary of state during the Civil War, William Henry Seward conducted full careers as a statesman, politician, and visionary of America’s future, both before and after that traumatic conflict. His greatest legacy, however, lay in his service as secretary of state, leading the diplomatic effort to prevent European intervention in the conflict. His success in that effort marked the margin between the salvation and the destruction of the Union. Beyond his work as a diplomat, Seward’s signature qualities of energy, optimism, ambition, and opportunism enabled him to assume a role in the Lincoln administration extending well beyond his formal duties as secretary of state. Those same qualities secured a close working relationship with the president as Seward overcame a rocky first few weeks in office to become Lincoln’s confidant and sounding board. Seward’s career in politics stretched from the 1830s until 1869. Throughout that time, he maintained a vision of a United States of America built on opportunity and free labor, powered by government’s active role in internal improvement and education. He foresaw a nation fated to expand across the continent and overseas, with expansion occurring peacefully as a result of American industrial and economic strength and its model of government. During his second term as secretary of state, under the Johnson administration, Seward attempted a series of territorial acquisitions in the Caribbean, the Pacific, and on the North American continent. The state of the post-war nation and its fractious politics precluded success in most of these attempts, but Seward succeeded in negotiating the purchase of Alaska in 1867 and securing Senate ratification of the treaty. In addition, Seward pursued a series of policies establishing paths followed later by US diplomats, including the Open Door in China and the acquisition of Hawaii and US naval bases in the Caribbean.

Article

Sophie Cooper

Irish and American histories are intertwined as a result of migration, mercantile and economic connections, and diplomatic pressures from governments and nonstate actors. The two fledgling nations were brought together by their shared histories of British colonialism, but America’s growth as an imperial power complicated any natural allegiances that were invoked across the centuries. Since the beginnings of that relationship in 1607, with the arrival of Irish migrants in America (both voluntary and forced) and the building of a transatlantic linen trade, the meaning of “Irish” has fluctuated in America, mirroring changes in both migrant patterns and international politics. The 19th century saw Ireland enter into Anglo-American diplomacy on both sides of the Atlantic, while the 20th century saw Ireland emerge from Britain’s shadow with the establishment of separate diplomatic connections between the United States and Ireland. American recognition of the newly independent Irish Free State was vital for Irish politicians on the world stage; however, the Free State’s increasingly isolationist policies from the 1930s to the 1950s alienated its American allies. The final decade of the century, however, brought America and Ireland (including both Northern Ireland and the Republic of Ireland) closer than ever before. Throughout their histories, the Irish diasporas—both Protestant and Catholic—in America have played vital roles as pressure groups and fundraisers. The history of American–Irish relations therefore brings together governmental and nonstate organizations and unites political, diplomatic, social, cultural, and economic histories that are still relevant today.

Article

Americans in and out of government have relied on media and popular culture to construct the national identity, frame debates on military interventions, communicate core values abroad, and motivate citizens around the world to act in prescribed ways. During the late 19th century, as the United States emerged as a world power and expanded overseas, Americans adopted an ethos of worldliness in their everyday lives, even as some expressed worry about the nation’s position on war and peace. During the interwar period of the 1920s and 1930s, though America failed to join the League of Nations and retreated from foreign engagements, the nation also increased cultural interactions with the rest of the world through the export of motion pictures, music, consumer products, food, fashion, and sports. The policies and character of the Second World War were in part shaped by propaganda that evolved from earlier information campaigns. As the United States confronted communism during the Cold War, the government sanitized its cultural weapons to win the hearts and minds of Americans, allies, enemies, and nonaligned nations. But some cultural producers dissented from America’s “containment policy,” refashioned popular media for global audiences, and sparked a change in Washington’s cultural-diplomacy programs. An examination of popular culture also shows how people in the “Third World” deftly used the media to encourage superpower action. In the 21st century, activists and revolutionaries can be considered the inheritors of this tradition because they use social media to promote their political agendas. In short, understanding the roles popular culture played as America engaged the world greatly expands our understanding of modern American foreign relations.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions in the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Euro-Americans existed firmly on the periphery of an Indigenous North America in 1763, hubristic claims of continental sovereignty notwithstanding. Nowhere is this reality clearer than in the Ohio Valley and Illinois Country. Try as it might, the post-1763 British Empire could not assume jurisdictional control over this space. Even to begin to try was a task requiring significant investment—both in terms of more systematic Indigenous diplomacy and in terms of reforming colonial political structures unfit to accommodate imperial western policy. North American officials understood the problems quite well and were willing to spearhead reform. Between 1763 and 1775 they supported increased investment to defray North American expenses. They called for programs that would end colonial corruption, something they feared undermined Indigenous diplomacy and made a mockery of the rule of law. Ultimately, they concluded that centralizing Indian affairs offered the best means by which to stabilize North America. Colonials (generally) and speculators and their surveyor corps (specifically) powerfully disagreed, however, seeing Indian country as an untapped resource and imperial restraints as threats to local autonomy. They rejected the idea of centralizing power over Indigenous affairs and used the rhetoric of British constitutional liberty to reframe corrupt behavior as something it emphatically was not.

Article

The United States has engaged with Indigenous nations on a government-to-government basis via federal treaties representing substantial international commitments since the origins of the republic. The first treaties sent to the Senate for ratification under the Constitution of 1789 were treaties with Indigenous nations. Treaties with Indigenous nations provided the means by which approximately one billion acres of land entered the national domain of the United States prior to 1900, at an average price of seventy-five cents per acre; the United States confiscated or claimed another billion acres of Indigenous land without compensation. Despite subsequent efforts of American federal authorities to alter these arrangements, the weight of evidence indicates that the relationship remains primarily a nation-to-nation association. Integrating the history of federal relations with Indigenous nations into American foreign relations history sheds important new light on the fundamental linkages between these seemingly distinct state practices from the beginnings of the American republic.

Article

On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. Unemployment was absorbed by the armed services and war work. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history. Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the workforce, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.

Article

Spencer D. Bakich

The Persian Gulf War of 1990–1991 was something of a paradox. From the American perspective, the war had the hallmarks of a resounding victory. Responding to a flagrant case of interstate aggression by Iraq against Kuwait, the George H. W. Bush administration assembled a substantial international coalition to deter further Iraqi attacks against its neighbors in the Gulf and to compel Saddam Hussein to quit Kuwait without war. When the latter proved infeasible, the United States led that coalition in forcibly ousting Iraq’s military from Kuwait, substantially degrading Iraqi combat power in the process. The war’s outcome resulted from an auspiciously altered geopolitical landscape at the end of the Cold War, the overwhelming superiority of American power vis-à-vis Iraq, and a US decision-making process that tightly knitted military and diplomatic objectives into a coherent—and coherently executed—wartime strategy. However, America’s historically lopsided victory in the Persian Gulf War proved fleeting. Iraq’s surviving military forces retained the capacity to crush domestic challenges to the Ba’athist regime and to threaten its Gulf neighbors. President Bush’s vision of a post-war new world order notwithstanding, Gulf security depended heavily on continuing military missions years after the Persian Gulf War ended. Despite wartime tactical and strategic successes, grand strategic success eluded the United States in the years after the war.

Article

While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.

Article

Evan D. McCormick

Since gaining independence in 1823, the states comprising Central America have had a front seat to the rise of the United States as a global superpower. Indeed, more so than anywhere else, the United States has sought to use its power to shape Central America into a system that heeds US interests and abides by principles of liberal democratic capitalism. Relations have been characterized by US power wielded freely by officials and non-state actors alike to override the aspirations of Central American actors in favor of US political and economic objectives: from the days of US filibusters invading Nicaragua in search of territory; to the occupations of the Dollar Diplomacy era, designed to maintain financial and economic stability; to the covert interventions of the Cold War era. For their part, the Central American states have, at various times, sought to challenge US hegemony, most effectively when coordinating their foreign policies to balance against US power. These efforts—even when not rejected by the United States—have generally been short-lived, hampered by economic dependency and political rivalries. The result is a history of US–Central American relations that wavers between confrontation and cooperation but is remarkable for the consistency of its main element: US dominance.

Article

The Soviet Union’s successful launch of the first artificial satellite, Sputnik 1, on October 4, 1957, captured global attention and achieved the initial victory in what would soon become known as the space race. This impressive technological feat and its broader implications for Soviet missile capability rattled the confidence of the American public and challenged the credibility of U.S. leadership abroad. With the U.S.S.R.’s launch of Sputnik, and later the first human spaceflight in 1961, U.S. policymakers feared that the public and political leaders around the world would view communism as a viable and even more dynamic alternative to capitalism, tilting the global balance of power away from the United States and toward the Soviet Union. Reactions to Sputnik confirmed what members of the U.S. National Security Council had predicted: the image of scientific and technological superiority had very real, far-reaching geopolitical consequences. By signaling Soviet technological and military prowess, Sputnik solidified the link between space exploration and national prestige, setting a course for nationally funded space exploration for years to come. For over a decade, both the Soviet Union and the United States funneled significant financial and personnel resources into achieving impressive firsts in space as part of a larger effort to win alliances in the Cold War contest for global influence. From a U.S. vantage point, the space race culminated in the first Moon landing in July 1969. In 1961, President John F. Kennedy had proposed Project Apollo, a lunar exploration program, as a tactic for restoring U.S. prestige in the wake of Soviet cosmonaut Yuri Gagarin’s spaceflight and the failure of the Bay of Pigs invasion. To achieve Kennedy’s goal of sending a man to the Moon and returning him safely to Earth by the end of the decade, the United States mobilized a workforce in the hundreds of thousands. Project Apollo became the most expensive government-funded civilian engineering program in U.S. history, at one point consuming more than 4 percent of the federal budget. The United States’ substantial investment in winning the space race reveals the significant status of soft power in American foreign policy strategy during the Cold War.