
Article

Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.

Article

American activists who challenged South African apartheid during the Cold War era extended their opposition to racial discrimination in the United States into world politics. US antiapartheid organizations worked in solidarity with forces struggling against the racist regime in South Africa and played a significant role in the global antiapartheid movement. More than four decades of organizing preceded the legislative showdown of 1986, when a bipartisan coalition in Congress overrode President Ronald Reagan’s veto to enact economic sanctions against the apartheid regime in South Africa. Adoption of sanctions by the United States, along with transnational solidarity with the resistance to apartheid by South Africans, helped prompt the apartheid regime to relinquish power and allow the democratic elections that brought Nelson Mandela and the African National Congress to power in 1994. Drawing on the tactics, strategies, and moral authority of the civil rights movement, antiapartheid campaigners mobilized public opinion while increasing African American influence in the formulation of US foreign policy. Long-lasting organizations such as the American Committee on Africa and TransAfrica called for boycotts and divestment while lobbying for economic sanctions. Through rallies, demonstrations, and nonviolent civil disobedience, antiapartheid activists made their voices heard on college campuses, in corporate boardrooms, in municipal and state governments, and in the halls of Congress. Cultural expressions of criticism and resistance served to reinforce public sentiment against apartheid. Novels, plays, movies, and music provided a way for Americans to connect to the struggles of those suffering under apartheid. By extending the moral logic of the movement for African American civil rights, American antiapartheid activists created a multicultural coalition that brought about institutional and governmental divestment from apartheid, prompted Congress to impose economic sanctions on South Africa, and increased the influence of African Americans regarding issues of race and American foreign policy.

Article

American policy toward the Arab-Israeli conflict has reflected dueling impulses at the heart of US-Middle East relations since World War II: growing support for Zionism and Israeli statehood on the one hand, and the need for cheap oil resources and strong alliances with Arab states on the other, unfolding alongside the ebb and flow of concerns over Soviet influence in the region during the Cold War. These tensions have tracked with successive Arab–Israeli conflagrations, from the 1948 war through the international conflicts of 1967 and 1973, as well as shifting modes of intervention in Lebanon, and more recently, the Palestinian uprisings in the occupied territories and several wars on the Gaza Strip. US policy has been shaped by diverging priorities in domestic and foreign policy, a halting recognition of the need to tackle Palestinian national aspirations, and a burgeoning peace process that has drawn American diplomats into the position of mediating between the parties. Against the backdrop of regional upheaval, this long history of involvement continues into the 21st century as the unresolved conflict between Israel and the Arab world faces a host of new challenges.

Article

Jennifer Hoyt

Relations between the United States and Argentina can be best described as a cautious embrace punctuated by moments of intense frustration. Although never the center of U.S.–Latin American relations, Argentina has attempted to create a position of influence in the region. As a result, the United States has worked with Argentina and other nations of the Southern Cone—the region of South America that comprises Uruguay, Paraguay, Argentina, Chile, and southern Brazil—on matters of trade and economic development as well as hemispheric security and leadership. While Argentina has attempted to assert its position as one of Latin America’s most developed nations and therefore a regional leader, the equal partnership sought from the United States never materialized for the Southern Cone nation. Instead, competition for markets and U.S. interventionist and unilateral tendencies kept Argentina from attaining the influence and wealth it so desired. At the same time, the United States saw Argentina as an unreliable ally too sensitive to the pull of its volatile domestic politics. The two nations enjoyed moments of cooperation in World War I, the Cold War, and the 1990s, when Argentine leaders could balance this particular external partnership with internal demands. Yet at these times Argentine leaders found themselves walking a fine line as detractors back home saw cooperation with the United States as a violation of their nation’s sovereignty and autonomy. There has always been potential for a productive partnership, but each side’s intransigence and unique concerns limited this relationship’s accomplishments and led to a historical imbalance of power.

Article

Tyson Reeder

The United States has shared an intricate and turbulent history with Caribbean islands and nations since its inception. In its relations with the Caribbean, the United States has displayed the dueling tendencies of imperialism and anticolonialism that characterized its foreign policy with South America and the rest of the world. For nearly two and a half centuries, the Caribbean has stood at the epicenter of some of the US government’s most controversial and divisive foreign policies. After the American Revolution severed political ties between the United States and the British West Indies, US officials and traders hoped to expand their political and economic influence in the Caribbean. US trade in the Caribbean played an influential role in the events that led to the War of 1812. The Monroe Doctrine provided a blueprint for reconciling imperial ambitions in the Caribbean with anti-imperial sentiment. During the mid-19th century, Americans debated the propriety of annexing Caribbean islands, especially Cuba. After the Spanish-American War of 1898, the US government took an increasingly imperialist approach to its relations with the Caribbean, acquiring some islands as federal territories and augmenting its political, military, and economic influence in others. Contingents of the US population and government disapproved of such imperialistic measures, and beginning in the 1930s the US government softened, but did not relinquish, its influence in the Caribbean. Between the 1950s and the end of the Cold War, US officials wrestled with how to exert influence in the Caribbean in a postcolonial world. Since the end of the Cold War, the United States has intervened in Caribbean domestic politics to enhance democracy, continuing its oscillation between democratic and imperial impulses.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation-states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions on the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Patrick William Kelly

The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. Given Chile’s tradition of constitutional government and relative economic development, it established itself as a regional power player in Latin America. Unencumbered by the direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over its failures to enact meaningful social and economic reform polarized Chilean society, resulting in the election of Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked alongside anti-Allende Chileans to destabilize Allende’s government, which the Chilean military overthrew on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration—which came to power on a platform that rejected the Carter administration’s critique of Chile—eventually reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy looms large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.

Article

American history is replete with instances of counterinsurgency. This is unsurprising, given that the United States has always participated in empire building and has therefore needed to pacify resistance to expansion. For much of its existence, the U.S. has relied on its Army to pacify insurgents. While the U.S. Army used traditional military formations and technology to battle peer enemies, the same strategy did not succeed against opponents who relied on speed and surprise. Indeed, in several instances, insurgents sought to fight the U.S. Army on terms that rendered superior manpower and technology irrelevant. By adopting counterinsurgency as a strategy, the U.S. Army attempted to identify and neutralize insurgents and the infrastructure that supported them. Because discussions of counterinsurgency involve complex terms, readers are provided with simplified yet accurate definitions and explanations; understanding these terms also reveals the continuity between conflicts. While certain counterinsurgency measures worked during the American Civil War, the Indian Wars, and in the Philippines, the concept failed during the Vietnam War. The complexities of counterinsurgency require readers to familiarize themselves with its history, relevant scholarship, and terminology—in particular, counterinsurgency, pacification, and infrastructure.

Article

Michael J. Bustamante

The Cuban Revolution transformed the largest island nation of the Caribbean into a flashpoint of the Cold War. After overthrowing US-backed ruler Fulgencio Batista in early 1959, Fidel Castro established a socialist, anti-imperialist government that defied the island’s history as a dependent and dependable ally of the United States. But the Cuban Revolution is not only significant for its challenge to US interests and foreign policy prerogatives. For Cubans, it fundamentally reordered their lives, inspiring multitudes yet also driving thousands of others to migrate to Miami and other points north. Sixty years later, Fidel Castro may be dead and the Soviet Union may be long gone. Cuban socialism has become more hybrid in economic structure, and in 2014 the Cuban and US governments moved to restore diplomatic ties. But Cuba’s leaders continue to insist that “the Revolution,” far from a terminal political event, is still alive. Today, as the founding generation of Cuban leaders passes from the scene, “the Revolution” faces another important crossroads of uncertainty and reform.

Article

The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, the United States’ vastly greater postwar global security burdens made it averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the American use of a variety of tools and techniques to influence how it unfolded. Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, certain of these tools—notably covert espionage or overt military operations—allowed Cold War dynamics to envelop, intensify, and ultimately overtake local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy or economic or technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.

Article

Probably no American president was more thoroughly versed in matters of national security and foreign policy before entering office than Dwight David Eisenhower. As a young military officer, Eisenhower served stateside in World War I and then in Panama and the Philippines in the interwar years. On assignments in Washington and Manila, he worked on war plans, gaining an understanding that national security entailed economic and psychological factors in addition to manpower and materiel. In World War II, he commanded Allied forces in the European Theatre of Operations and honed his skills in coalition building and diplomacy. After the war, he oversaw the German occupation and then became Army Chief of Staff as the nation hastily demobilized. At the onset of the Cold War, Eisenhower embraced President Harry S. Truman’s containment doctrine and participated in the discussions leading to the 1947 National Security Act establishing the Central Intelligence Agency, the National Security Council, and the Department of Defense. After briefly retiring from the military, Eisenhower twice returned to public service at the behest of President Truman to assume the temporary chairmanship of the Joint Chiefs of Staff and then, following the outbreak of the Korean War, to become the first Supreme Allied Commander, Europe, charged with transforming the North Atlantic Treaty Organization into a viable military force. These experiences colored Eisenhower’s foreign policy views, which in turn led him to seek the presidency. He viewed the Cold War as a long-term proposition and worried that Truman’s military buildup would overtax finite American resources. He sought a coherent strategic concept that would be sustainable over the long haul without adversely affecting the free enterprise system and American democratic institutions. He also worried that Republican Party leaders were dangerously insular. As president, his New Look policy pursued a cost-effective strategy of containment by means of increased reliance on nuclear forces over more expensive conventional ones, sustained existing regional alliances and developed new ones, sought an orderly process of decolonization under Western guidance, resorted to covert operations to safeguard vital interests, and employed psychological warfare in the battle with communism for world opinion, particularly in the so-called Third World. His foreign policy laid the basis for what would become the overall American strategy for the duration of the Cold War. The legacy of that policy, however, was decidedly mixed. Eisenhower avoided the disaster of global war, but technological innovations did not produce the fiscal savings that he had envisioned. The NATO alliance expanded and mostly stood firm, but other alliances were more problematic. Decolonization rarely proceeded as smoothly as envisioned and caused conflict with European allies. Covert operations had long-term negative consequences. In Southeast Asia and Cuba, the Eisenhower administration’s policies bequeathed a poisoned chalice for succeeding administrations.

Article

Jeffrey F. Taffet

In the first half of the 20th century, and more actively in the post–World War II period, the United States government used economic aid programs to advance its foreign policy interests. US policymakers generally believed that support for economic development in poorer countries would help create global stability, which would limit military threats and strengthen the global capitalist system. Aid was offered on a country-by-country basis to guide political development; its implementation reflected views about how humanity had advanced in richer countries and how it could and should similarly advance in poorer regions. Humanitarianism did play a role in driving US aid spending, but it was consistently secondary to political considerations. Overall, while funding varied over time, amounts spent were always substantial. Between 1946 and 2015, the United States offered almost $757 billion in economic assistance to countries around the world—$1.6 trillion in inflation-adjusted 2015 dollars. Assessing the impact of this spending is difficult; there has long been disagreement among scholars and politicians about how much economic growth, if any, resulted from aid spending and similar disputes about its utility in advancing US interests. Nevertheless, for most political leaders, even without solid evidence of successes, aid often seemed to be the best option for constructively engaging poorer countries and trying to create the kind of world in which the United States could be secure and prosperous.

Article

Throughout US history, Americans have used ideas about gender to understand power, international relations, military behavior, and the conduct of war. Since Joan Wallach Scott called on scholars in 1986 to consider gender a “useful category of analysis,” historians have looked beyond traditional diplomatic and military sources and approaches to examine cultural sources, the media, and other evidence to try to understand the ideas that Americans have relied on to make sense of US involvement in the world. From casting weak nations as female to assuming that all soldiers are heterosexual males, Americans have deployed mainstream assumptions about men’s and women’s proper behavior to justify US diplomatic and military interventions in the world. State Department pamphlets describing newly independent countries in the 1950s and 1960s featured gendered imagery like the picture of a young Vietnamese woman on a bicycle that was meant to symbolize South Vietnam, a young nation in need of American guidance. Language in news reports and government cables, as well as film representations of international affairs and war, expressed gendered dichotomies such as protector and protected, home front and battlefront, strong and weak leadership, and stable and rogue states. These and other episodes illustrate how thoroughly gender shaped important dimensions of the character and the making of US foreign policy and historians’ examinations of diplomatic and military history.

Article

Henry Kissinger was the most famous and most controversial American diplomat of the second half of the 20th century. Escaping Nazi persecution in the 1930s, serving in the American Army of occupation in Germany after 1945, and then pursuing a successful academic career at Harvard University, Kissinger had already achieved national prominence as a foreign policy analyst and defense intellectual when he was appointed national security adviser by President Richard Nixon in January 1969. Kissinger quickly became the president’s closest adviser on foreign affairs and worked with Nixon to change American foreign policy in response to domestic upheaval caused by the Vietnam War in the late 1960s and early 1970s. Nixon and Kissinger’s initiatives, primarily détente with the Soviet Union, the opening to the People’s Republic of China, and ending American involvement in the Vietnam War, received strong domestic support and helped to bring about Nixon’s re-election landslide in 1972. In the wake of the Watergate scandal, Nixon appointed Kissinger secretary of state in August 1973. As Nixon’s capacity to govern deteriorated, Kissinger assumed all but presidential powers, even putting American forces on alert during the Yom Kippur War and then engaging in “shuttle diplomacy” in the Middle East, achieving the first-ever agreements between Israel and Egypt and between Israel and Syria. Kissinger retained a dominating influence over foreign affairs during the presidency of Gerald Ford, even as he became a lightning rod for critics on both the left and right of the political spectrum. Although out of public office after 1977, Kissinger remained in the public eye as a foreign policy commentator, wrote three volumes of memoirs as well as other substantial books on diplomacy, and created a successful international business-consulting firm. His only governmental positions were as chair of the Commission on Central America in 1983–1984 and a brief moment on the 9/11 Commission in 2002.

Article

In its formulation of foreign policy, the United States takes account of many priorities and factors, including national security concerns, economic interests, and alliance relationships. An additional factor, whose significance has risen and fallen over time, is human rights, or more specifically, violations of human rights. The extent to which the United States should consider such abuses or seek to moderate them has been and continues to be the subject of considerable debate.

Article

Post-1945 immigration to the United States differed dramatically from America’s earlier 20th- and 19th-century immigration patterns, most notably in the dramatic rise in the number of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for eastern and central Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the western hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world: its less restrictive immigration policies increased the fluidity of the American population, with a substantial impact on American identity and domestic policy.

Article

The United States has engaged with Indigenous nations on a government-to-government basis via federal treaties representing substantial international commitments since the origins of the republic. The first treaties sent to the Senate for ratification under the Constitution of 1789 were treaties with Indigenous nations. Treaties with Indigenous nations provided the means by which approximately one billion acres of land entered the national domain of the United States prior to 1900, at an average price of seventy-five cents per acre; the United States confiscated or claimed another billion acres of Indigenous land without compensation. Despite subsequent efforts of American federal authorities to alter these arrangements, the weight of evidence indicates that the relationship remains primarily a nation-to-nation association. Integrating the history of federal relations with Indigenous nations into American foreign relations history sheds important new light on the fundamental linkages between these seemingly distinct state practices from the beginnings of the American republic.

Article

Thomas A. Reinstein

The United States has a rich history of intelligence in the conduct of foreign relations. Since the Revolutionary War, intelligence has been most relevant to U.S. foreign policy in two ways. Intelligence analysis helps to inform policy. Intelligence agencies also have carried out covert action—secret operations—to influence political, military, or economic conditions in foreign states. The American intelligence community has developed over a long period, and major changes to that community have often occurred because of contingent events rather than long-range planning. Throughout their history, American intelligence agencies have used intelligence gained from both human and technological sources to great effect. Often, U.S. intelligence agencies have been forced to rely on technological means of intelligence gathering for lack of human sources. Recent advances in cyberwarfare have made technology even more important to the American intelligence community. At the same time, the relationship between intelligence and national-security–related policymaking has often been dysfunctional. Indeed, though some American policymakers have used intelligence avidly, many others have used it haphazardly or not at all. Bureaucratic fights also have crippled the American intelligence community. Several high-profile intelligence failures tend to dominate the recent history of intelligence and U.S. foreign relations. Some of these failures were due to lack of intelligence or poor analytic tradecraft. Others came because policymakers failed to use the intelligence they had. In some cases, policymakers have also pressured intelligence officers to change their findings to better suit those policymakers’ goals. And presidents have often preferred to use covert action to carry out their preferred policies without paying attention to intelligence analysis. The result has been constant debate about the appropriate role of intelligence in U.S. foreign relations.

Article

Mary S. Barton and David M. Wight

The US government’s perception of and response to international terrorism has undergone momentous shifts since first focusing on the issue in the early 20th century. The global rise of anarchist and communist violence provided the impetus for the first major US government programs aimed at combating international terrorism: restrictive immigration policies targeting perceived radicals. By the 1920s, the State Department emerged as the primary government agency crafting US responses to international terrorism, generally combating communist terrorism through diplomacy and information-sharing partnerships with foreign governments. The 1979 Iranian hostage crisis marked the beginning of two key shifts in US antiterrorism policy: a heightened focus on combating Islamist terrorism and a willingness to deploy military force to this end. The terrorist attacks of September 11, 2001, led US officials to conceptualize international terrorism as a high-level national security problem, leading to US military invasions and occupations of Afghanistan and Iraq, a broader use of special forces, and unprecedented intelligence-gathering operations.

Article

Malcolm Byrne

Iran-Contra was a major political scandal in the late 1980s that nearly derailed a popular president and left American society deeply divided about its significance. Although the affair was initially portrayed as a rogue operation run by overzealous White House aides, subsequent evidence showed that the president himself was its driving force with the knowledge of his most senior advisers. Iran-Contra was a foreign policy scandal, but it also gave rise to a significant confrontation between the executive and legislative branches with constitutional implications for their respective roles, especially in foreign policy. The affair exposed significant limits on the ability of all three branches to ferret out and redress official wrongdoing. And the entire episode, a major congressional investigation concluded, was characterized by a remarkable degree of dishonesty and deception, reaching to the highest levels of government. For all these reasons, and in the absence of a clear legal or ethical conclusion (in contrast to Watergate), Iran-Contra left a scar on the American body politic that further eroded the public’s faith in government.