1–20 of 30 Results for:

  • 20th Century: Pre-1945
  • Foreign Relations and Foreign Policy

Article

In the midst of the long black freedom struggle, African American military participation in the First World War remains central to civil rights activism and to challenges to systems of oppression in the United States. As part of a long and storied tradition of military service to a nation that marginalized and attempted to subjugate a significant portion of its citizens, African American soldiers faced racism and segregation during the First World War on the home front and on the battlefields of France alike. The generations born since the end of the Civil War grew increasingly militant in resisting Jim Crow and insisting on full, not partial, citizenship in the United States, as the events in Houston in 1917 demonstrated. Support for the war effort within black communities in the United States was not universal, however, and some opposed participation in a war to “make the world safe for democracy” when that same democracy was denied to people of color. Activism by organizations like the National Association for the Advancement of Colored People (NAACP) challenged the War Department’s official and unofficial policy, creating avenues for a larger number of black officers in the US Army through the officers’ training camp established in Des Moines, Iowa. For African American soldiers sent to France with the American Expeditionary Forces (AEF), combat brought both successes and failures: the 93rd Division’s achievements became a source of race pride, while the 92nd Division’s record supplied the War Department with skewed evidence for rejecting any increase in the number of black officers and enlisted men. All-black Regular Army regiments, meanwhile, either remained in the United States or were sent to the Philippines rather than to the battlefields of Europe.
Soldiers’ return home was mixed: they were both celebrated and rejected for their service, as reflected in parades welcoming them home and in racial violence, including lynchings, between December 1918 and January 1920. As a result, the interwar years and the start of World War II roughly two decades later renewed the desire to use military service as a way to influence the US legal, social, cultural, and economic structures that limited African American citizenship.

Article

On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. Unemployment all but vanished as the armed services and war work absorbed the jobless. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history. Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the work force, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.

Article

Jennifer Hoyt

Relations between the United States and Argentina can be best described as a cautious embrace punctuated by moments of intense frustration. Although never the center of U.S.–Latin American relations, Argentina has attempted to create a position of influence in the region. As a result, the United States has worked with Argentina and other nations of the Southern Cone—the region of South America that comprises Uruguay, Paraguay, Argentina, Chile, and southern Brazil—on matters of trade and economic development as well as hemispheric security and leadership. While Argentina has attempted to assert its position as one of Latin America’s most developed nations and therefore a regional leader, the equal partnership sought from the United States never materialized for the Southern Cone nation. Instead, competition for markets and U.S. interventionist and unilateral tendencies kept Argentina from attaining the influence and wealth it so desired. At the same time, the United States saw Argentina as an unreliable ally too sensitive to the pull of its volatile domestic politics. The two nations enjoyed moments of cooperation in World War I, the Cold War, and the 1990s, when Argentine leaders could balance this particular external partnership with internal demands. Yet at these times Argentine leaders found themselves walking a fine line as detractors back home saw cooperation with the United States as a violation of their nation’s sovereignty and autonomy. There has always been potential for a productive partnership, but each side’s intransigence and unique concerns limited this relationship’s accomplishments and led to a historical imbalance of power.

Article

Black internationalism describes the political culture and intellectual practice forged in response to slavery, colonialism, and white imperialism. It is a historical and ongoing collective struggle against racial oppression rooted in global consciousness. While the expression of black internationalism has certainly changed across time and place, black liberation through collaboration has been and remains its ultimate goal. Since the emergence of black internationalism as a result of the transatlantic slave trade and during the Age of Revolutions, black women such as the poet Phillis Wheatley and evangelist Rebecca Protten have been at its forefront. Their writings and activism espoused an Afro-diasporic, global consciousness and promoted the cause of universal emancipation. During the 19th century, black women internationalists included abolitionists, missionaries, and clubwomen. They built on the work of their predecessors while laying the foundations for succeeding black women internationalists in the early 20th century. By World War I, a new generation of black women activists and intellectuals remained crucial parts of the International Council of Women, an organization founded by white suffragists from the United States, and the Universal Negro Improvement Association, a global organization formally led by Jamaican pan-Africanist Marcus Garvey. But they also formed an independent organization, the International Council of Women of the Darker Races (ICWDR). Within and outside of the ICWDR, black women from Africa and the African Diaspora faced and challenged discrimination on the basis of their sex and race. Their activism and intellectual work set a powerful precedent for a subsequent wave of black internationalism shaped by self-avowed black feminists.

Article

Tyson Reeder

The United States has shared an intricate and turbulent history with Caribbean islands and nations since its inception. In its relations with the Caribbean, the United States has displayed the dueling tendencies of imperialism and anticolonialism that characterized its foreign policy with South America and the rest of the world. For nearly two and a half centuries, the Caribbean has stood at the epicenter of some of the US government’s most controversial and divisive foreign policies. After the American Revolution severed political ties between the United States and the British West Indies, US officials and traders hoped to expand their political and economic influence in the Caribbean. US trade in the Caribbean played an influential role in the events that led to the War of 1812. The Monroe Doctrine provided a blueprint for reconciling imperial ambitions in the Caribbean with anti-imperial sentiment. During the mid-19th century, Americans debated the propriety of annexing Caribbean islands, especially Cuba. After the Spanish-American War of 1898, the US government took an increasingly imperialist approach to its relations with the Caribbean, acquiring some islands as federal territories and augmenting its political, military, and economic influence in others. Contingents of the US population and government disapproved of such imperialistic measures, and beginning in the 1930s the US government softened, but did not relinquish, its influence in the Caribbean. Between the 1950s and the end of the Cold War, US officials wrestled with how to exert influence in the Caribbean in a postcolonial world. Since the end of the Cold War, the United States has intervened in Caribbean domestic politics to enhance democracy, continuing its oscillation between democratic and imperial impulses.

Article

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation-states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions on the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Patrick William Kelly

The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. Given Chile’s tradition of constitutional government and relative economic development, it established itself as a regional power player in Latin America. Unencumbered by the direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over its failures to enact meaningful social and economic reform polarized Chilean society, resulting in the election of Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked, alongside anti-Allende Chileans, to destabilize Allende’s government, which the Chilean military overthrew on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration, which came to power on a platform critical of the Carter administration’s pressure on Chile, eventually reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy looms large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.

Article

The decolonization of the European overseas empires had its intellectual roots early in the modern era, but its culmination occurred during the Cold War that loomed large in post-1945 international history. This culmination thus coincided with the American rise to superpower status and presented the United States with a dilemma. While philosophically sympathetic to the aspirations of anticolonial nationalist movements abroad, the United States, bearing vastly greater postwar global security burdens, was averse to the instability that decolonization might bring and that communists might exploit. This fear, and the need to share those burdens with European allies who were themselves still colonial landlords, led Washington to proceed cautiously. The three “waves” of the decolonization process—medium-sized in the late 1940s, large in the half-decade around 1960, and small in the mid-1970s—prompted the United States to use a variety of tools and techniques to influence how the process unfolded. Prior to independence, this influence was usually channeled through the metropolitan authority then winding down. After independence, Washington continued and often expanded the use of these tools, in most cases on a bilateral basis. In some theaters, such as Korea, Vietnam, and the Congo, certain of these tools, notably covert espionage or overt military operations, meant that Cold War dynamics enveloped, intensified, and subsumed local decolonization struggles. In most theaters, other tools, such as traditional or public diplomacy or economic or technical development aid, kept the Cold War in the background as a local transition unfolded. In all cases, the overriding American imperative was to minimize instability and neutralize actors on the ground who could invite communist gains.

Article

Throughout US history, Americans have used ideas about gender to understand power, international relations, military behavior, and the conduct of war. Since Joan Wallach Scott called on scholars in 1986 to consider gender a “useful category of analysis,” historians have looked beyond traditional diplomatic and military sources and approaches to examine cultural sources, the media, and other evidence to try to understand the ideas that Americans have relied on to make sense of US involvement in the world. From casting weak nations as female to assuming that all soldiers are heterosexual males, Americans have deployed mainstream assumptions about men’s and women’s proper behavior to justify US diplomatic and military interventions in the world. State Department pamphlets describing newly independent countries in the 1950s and 1960s featured gendered imagery, like the picture of a young Vietnamese woman on a bicycle that was meant to symbolize South Vietnam, a young nation in need of American guidance. Language in news reports and government cables, as well as film representations of international affairs and war, expressed gendered dichotomies such as protector and protected, home front and battlefront, strong and weak leadership, and stable and rogue states. These and other episodes illustrate how thoroughly gender has shaped both the character and making of US foreign policy and historians’ examinations of diplomatic and military history.

Article

A fear of foreignness shaped the immigration and foreign policies of the United States up to the end of World War II. US leaders perceived nonwhite peoples of Latin America, Asia, and Europe as racially inferior, and feared that contact with them, even annexation of their territories, would invite their foreign mores, customs, and ideologies into US society. This belief in nonwhite peoples’ foreignness also influenced US immigration policy, as Washington codified laws that prohibited the immigration of nonwhite peoples to the United States, even as immigration was deemed a net gain for a US economy that was rapidly industrializing from the late 19th century to the first half of the 20th century. Ironically, this fear of foreignness fostered an aggressive US foreign policy for many of the years under study, as US leaders feared that European intervention into Latin America, for example, would undermine the United States’ regional hegemony. The fear of foreignness that seemed to oblige the United States to shore up its national security interests vis-à-vis European empires also demanded US intervention into the internal affairs of nonwhite nations. For US leaders, fear of foreignness was a two-sided coin: European aggression was encouraged by the internal instability of nonwhite nations, and nonwhite nations were unstable, and hence easy pickings for Europe’s empires, because their citizens were deemed racially inferior. To forestall both of these simultaneous foreign threats, the United States increasingly embedded itself in the political and economic affairs of foreign nations. The twin forces of opportunity (territorial acquisitions and the immigrants who fed US labor markets) and fear (of European encroachment and the supposed racial inferiority of nonwhite peoples) lay at the root of the immigration and foreign policies of the United States up to 1945.

Article

The United States has engaged with Indigenous nations on a government-to-government basis via federal treaties representing substantial international commitments since the origins of the republic. The first treaties sent to the Senate for ratification under the Constitution of 1789 were treaties with Indigenous nations. Treaties with Indigenous nations provided the means by which approximately one billion acres of land entered the national domain of the United States prior to 1900, at an average price of seventy-five cents per acre; the United States confiscated or claimed another billion acres of Indigenous land without compensation. Despite subsequent efforts of American federal authorities to alter these arrangements, the weight of evidence indicates that the relationship remains primarily a nation-to-nation association. Integrating the history of federal relations with Indigenous nations into American foreign relations history sheds important new light on the fundamental linkages between these seemingly distinct state practices from the beginnings of the American republic.

Article

Mary S. Barton and David M. Wight

The US government’s perception of and response to international terrorism has undergone momentous shifts since first focusing on the issue in the early 20th century. The global rise of anarchist and communist violence provided the impetus for the first major US government programs aimed at combating international terrorism: restrictive immigration policies targeting perceived radicals. By the 1920s, the State Department emerged as the primary government agency crafting US responses to international terrorism, generally combating communist terrorism through diplomacy and information-sharing partnerships with foreign governments. The 1979 Iranian hostage crisis marked the beginning of two key shifts in US antiterrorism policy: a heightened focus on combating Islamist terrorism and a willingness to deploy military force to this end. The terrorist attacks of September 11, 2001, led US officials to conceptualize international terrorism as a high-level national security problem, leading to US military invasions and occupations of Afghanistan and Iraq, a broader use of special forces, and unprecedented intelligence-gathering operations.

Article

Sophie Cooper

Irish and American histories are intertwined as a result of migration, mercantile and economic connections, and diplomatic pressures from governments and nonstate actors. The two fledgling nations were brought together by their shared histories of British colonialism, but America’s growth as an imperial power complicated any natural allegiances that were invoked across the centuries. Since the beginnings of that relationship in 1607, with the arrival of Irish migrants in America (both voluntary and forced) and the building of a transatlantic linen trade, the meaning of “Irish” has fluctuated in America, mirroring changes in both migrant patterns and international politics. The 19th century saw Ireland enter into Anglo-American diplomacy on both sides of the Atlantic, while the 20th century saw Ireland emerge from Britain’s shadow with the establishment of separate diplomatic connections between the United States and Ireland. American recognition of the newly independent Irish Free State was vital for Irish politicians on the world stage; however, the Free State’s increasingly isolationist policies from the 1930s to the 1950s alienated its American allies. The final decade of the century, however, brought America and Ireland (including both Northern Ireland and the Republic of Ireland) closer than ever before. Throughout their histories, the Irish diasporas in America, both Protestant and Catholic, have played vital roles as pressure groups and fundraisers. The history of American–Irish relations therefore brings together governmental and nonstate organizations and unites political, diplomatic, social, cultural, and economic histories which are still relevant today.

Article

Jennifer M. Miller

Over the past 150 years, the United States and Japan have developed one of the United States’ most significant international relationships, marked by a potent mix of cooperation and rivalry. After a devastating war, these two states built a lasting alliance that stands at the center of US diplomacy, security, and economic policy in the Pacific and beyond. Yet this relationship is not simply the product of economic or strategic calculations. Japan has repeatedly shaped American understandings of empire, hegemony, race, democracy, and globalization, because these two states have often developed in remarkable parallel with one another. From the edges of the international order in the 1850s and 1860s, both entered a period of intense state-building at home and imperial expansion abroad in the late 19th and early 20th centuries. These imperial ambitions violently collided in the 1940s in an epic contest to determine the Pacific geopolitical order. After its victory in World War II, the United States embarked on an unprecedented occupation designed to transform Japan into a stable and internationally cooperative democracy. The two countries also forged a diplomatic and security alliance that offered crucial logistical, political, and economic support to the United States’ Cold War quest to prevent the spread of communism. In the 1970s and 1980s, Japan’s rise as the globe’s second-largest economy caused significant tension in this relationship and forced Americans to confront the changing nature of national power and economic growth in a globalizing world. However, in recent decades, rising tensions in the Asia-Pacific have served to focus this alliance on the construction of a stable trans-Pacific economic and geopolitical order.

Article

The United States and the Kingdom of Joseon (Korea) established formal diplomatic relations after signing a “Treaty of Peace, Commerce, Amity, and Navigation” in 1882. Relations between the two states were not close, and the United States closed its legation in 1905 after Japan established a protectorate over Korea in the wake of the Russo-Japanese War. No formal relations existed for the following forty-four years, but American interest in Korea grew following the 1907 Pyongyang Revival and the rapid growth of Christianity there. Activists in the Korean independence movement kept the issue of Korea alive in the United States, especially during World War I and World War II, and pressured the American government to support the re-emergence of an independent Korea. Their activism, as well as a distrust of the Soviet Union, was among the factors that spurred the United States to suggest the joint occupation of the Korean peninsula in 1945, which subsequently led to the creation of the Republic of Korea (ROK) in the American zone and the Democratic People’s Republic of Korea (DPRK) in the Soviet zone. The United States withdrew from the ROK in 1948 only to return in 1950 to thwart the DPRK’s attempt to reunite the peninsula by force during the Korean War. The war ended in stalemate, with an armistice agreement in 1953. In the same year the United States and the ROK signed a military alliance, and American forces have remained on the peninsula ever since. While the United States has enjoyed close political and security relations with the ROK, formal diplomatic relations have never been established between the United States and the DPRK, and the relationship between the two has been marked by increasing tensions over the latter’s nuclear program since the early 1990s.

Article

Although the League of Nations was the first permanent organization established with the purpose of maintaining international peace, it built on the work of a series of 19th-century intergovernmental institutions. The destructiveness of World War I led American and British statesmen to champion a league as a means of maintaining postwar global order. In the United States, Woodrow Wilson followed his predecessors, Theodore Roosevelt and William Howard Taft, in advocating American membership in an international peace league, although Wilson’s vision for reforming global affairs was more radical. In Britain, public opinion had begun to coalesce in favor of a league from the outset of the war, though David Lloyd George and many of his Cabinet colleagues were initially skeptical of its benefits. However, Lloyd George was determined to establish an alliance with the United States and warmed to the league idea when Jan Christian Smuts presented a blueprint for an organization that served that end. The creation of the League was a predominantly British and American affair. Yet Wilson was unable to convince Americans to commit themselves to membership in the new organization. The Franco-British-dominated League enjoyed some early successes. Its high point was reached when Europe was infused with the “Spirit of Locarno” in the mid-1920s and the United States played an economically crucial, if politically constrained, role in advancing Continental peace. This tenuous basis for international order collapsed as a result of the economic chaos of the early 1930s, as the League proved incapable of containing the ambitions of revisionist powers in Europe and Asia.
Despite its ultimate limitations as a peacekeeping body, recent scholarship has emphasized the League’s relative successes in stabilizing new states, safeguarding minorities, managing the evolution of colonies into notionally sovereign states, and policing transnational trafficking; in doing so, it paved the way for the creation of the United Nations.

Article

For almost a century and a half, successive American governments adopted a general policy of neutrality on the world stage, eschewing involvement in European conflicts and, after the Quasi War with France, alliances with European powers. Neutrality, enshrined as a core principle of American foreign relations by the outgoing President George Washington in 1796, remained so for more than a century. Finally, in the 20th century, the United States emerged as a world power and a belligerent in the two world wars and the Cold War. This article explores the modern conflict between traditional American attitudes toward neutrality and the global agenda embraced by successive U.S. governments, beginning with entry into the First World War. With the United States immersed in these titanic struggles, the traditional U.S. support for neutrality eroded considerably. During the First World War, the United States showed some sympathy for the predicaments of the remaining neutral powers. In the Second World War it applied considerable pressure to those states still trading with Germany. During the Cold War, the United States was sometimes impatient with the choices of states to remain uncommitted in the global struggle, while at times it showed understanding for neutrality and pursued constructive relations with neutral states. The wide varieties of neutrality in each of these conflicts complicated the choices of U.S. policy makers. Americans remained torn between memory of their own long history of neutrality and a capacity to understand its potential value, on one hand, and a predilection to approach conflicts as moral struggles, on the other.

Article

While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.

Article

In 1835, Alexis de Tocqueville argued in Democracy in America that there were “two great nations in the world.” They had started from different historical points but seemed to be heading in the same direction. As expanding empires, they faced the challenges of defeating nature and constructing a civilization for the modern era. Although they adhered to different governmental systems, “each of them,” de Tocqueville declared, “seems marked out by the will of Heaven to sway the destinies of half the globe.” De Tocqueville’s words were prophetic. In the 19th century, Russian and American intellectuals and diplomats struggled to understand the roles that their countries should play in the new era of globalization and industrialization. Despite their differing understandings of how development should happen, both sides believed in their nation’s vital role in guiding the rest of the world. American adherents of liberal developmentalism often argued that a free flow of enterprise, trade, investment, information, and culture was the key to future growth. They held that the primary obligation of American foreign policy was to defend that freedom by pursuing an “open door” policy and free access to markets. They believed that the American model would work for everyone and that the United States had an obligation to share its system with the old and underdeveloped nations around it. A similar sense of mission developed in Russia. Russian diplomats had for centuries struggled to establish defensive buffers around the periphery of their empire. They had linked economic development to national security, and they had argued that their geographic expansion represented a “unification” of peoples as opposed to a conquering of them. In the 19th century, after the Napoleonic Wars and the failed Decembrist Revolution, tsarist policymakers fought to defend autocracy, orthodoxy, and nationalism from domestic and international critics. 
As in the United States, Imperial and later Soviet leaders envisioned themselves as the emissaries of the Enlightenment to the backward East and as protectors of tradition and order for the chaotic and revolutionary West. These visions of order clashed in the 20th century as the Soviet Union and the United States became superpowers. Conflicts began early, with the American intervention in the 1918–1921 Russian civil war. Tensions that had previously been based on differing geographic and strategic interests then assumed an ideological valence, as the fight between East and West became a struggle between the political economies of communism and capitalism. Foreign relations between the two countries experienced boom and bust cycles that took the world to the brink of nuclear holocaust and yet maintained a strategic balance that precluded the outbreak of global war for fifty years. This article will examine how that relationship evolved and how it shaped the modern world.

Article

The United States was extremely reluctant to get drawn into the wars that erupted in Asia in 1937 and Europe in 1939. Deeply disillusioned with the experience of World War I, when the large number of trench warfare casualties had resulted in a peace that many Americans believed betrayed the aims they had fought for, the United States sought to avoid all forms of entangling alliances. Deeply embittered by the Depression, which was widely blamed on international bankers and businessmen, Congress enacted legislation that sought to prevent these actors from drawing the country into another war. The American aim was neutrality, but the underlying strength of the United States made it too big to be impartial—a problem that Roosevelt had to grapple with as Germany, Italy, and Japan began to challenge the international order in the second half of the 1930s.