1-8 of 8 Results for:

  • Keywords: foreignness
  • 20th Century: Pre-1945

Article

A fear of foreignness shaped the immigration and foreign policies of the United States up to the end of World War II. US leaders perceived nonwhite peoples of Latin America, Asia, and Europe as racially inferior, and feared that contact with them, even through annexation of their territories, would invite their foreign mores, customs, and ideologies into US society. This belief in nonwhite peoples’ foreignness also influenced US immigration policy, as Washington codified laws that prohibited the immigration of nonwhite peoples to the United States, even as immigration was deemed a net gain for a US economy that was rapidly industrializing from the late 19th century through the first half of the 20th century. Ironically, this fear of foreignness fostered an aggressive US foreign policy for many of the years under study, as US leaders feared that European intervention in Latin America, for example, would undermine the United States’ regional hegemony. The fear of foreignness that seemed to oblige the United States to shore up its national security interests vis-à-vis European empires also demanded US intervention into the internal affairs of nonwhite nations. For US leaders, fear of foreignness was a two-sided coin: European aggression was encouraged by the internal instability of nonwhite nations, and nonwhite nations were unstable, and hence easy pickings for Europe’s empires, because their citizens were racially inferior. To forestall both of these simultaneous foreign threats, the United States increasingly embedded itself in the political and economic affairs of foreign nations. The irony of opportunity (territorial acquisitions and immigrants who fed US labor markets) coupled with fear (European encroachment and the perceived racial inferiority of nonwhite peoples) lay at the root of the immigration and foreign policies of the United States up to 1945.

Article

The United States was extremely reluctant to get drawn into the wars that erupted in Asia in 1937 and in Europe in 1939. Deeply disillusioned with the experience of World War I, when heavy casualties in the trenches had yielded a peace that many Americans believed betrayed the aims they had fought for, the United States sought to avoid all forms of entangling alliances. Deeply embittered by the Depression, which was widely blamed on international bankers and businessmen, Congress enacted legislation that sought to prevent these actors from drawing the country into another war. The American aim was neutrality, but the underlying strength of the United States made it too big to be impartial, a problem that Roosevelt had to grapple with as Germany, Italy, and Japan began to challenge the international order in the second half of the 1930s.

Article

For almost a century and a half, successive American governments adopted a general policy of neutrality on the world stage, eschewing involvement in European conflicts and, after the Quasi-War with France, alliances with European powers. Neutrality, enshrined as a core principle of American foreign relations by the outgoing President George Washington in 1796, remained so for more than a century. Only in the 20th century did the United States emerge as a world power and a belligerent in the two world wars and the Cold War. This article explores the modern conflict between traditional American attitudes toward neutrality and the global agenda embraced by successive U.S. governments, beginning with entry into the First World War. With the United States immersed in these titanic struggles, traditional U.S. support for neutrality eroded considerably. During the First World War, the United States showed some sympathy for the predicaments of the remaining neutral powers. In the Second World War, it applied considerable pressure to those states still trading with Germany. During the Cold War, the United States was sometimes impatient with states’ choices to remain uncommitted in the global struggle, while at other times it showed understanding for neutrality and pursued constructive relations with neutral states. The wide varieties of neutrality in each of these conflicts complicated the choices of U.S. policy makers. Americans remained torn between the memory of their own long history of neutrality and a capacity to understand its potential value, on the one hand, and a predilection to approach conflicts as moral struggles, on the other.

Article

Over the first half of the 20th century, Rabbi Stephen S. Wise (1874–1949) devoted himself to solving the most controversial social and political problems of his day: corruption in municipal politics, abuse of industrial workers, women’s second-class citizenship, nativism and racism, and global war. He considered his activities an effort to define “Americanism” and apply its principles toward humanity’s improvement. On the one hand, Wise joined a long tradition of American Christian liberals committed to seeing their fellow citizens as their equals and to grounding this egalitarianism in their religious beliefs. On the other hand, he was in the vanguard of Reform Judaism, or what he called the Liberal Judaism movement, with its commitment to applying Jewish moral teachings to improve the world. His life’s work demonstrated that the two, liberal democracy and Liberal Judaism, went hand in hand. And while concerned with equality and justice, Wise’s Americanism had a democratic elitist character. His efforts to engage the public on the meaning of citizenship and the role of the state relied on his own Jewish, male, and economically privileged perspective as well as on those of an elite circle of political and business leaders, intellectual trendsetters, social scientists, philanthropists, labor leaders, and university faculty. In doing so, Wise drew upon Jewish liberal teachings, transformed America’s liberal tradition, and helped to remake America’s national understanding of itself.

Article

The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era, as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease. American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to shop for food less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change likely magnified the sense of transformation for the many Americans who experienced it.

Article

Patrick William Kelly

The relationship between Chile and the United States pivoted on the intertwined questions of how much political and economic influence Americans would exert over Chile and the degree to which Chileans could chart their own path. Given its tradition of constitutional government and relative economic development, Chile established itself as a regional power player in Latin America. Unencumbered by the direct US military interventions that marked the history of the Caribbean, Central America, and Mexico, Chile was a leader in movements to promote Pan-Americanism, inter-American solidarity, and anti-imperialism. But the advent of the Cold War in the 1940s, and especially the 1959 Cuban Revolution, brought an increase in bilateral tensions. The United States turned Chile into a “model democracy” for the Alliance for Progress, but frustration over the program’s failure to enact meaningful social and economic reform polarized Chilean society, resulting in the election of the Marxist Salvador Allende in 1970. The most contentious period in US-Chilean relations came during the Nixon administration, which worked alongside anti-Allende Chileans to destabilize Allende’s government; the Chilean military overthrew Allende on September 11, 1973. The Pinochet dictatorship (1973–1990), while anti-Communist, clashed with the United States over Pinochet’s radicalization of the Cold War and the issue of Chilean human rights abuses. The Reagan administration, which came to power on a platform that rejected the Carter administration’s critique of Chile, later reversed course and began to support the return of democracy to Chile, which took place in 1990. Since then, Pinochet’s legacy of neoliberal restructuring of the Chilean economy has loomed large, overshadowed perhaps only by his unexpected role in fomenting a global culture of human rights that has ended the era of impunity for Latin American dictators.

Article

The United States has engaged with Indigenous nations on a government-to-government basis via federal treaties representing substantial international commitments since the origins of the republic. The first treaties sent to the Senate for ratification under the Constitution of 1789 were treaties with Indigenous nations. Treaties with Indigenous nations provided the means by which approximately one billion acres of land entered the national domain of the United States prior to 1900, at an average price of seventy-five cents per acre; the United States confiscated or claimed another billion acres of Indigenous land without compensation. Despite subsequent efforts by American federal authorities to alter these arrangements, the weight of evidence indicates that the relationship remains primarily a nation-to-nation association. Integrating the history of federal relations with Indigenous nations into the history of American foreign relations sheds important new light on the fundamental linkages between these seemingly distinct state practices from the beginnings of the American republic.

Article

While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.