
Article

Racism and xenophobia, but also resilience and community building, characterize the return of thousands of Japanese Americans, or Nikkei, to the West Coast after World War II. Although the specific histories of different regions shaped the resettlement experiences of Japanese Americans, Los Angeles provides an instructive case study. For generations, the City of Angels has been home to one of the nation’s largest and most diverse Nikkei communities, and the ways in which Japanese Americans rebuilt their lives and institutions there resonate with the resettlement experience elsewhere. Before World War II, greater Los Angeles was home to a vibrant Japanese American population. First-generation immigrants, or Issei, and their American-born children, the Nisei, forged dynamic social, economic, cultural, and spiritual institutions in the face of various racial exclusions. World War II uprooted the community as Japanese Americans left behind their farms, businesses, and homes. In the best instances, they were able to entrust their property to neighbors or other sympathetic individuals. More often, the uncertainty of their future led Japanese Americans to sell off their property at prices far below market value. Upon the war’s end, thousands of Japanese Americans returned to Los Angeles, often to financial ruin. Back in the Los Angeles area, they continued to face deep-seated prejudice, compounded by an acute housing shortage. Without a place to live, they sought refuge in communal hostels set up in prewar institutions that had survived the war, such as Christian and Buddhist churches. Meanwhile, others found housing in temporary trailer camps set up by the War Relocation Authority (WRA), and later administered by the Federal Public Housing Authority (FPHA), in areas such as Burbank, Sun Valley, Hawthorne, Santa Monica, and Long Beach. Although some local religious groups and others welcomed the returnees, white homeowners, who viewed the settlement of Japanese Americans as a threat to their property values, often mobilized to protest the construction of these camps. The last of these camps closed in 1956, demonstrating the hardship some Japanese Americans still faced in reintegrating into society. Even when the returnees were able to leave the camps, they still faced racially restrictive housing covenants and, when those covenants were ruled unenforceable, exclusionary lending. Although new suburban enclaves of Japanese Americans eventually developed in areas such as Gardena, West Los Angeles, and Pacoima by the 1960s, the pathway to those destinations was far from easy. Ultimately, the resettlement of Japanese Americans in Los Angeles after their mass incarceration during World War II took place within the intertwined contexts of lingering anti-Japanese racism, Cold War politics, and the suburbanization of Southern California.

Article

Thomas A. Reinstein

The United States has a rich history of intelligence in the conduct of foreign relations. Since the Revolutionary War, intelligence has been most relevant to U.S. foreign policy in two ways. Intelligence analysis helps to inform policy. Intelligence agencies also have carried out covert action—secret operations—to influence political, military, or economic conditions in foreign states. The American intelligence community has developed over a long period, and major changes to that community have often occurred because of contingent events rather than long-range planning. Throughout their history, American intelligence agencies have used intelligence gained from both human and technological sources to great effect. Often, U.S. intelligence agencies have been forced to rely on technological means of intelligence gathering for lack of human sources. Recent advances in cyberwarfare have made technology even more important to the American intelligence community. At the same time, the relationship between intelligence and national security policymaking has often been dysfunctional. Indeed, though some American policymakers have used intelligence avidly, many others have used it haphazardly or not at all. Bureaucratic fights also have crippled the American intelligence community. Several high-profile intelligence failures tend to dominate the recent history of intelligence and U.S. foreign relations. Some of these failures were due to lack of intelligence or poor analytic tradecraft. Others came because policymakers failed to use the intelligence they had. In some cases, policymakers have also pressured intelligence officers to change their findings to better suit those policymakers’ goals. And presidents have often preferred to use covert action to carry out their preferred policies without paying attention to intelligence analysis. The result has been constant debate about the appropriate role of intelligence in U.S. foreign relations.

Article

Jennifer M. Miller

Over the past 150 years, the United States and Japan have developed one of the United States’ most significant international relationships, marked by a potent mix of cooperation and rivalry. After a devastating war, these two states built a lasting alliance that stands at the center of US diplomacy, security, and economic policy in the Pacific and beyond. Yet this relationship is not simply the product of economic or strategic calculations. Japan has repeatedly shaped American understandings of empire, hegemony, race, democracy, and globalization, because these two states have often developed in remarkable parallel with one another. From the edges of the international order in the 1850s and 1860s, both entered a period of intense state-building at home and imperial expansion abroad in the late 19th and early 20th centuries. These imperial ambitions violently collided in the 1940s in an epic contest to determine the Pacific geopolitical order. After its victory in World War II, the United States embarked on an unprecedented occupation designed to transform Japan into a stable and internationally cooperative democracy. The two countries also forged a diplomatic and security alliance that offered crucial logistical, political, and economic support to the United States’ Cold War quest to prevent the spread of communism. In the 1970s and 1980s, Japan’s rise as the globe’s second-largest economy caused significant tension in this relationship and forced Americans to confront the changing nature of national power and economic growth in a globalizing world. However, in recent decades, rising tensions in the Asia-Pacific have served to focus this alliance on the construction of a stable trans-Pacific economic and geopolitical order.

Article

An overview of Euro-American internal migration in the United States between 1940 and 1980 explores the overall population movement away from rural areas and toward cities and suburbs. Although the focus is on white Americans and their migrations, there are similarities to the Great Migration of African Americans, who continued to move out of the South during the mid-20th century. In the early period, the industrial areas in the North and West attracted most of the migrants. Mobilization for World War II dislodged rural dwellers long kept in place by low wages, political disfranchisement, and low educational attainment. The war also attracted significant numbers of women to urban centers in the North and West. After the war, migration increased, and white Americans became not just less rural but also increasingly suburban. The growth of suburbs throughout the country was prompted by racial segregation in housing, which made many suburban areas white and earmarked many urban areas for people of color. The result was incredible growth in suburbia: from 22 million people living in those areas in 1940 to three times that number by 1970. Later in the period, as the Steelbelt rusted, the rise of the West as a migration magnet was spurred by development strategies, federal investment in infrastructure, and military bases. Sunbelt areas made investments and stood ready to recruit industries and, of course, people, especially from Rustbelt areas in the North. By the dawn of the 21st century, half of the American population resided in suburbs.

Article

Megan Threlkeld

The issue of compulsory military service has been contested in the United States since before its founding. In a nation characterized by both liberalism and republicanism, there is an inherent tension between the idea that individuals should be able to determine their own destiny and the idea that all citizens have a duty to serve their country. Prior to the 20th century, conscription occurred mainly at the level of local militias, first in the British colonies and later in individual states. It was during the Civil War that the first federal drafts were instituted, both in the Union and the Confederacy. In the North, the draft was unpopular and largely ineffective. Congress revived national conscription when the United States entered World War I and established the Selective Service System to oversee the process. That draft ended when U.S. belligerency ended in 1918. The first peacetime draft was implemented in 1940; with the exception of one year, it remained in effect until 1973. Its most controversial days came during the Vietnam War, when thousands of people across the country demonstrated against it and, in some cases, outright refused to be inducted. The draft stopped with the end of the war, but in 1980, Congress reinstated compulsory Selective Service registration. More than two decades into the 21st century, male citizens and immigrant noncitizens are still required to register within thirty days of their eighteenth birthday. The very idea of “selective service” is ambiguous. It is selective because not everyone is conscripted, but it is compulsory because one can be prosecuted for failing to register or to comply with orders of draft boards. Especially during the Cold War, one of the system’s main functions was not to procure soldiers but to identify and exempt from service those men best suited for other endeavors framed as national service: higher education, careers in science and engineering, and even supporting families. That fact, combined with the decentralized nature of the Selective Service System itself, left the process vulnerable to the prejudices of local draft boards and meant that those most likely to be drafted were poor and nonwhite.

Article

Since the late 19th century, the relationship between journalists and the makers of US foreign policy has been both cooperative and contentious. Reporters depend on government officials for information about policy decisions and their implementation. The White House, the State Department, and the Pentagon use the news media to build support for their policies and, at times, to communicate directly with allies and adversaries. Since World War I, presidential administrations have developed increasingly sophisticated methods to manage the news and influence public understanding of international affairs. Wartime censorship has been one tool of news management. Self-censorship, however, has also affected coverage of international affairs, as journalists have voluntarily refrained from publishing information for fear of impairing national security or undermining support for US wartime or Cold War policies. Allegations of bias and sensationalism became acrimonious during the Vietnam War and have continued to shape the debate about accurate, critical, and legitimate reporting. Arguments over “fake news,” which became commonplace during the presidency of Donald J. Trump, have many precursors, as both journalists and government officials have been responsible for misleading or distorted news coverage of international affairs since the Spanish–American War.

Article

Franklin D. Roosevelt was US president in extraordinarily challenging times. The impact of both the Great Depression and World War II makes historians’ discussion of his approach to foreign relations highly contested and controversial. He was one of the most experienced people to hold the office, having served in the Wilson administration as Assistant Secretary of the Navy, completed two terms as Governor of New York, and held a raft of political offices. At heart, he was an internationalist who believed in an engaged and active role for the United States in the world. During his first two terms as president, Roosevelt had to temper his international engagement in response to a public and politicians who wanted to focus on domestic problems and were wary of the risks of involvement in conflict. As the world crisis deepened in the 1930s, his engagement revived. He adopted a gradualist approach to educating the American people in the dangers facing their country and led them to eventual participation in war and a greater role in world affairs. There were clearly mistakes in his diplomacy along the way, and his leadership often appeared flawed, with an ambiguous legacy founded on political expediency, expanded executive power, vague idealism, and a chronic lack of clarity in preparing Americans for postwar challenges. Nevertheless, his policies to prepare the United States for the coming war saw his country emerge from years of depression to become an economic superpower. Likewise, his mobilization of his country’s enormous resources, support of key allies, and the holding together of a “Grand Alliance” in World War II not only brought victory but saw the United States become a dominant force in the world. Ultimately, Roosevelt’s idealistic vision, tempered with a sound appreciation of national power, would transform the global position of the United States and inaugurate what Henry Luce described as “the American Century.”

Article

Canada has sometimes been called the United States’ attic: a useful feature, but one easily forgotten. Of all countries, it has historically resembled the United States most closely in terms of culture, geography, economy, society, politics, ideology, and, especially, history. A shared culture—literary, social, legal, and political—is a crucial factor in Canadian-American relations. Geography is at least as important: Canada provides the United States with strategic insulation to the north and enhances its geographic isolation to the east and west. North-south economic links are inevitable and very large. Canada has been a major recipient of American investment and, for most of the time since 1920, has been the United States’ principal trading partner. Prosperous and self-sufficient, it has seldom required American aid. There have been no overtly hostile official encounters since the end of the War of 1812, partly because many Americans tended to believe that Canadians would join the republic; when that did not occur, the United States accepted an independent but friendly Canada as a permanent, useful, and desirable neighbor—North America’s attic. The insulation the attic provided rested on a common belief in the rule of law, both domestic and international; liberal democracy; a federal constitution; liberal capitalism; and liberal international trade regimes. That said, the United States, with its large population, huge economy, and military power, insulates Canada from hostile external forces: an attack on Canada from outside the continent is hard to imagine without a simultaneous attack on the United States. Successive American and Canadian governments have reaffirmed the political status quo while favoring mutually beneficial economic and military linkages—bilateral and multilateral. Relations have traditionally been grounded in a negotiating style that is evidence-based, proceeding issue by issue. A sober diplomatic and political context sometimes frames irritations and exclamations, but even these have usually been defined and limited by familiarity. For example, there has always been anti-Americanism in Canada. Most often it consists of sentiments derived from the United States itself, channeled by cultural similarities. No American idea, good or bad, from liberalism to populism, fails to find an echo in Canada. How loud or how soft the echo is makes the difference.

Article

The United States was extremely reluctant to be drawn into the wars that erupted in Asia in 1937 and in Europe in 1939. Deeply disillusioned with the experience of World War I, when heavy trench-warfare casualties had resulted in a peace that many Americans believed betrayed the aims they had fought for, the United States sought to avoid all forms of entangling alliances. Deeply embittered by the Depression, which was widely blamed on international bankers and businessmen, Congress enacted legislation that sought to prevent these actors from drawing the country into another war. The American aim was neutrality, but the underlying strength of the United States made it too big to be impartial—a problem that Roosevelt had to grapple with as Germany, Italy, and Japan began to challenge the international order in the second half of the 1930s.

Article

In May 1906, museum workers from across the country gathered in New York City at the American Museum of Natural History for the first annual meeting of the American Association of Museums (AAM). Over the course of two days, AAM members elected officers, ratified a constitution, and shared ideas about how best to collect, store, and display objects and specimens. The meeting culminated with a resolution to create a formal partnership with the National Education Association (NEA). AAM members’ interest in linking their work with the NEA signified that by the early 20th century, most museum leaders agreed that educating the public was a priority. This commitment to education shaped exhibition and collecting practices as well as the services that museums provided, and it expanded the power of museum visitors and audiences. While administrators, curators, and exhibit preparers often agreed on the collective goal of educating the public, their approaches varied. How museum education was defined and assessed depended on the type of museum in which one was employed, and it changed over time in response to broader social, cultural, and political forces. By 1945, however, museums of all types had formalized and institutionalized their practices in ways that placed education at the core of their purpose and actions.

Article

The Japanese American Redress Movement refers to the various efforts of Japanese Americans from the 1940s to the 1980s to obtain restitution for their removal and confinement during World War II. This included judicial and legislative campaigns at local, state, and federal levels for recognition of government wrongdoing and compensation for losses, both material and immaterial. The push for redress originated in the late 1940s as the Cold War opened up opportunities for Japanese Americans to demand concessions from the government. During the 1960s and 1970s, Japanese Americans began to connect the struggle for redress with anti-racist and anti-imperialist movements of the time. Despite their growing political divisions, Japanese Americans came together to launch several successful campaigns that laid the groundwork for redress. During the early 1980s, the government increased its involvement in redress by forming a congressional commission to conduct an official review of the World War II incarceration. The commission’s recommendations of monetary payments and an official apology paved the way for the passage of the Civil Liberties Act of 1988 and other redress actions. Beyond its legislative and judicial victories, the redress movement also created a space for collective healing and generated new forms of activism that continue into the present.

Article

Melissa A. McEuen

The Second World War changed the United States for women, and women in turn transformed their nation. Over three hundred fifty thousand women volunteered for military service, while twenty times as many stepped into civilian jobs, including positions previously closed to them. More than seven million women who had not been wage earners before the war joined eleven million women already in the American work force. Between 1941 and 1945, an untold number moved away from their hometowns to take advantage of wartime opportunities, but many more remained in place, organizing home front initiatives to conserve resources, to build morale, to raise funds, and to fill jobs left by men who entered military service. The U.S. government, together with the nation’s private sector, instructed women on many fronts and carefully scrutinized their responses to the wartime emergency. The foremost message to women—that their activities and sacrifices would be needed only “for the duration” of the war—was both a promise and an order, suggesting that the war and the opportunities it created would end simultaneously. Social mores were tested by the demands of war, allowing women to benefit from the shifts and make alterations of their own. Yet dominant gender norms provided ways to maintain social order amidst fast-paced change, and when some women challenged these norms, they faced harsh criticism. Race, class, sexuality, age, religion, education, and region of birth, among other factors, combined to limit opportunities for some women while expanding them for others. However temporary and unprecedented the wartime crisis, American women would find that their individual and collective experiences from 1941 to 1945 prevented them from stepping back into a prewar social and economic structure. By stretching and reshaping gender norms and roles, World War II and the women who lived it laid solid foundations for the various civil rights movements that would sweep the United States and grip the American imagination in the second half of the 20th century.

Article

Gregory F. Domber

American policy makers have rarely elevated Eastern Europe to the pinnacle of American grand strategy. The United States’ and Eastern Europe’s histories, however, are intertwined through the exchange of people and shared experiences. In the Age of Revolution, Eastern Europeans traveled to the United States to fight for the same causes they championed at home: to break from imperial control and expand the rights of man. At the end of the 19th century, “New Immigrants” from Eastern Europe streamed into America’s expanding cities. When countries in the region moved to the forefront of American concerns during specific crises, Eastern European interests were regularly deemed secondary to larger American geopolitical interests. This held true for the settlement of World War I, the conclusion of World War II, and the entirety of the Cold War. Overall, including Eastern Europeans and Eastern Europe in the history of the United States adds essential nuance and texture to broader patterns in American relations, and more often than not it provides evidence of the limitations of American power as altered by competing powers and local conditions.

Article

Leopoldo Nuti and Daniele Fiorentino

Relations between Italy and the United States have gone through different stages, from the early process of nation-building during the 18th and the 19th centuries, to the close diplomatic and political alignment of the Cold War and the first two decades of the 21st century. Throughout these two and a half centuries, relations between the two states occasionally experienced some difficult moments—from the tensions connected to the mass immigration of Italians to the United States at the end of the 19th century, to the diplomatic clash at the Versailles Peace Conference at the end of World War I, culminating with the declaration of war by the Fascist government in December 1941. By and large, however, Italy and the United States have mostly enjoyed a strong relationship based on close cultural, economic, and political ties.

Article

Justus D. Doenecke

For the United States, isolationism is best defined as avoidance of wars outside the Western Hemisphere, particularly in Europe; opposition to binding military alliances; and the unilateral freedom to act politically and commercially, unrestrained by mandatory commitments to other nations. Until the controversy over American entry into the League of Nations, isolationism was never subject to debate. The United States could expand its territory, protect its commerce, and even fight foreign powers without violating its traditional tenets. Once President Woodrow Wilson sought membership in the League, however, Americans saw isolationism as a foreign policy option, not simply something taken for granted. A fundamental foreign policy tenet now became a factional position, held by a group of people branded as “isolationists.” Isolationism’s high point came during the years 1934–1937, when Congress, noting the challenge of the totalitarian nations to the international status quo, passed the neutrality acts to insulate the country from global entanglements. Once World War II broke out in Europe, President Franklin D. Roosevelt increasingly sought American participation on the side of the Allies. Isolationists unsuccessfully fought FDR’s legislative proposals, beginning with repeal of the arms embargo and ending with the convoying of supplies to Britain. The America First Committee (1940–1941), however, mobilized anti-interventionist opinion so effectively as to make the president more cautious in his diplomacy. If the Japanese attack on Pearl Harbor permanently ended classic isolationism, by 1945 a “new isolationism” voiced suspicion of the United Nations, the Truman Doctrine, aid to Greece and Turkey, the Marshall Plan, the North Atlantic Treaty Organization, and U.S. participation in the Korean War. Yet because the “new isolationists” increasingly advocated militant unilateral measures to confront Communist Russia and China, often to advance the fortunes of the Republican Party, they exposed themselves to charges of inconsistency and generally faded away in the 1950s. Since the 1950s, many Americans have opposed various military involvements—including those in Vietnam, Iraq, and Afghanistan—but few envision returning to an era when the United States avoided all commitments.

Article

Since the founding of the United States of America, which coincided with the height of the Atlantic slave trade, U.S. officials have based their relations with West Africa primarily on economic interests. Initially, these interests were established on the backs of slaves, as the Southern plantation economy quickly vaulted the United States to prominence in the Atlantic world. After the U.S. abolition of the slave trade in 1808, however, American relations with West Africa focused on the establishment of the American colony of Liberia as a place of “return” for formerly enslaved persons. Following the turn to “legitimate commerce” in the Atlantic and the U.S. Civil War, the United States largely withdrew from large-scale interaction with West Africa. Liberia remained the notable exception, where prominent Pan-African leaders like Edward Blyden, W. E. B. DuBois, and Marcus Garvey helped foster cultural and intellectual ties between West Africa and the Diaspora in the early 1900s. These ties to Liberia were deepened in the 1920s when the Firestone Rubber Corporation of Akron, Ohio, established a long-term lease to harvest rubber. World War II marked a significant increase in American presence and influence in West Africa. American attention remained focused on Liberia, where the war years saw the construction of infrastructure that would prove essential to Allied war efforts and to American security interests during the Cold War. After 1945, the United States competed with the Soviet Union in West Africa for influence and access to important economic and national security resources as African nations ejected colonial regimes across most of the continent. West African independence quickly demonstrated a turn from nationalism to ethnic nationalism, as civil wars engulfed several countries in the postcolonial, and particularly the post-Cold War, era. After a decade of withdrawal, American interest in West Africa revived with the need for alternative sources of petroleum and concerns about transnational terrorism following the attacks of September 11, 2001.

Article

Relations between the United States and Mexico have rarely been easy. Ever since the United States invaded its southern neighbor and seized half of its national territory in the 19th century, the two countries have struggled to establish a relationship based on mutual trust and respect. Over the two centuries since Mexico’s independence, the governments and citizens of both countries have played central roles in shaping each other’s political, economic, social, and cultural development. Although this process has involved—even required—a great deal of cooperation, relations between the United States and Mexico have more often been characterized by antagonism, exploitation, and unilateralism. This long history of tensions has contributed to the three greatest challenges that these countries face together today: economic development, immigration, and drug-related violence.

Article

During the 20th century, the black population of the United States transitioned from largely rural to mostly urban. In the early 1900s the majority of African Americans lived in rural, agricultural areas. Depictions of black people in popular culture often focused on pastoral settings, like the cotton fields of the rural South. But a dramatic shift occurred during the Great Migrations (1914–1930 and 1941–1970) when millions of rural black southerners relocated to US cities. Motivated by economic opportunities in urban industrial areas during World Wars I and II, African Americans opted to move to southern cities as well as to urban centers in the Northeast, Midwest, and West Coast. New communities emerged that contained black social and cultural institutions, and musical and literary expressions flourished. Black migrants who left the South exercised voting rights, sending the first black representatives to Congress in the 20th century. Migrants often referred to themselves as “New Negroes,” pointing to their social, political, and cultural achievements, as well as their use of armed self-defense during violent racial confrontations, as evidence of their new stance on race.

Article

Kathryn C. Statler

U.S.-French relations are long-standing, complex, and primarily cooperative in nature. Various crises have punctuated long periods of stability in the alliance, but after each conflict the Franco-American friendship emerged stronger than ever. Official U.S.-French relations began during the early stages of the American Revolution, when Louis XVI’s regime came to America’s aid by providing money, arms, and military advisers. French assistance, best symbolized by the Marquis de Lafayette, was essential in the revolution’s success. The subsequent French Revolution and Napoleon Bonaparte’s rise to power also benefited the United States when Napoleon’s woes in Europe and the Caribbean forced him to sell the entire Louisiana territory to the United States in 1803. Franco-American economic and cultural contacts increased throughout the 19th century, as trade between the two countries prospered and as Americans flocked to France to study art, architecture, music, and medicine. The French gift of the Statue of Liberty in the late 19th century solidified Franco-American bonds, which became even more secure during World War I. Indeed, during the war, the United States provided France with trade, loans, military assistance, and millions of soldiers, viewing such aid as repayment for French help during the American Revolution. World War II once again saw the United States fighting in France to liberate the country from Nazi control. The Cold War complicated the Franco-American relationship in new ways as American power waxed and French power waned. Washington and Paris clashed over military conflict in Vietnam, the Suez Crisis, and European security (the North Atlantic Treaty Organization or NATO, in particular) during the 1950s and 1960s. Ultimately, after French President Charles de Gaulle’s retirement, the Franco-American alliance stabilized by the mid-1970s and has flourished ever since, despite brief moments of crisis, such as the 2003 Second Gulf War in Iraq.

Article

The war against Japan (1941–1945) gave rise to a uniquely enduring alliance among the United States, Australia, and New Zealand. Rooted in overlapping geopolitical interests and shared Western traditions, the tripartite relationships forged in the struggles against fascism in World War II deepened as Cold War conflicts erupted in East and Southeast Asia. War in Korea drew the three Pacific democracies into a formal alliance, ANZUS. In the aftermath of defeat in Vietnam, however, American hegemony confronted new challenges, regionally and globally. A more fluid geopolitical environment replaced the alliance certainties of the early Cold War. ANZUS splintered but was not permanently broken. Thus tripartite relations ebbed and flowed from the attack on Pearl Harbor to the first decades of the “Pacific Century,” until the “war on terror” and, in a very different way, the “rise of China” revitalized trilateral cooperation and resuscitated the ANZUS agreement.