Malinda Maynor Lowery
The Lumbee tribe of North Carolina, with approximately 55,000 enrolled members, is the largest Indian community east of the Mississippi River. Lumbee history serves as a window into the roles that Native people have played in the struggle to implement the founding principles of the United States, not just as “the First Americans,” but as members of their own nations, operating in their own communities’ interests. When we see US history through the perspectives of Native nations, we see that the United States is not only on a quest to expand rights for individuals. Surviving Native nations like the Lumbees, who have their own unique claims on this land and its ruling government, are forcing Americans to confront the ways in which their stories, their defining moments, and their founding principles are flawed and inadequate. We know the forced removals, the massacres, the protests that Native people have lodged against injustice, yet such knowledge is not sufficient to understand American history. Lumbee history provides a way to honor, and complicate, American history by focusing not just on the dispossession and injustice visited upon Native peoples, but on how and why Native survival matters. Native nations are doing the same work as the American nation—reconstituting communities, thriving, and finding a shared identity with which to achieve justice and self-determination.
Since the late 19th century, Lumbee Indians have used segregation, war, and civil rights to maintain a distinct identity in the biracial South. The Lumbees’ survival as a people, a race, and a tribal nation shows that their struggle has revolved around autonomy, or the ability to govern their own affairs. They have sought local, state, and federal recognition to support that autonomy, but doing so has entangled the processes of survival with outsiders’ ideas about what constitutes a legitimate Lumbee identity. Lumbees continue to adapt to the constraints imposed on them by outsiders, strengthening their community ties through the process of adaptation itself. Lumbee people find their cohesion in the relentless fight for self-determination. Always, that struggle has mattered more than winning or losing a single battle.
Mark A. Granquist
Lutherans are one branch of Protestant Christianity and have been in America for almost 400 years. Historically they have immigrated to America from Lutheran countries in Europe, especially Germany and Scandinavia. Immigrants during the eighteenth century founded Lutheran congregations in the middle colonies, while westward expansion and further immigration from Europe centered Lutherans in the American Midwest. Lutherans formed regional and national denominations based on geography, ethnicity, and theological differences. In the twentieth century they continued to grow, and mergers reduced the number of denominations to two major bodies by 1988: the Evangelical Lutheran Church in America and the Lutheran Church-Missouri Synod. In 2015 there were close to seven million Lutherans in America.
Landon R. Y. Storrs
The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.
The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.
On February 19, 1942, President Franklin Delano Roosevelt signed Executive Order 9066 authorizing the incarceration of 120,000 Japanese Americans, living primarily on the West Coast of the continental United States. On August 10, 1988, President Ronald Reagan signed legislation authorizing formal apologies and checks for $20,000 to those still alive who had been unjustly imprisoned during WWII. In the interim period, nearly a half century, there were enormous shifts in memories of the events, mainstream accounts, and internal ethnic accountabilities. To be sure, there were significant acts of resistance, from the beginning of mass forced removal to the Supreme Court decisions toward the end of the war. But for a quarter of a century, between 1945 and approximately 1970, there was little to threaten a master narrative that posited Japanese Americans, led by the Japanese American Citizens League (JACL), as a once-embattled ethnic/racial minority that had transcended its victimized past to become America’s treasured model minority. The fact that the Japanese American community began effective mobilization for government apology and reparations in the 1970s only confirmed its emergence as a bona fide part of the American body politic. But where the earlier narrative extolled the memories of Japanese American war heroes and leaders of the JACL, memory making changed dramatically in the 1990s and 2000s. In the years since Reagan’s affirmation that “here we admit a wrong,” Japanese Americans have unleashed a torrent of memorials, museums, and monuments honoring those who fought the injustices and who swore they would resist current or future attempts to scapegoat other groups in the name of national security.
Ramón A. Gutiérrez
The history of Mexican immigration to the United States is best characterized as the movement of unskilled, manual laborers pushed northward mostly by poverty and unemployment and pulled into American labor markets by higher wages. Historically, most Mexicans have been economic immigrants seeking to improve their lives. In moments of civil strife, such as the Mexican Revolution (1910–1917) and the Cristero Revolt (1926–1929), many fled to the United States to escape religious and political persecution. Others, chafing under the weight of conservative, patriarchal, tradition-bound, rural agrarian societies, have migrated seeking modern values and greater personal liberties.
Since the last quarter of the 19th century, due to increasing numerical restrictions on the importation of immigrant workers from Europe, Asia, and Africa, American employers have turned to Mexico to recruit cheap, unskilled labor. Before 1942, Mexico minimally regulated emigration. While attentive to the safety and well-being of its émigrés, the Mexican government deemed out-migration a depletion of the country’s human capital. Monetary remittances helped compensate for this loss, contributing perhaps as much as 10 percent of the country’s yearly gross national product, vastly improving national life, particularly when emigrants returned with skills and consumer goods, seeking investment opportunities for their accumulated cash. Since the 1980s, single Mexican women have become a significant component of this migration, representing 40 percent of the total immigrant flow, employed mostly as service workers, domestics, and nannies, and less so in agricultural work. Mexicans also have gained authorized entry into the United States as highly skilled professionals, but their numbers remain relatively small in comparison to unskilled laborers. Beginning in 1942, and particularly in the 1990s, Mexican immigrants have been stigmatized as illegal aliens, subject to deportation as significant security threats to the nation, a rhetoric that intensified after the September 11, 2001, attacks on the United States by al-Qaeda.
Benjamin H. Johnson
When rebels captured the border city of Juárez, Mexico, in May 1911 and forced the abdication of President Porfirio Díaz shortly thereafter, they not only overthrew the western hemisphere’s oldest regime but also inaugurated the first social revolution of the 20th century. Driven by disenchantment with an authoritarian regime that catered to foreign investment, labor exploitation, and landlessness, revolutionaries dislodged Díaz’s regime, crushed an effort to resurrect it, and then spent the rest of the decade fighting one another for control of the nation. This struggle, recognized ever since as foundational for Mexican politics and identity, also had enormous consequences for the ethnic makeup, border policing, and foreign policy of the United States. Over a million Mexicans fled north during the 1910s, perhaps tripling the country’s Mexican-descent population, most visibly in places such as Los Angeles that had become overwhelmingly Anglo-American. US forces occupied Mexican territory twice, nearly bringing the two nations to outright warfare for the first time since the US–Mexican War of 1846–1848. Moreover, revolutionary violence and radicalism transformed the ways that much of the American population and its government perceived their border with Mexico, providing a rationale for a much more highly policed border and for the increasingly brutal treatment of Mexican-descent people in the United States. The Mexican Revolution was a turning point for Mexico, the United States, and their shared border, and for all who crossed it.
The military history of the American Revolution is more than the history of the War of Independence. The Revolution itself had important military causes. The experience of the Seven Years’ War (which started in 1754 in North America) conditioned British attitudes to the colonies after that conflict was over. From 1764, the British Parliament tried to raise taxes in America to pay for a new permanent military garrison. British politicians resisted colonial objections to parliamentary taxation at least partly because they feared that if the Americans established their right not to be taxed by Westminster, Parliament’s right to regulate colonial overseas trade would then be challenged. If the Americans broke out of the system of trade regulation, British ministers, MPs, and peers worried, then the Royal Navy would be seriously weakened.
The War of Independence, which began in 1775, was not the great American triumph that most accounts suggest. The British army faced a difficult task in suppressing a rebellion three thousand miles from Britain itself. French intervention on the American side in 1778 (followed by the Spanish in 1779 and the Dutch in 1780) made the task still more difficult. In the end, the war in America was won by the French as much as by the Americans. But in the wider imperial conflict, affecting the Caribbean, Central America, Europe, West Africa, and South Asia, the British fared much better. Even in its American dimension, the outcome was less clear-cut than we usually imagine. The British, the nominal losers, retained great influence in the independent United States, which in economic terms remained in an essentially dependent relationship with the former mother country.
Wayne Wei-Siang Hsieh
Despite the absence of a robust and well-articulated conception of strategy, American military and political leaders during the Civil War had an intuitive sense of how military operations should be coordinated with larger political ends. They also shared a general adherence to the straightforward strategic ideas of Antoine-Henri de Jomini, who emphasized the importance of concentrating one’s own military forces in opposition to dispersed opponents. In the case of the Union, however, victory would require not only a more sophisticated conception of strategy that superseded Jomini and coordinated military operations in geographically disconnected fronts but also the practical implementation of such ideas through well-selected subordinate commanders. It would take Ulysses S. Grant until the end of the war to complete all these tasks. In the case of the Confederacy, secessionist leaders faced the challenge of prioritizing different theaters in the face of their material inferiority to the Union. Robert E. Lee chose the plausible strategy of striking directly at Northern public opinion with aggressive operations waged by his own Army of Northern Virginia, but the final failure of the Confederate war effort raises fair questions about whether the Confederacy should have paid more attention to its western theater.
The relationship between the Church of Jesus Christ of Latter-day Saints—commonly called “Mormonism”—and the politics and culture of the United States is both contentious and intertwined. Historians have commonly observed that Mormonism is in many ways quintessentially American, bearing the marks of the Jacksonian period in which it was born. Its rejection of the denominational leadership of its day, its institution of a lay priesthood, and Joseph Smith’s insistence that revelation trumped scholarship and study all marked it as very much of its time and place, an America in which the authority of common people was exalted and traditional authority was suspect. And yet at the same time, Mormonism was suspect almost immediately upon its birth for those things that made it appear distinctly un-American: the divine power of its prophetic leaders, its rejection of the sole authority of the Bible, its clannishness and separatism, and its defiance of 19th-century sexual morality.
The history of Mormonism in America is in many ways a tug of war between these two impulses. At times the Mormons have embraced what makes them American, have proudly claimed elements of national identity, and have claimed that their faith most truly embodies the American creed. At other times, however, either because of hostility from other Americans or because of their own separatism, Mormons have distanced themselves from the national community and sought a separate community and peoplehood. Through the 19th century, because of the practice of polygamy and the theocratic government of the Utah territory, both Mormons and other Americans perceived a gap between their two communities, but that gap closed by the end of the century, when the federal government used force to eliminate those things Americans most objected to about the faith and Mormons began aggressively pursuing assimilation into American life. By the end of the 20th century, however, Mormonism’s cultural conservatism led both Mormons and other Americans to see that gap opening once more.
The Japanese American Redress Movement refers to the various efforts of Japanese Americans from the 1940s to the 1980s to obtain restitution for their removal and confinement during World War II. This included judicial and legislative campaigns at local, state, and federal levels for recognition of government wrongdoing and compensation for losses, both material and immaterial. The push for redress originated in the late 1940s as the Cold War opened up opportunities for Japanese Americans to demand concessions from the government. During the 1960s and 1970s, Japanese Americans began to connect the struggle for redress with anti-racist and anti-imperialist movements of the time. Despite their growing political divisions, Japanese Americans came together to launch several successful campaigns that laid the groundwork for redress. During the early 1980s, the government increased its involvement in redress by forming a congressional commission to conduct an official review of the World War II incarceration. The commission’s recommendations of monetary payments and an official apology paved the way for the passage of the Civil Liberties Act of 1988 and other redress actions. Beyond its legislative and judicial victories, the redress movement also created a space for collective healing and generated new forms of activism that continue into the present.
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether a single-family or a multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
The history of Muslims in America dates back to the transatlantic mercantile interactions between Europe, Africa, and the Americas. Upon its arrival, Islam became entrenched in American discourses on race and civilization because literate and noble African Muslims, brought to America as slaves, had problematized popular stereotypes of Muslims and black Africans. Furthermore, these enslaved Muslims had to re-evaluate and reconfigure their beliefs and practices to form new communal relations and to make sense of their lives in America.
At the turn of the 20th century, as Muslim immigrants began arriving in the United States from the Middle East, Eastern Europe, and South Asia, they had to establish themselves in an America in which the white race, Protestantism, and progress were conflated to define a triumphalist American national identity, one that allowed varying levels of inclusion for Muslims based on their ethnic, racial, and national backgrounds.
The enormous bloodshed and destruction experienced during World War I ushered in a crisis of confidence in the ideals of the European Enlightenment, as well as in white, Protestant nationalism. It opened up avenues for alternative expressions of progress, which allowed Muslims, along with other nonwhite, non-Christian communities, to engage in political and social organization. Among these organizations were a number of black religious movements that used Islamic beliefs, rites, and symbols to define a black Muslim national identity.
World War II further shifted America away from the religious competition that had earlier defined the nation’s identity and toward a “civil religion” of American democratic values and political institutions. Although this inclusive rhetoric was received differently along racial and ethnic lines, there was an overall appeal for greater visibility for Muslims in America. After World War II, increased commercial and diplomatic relations between the United States and Muslim-majority countries put American Muslims in a position not only to relate Islam and America in their own lives but also to mediate between the varying interests of Muslim-majority countries and the United States.
Following the civil rights legislation of the 1950s and 1960s and the passage of the Immigration Act of 1965, Muslim activists, many of whom had been politicized by anticolonial movements abroad, established new Islamic institutions. Eventually, a window was opened between the US government and American Muslim activists, who found a common enemy in communism following the Soviet occupation of Afghanistan in the 1980s.
Since the late 1960s, the number of Muslims in the United States has grown significantly. Today, Muslims are estimated to constitute a little more than 1 percent of the US population. However, with the fall of the Soviet Union and the rise of the United States as the sole superpower in the world, the United States has come into military conflict with Muslim-majority countries and has been the target of attacks by militant Muslim organizations. This has led to the cultivation of the binaries of “Islam and the West” and of “good” Islam and “bad” Islam, which have contributed to the racialization of American Muslims. It has also interpolated them into a reality external to their history and lived experiences as Muslims and Americans.
The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist, Yosemite enthusiast, and proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.
Urban renewal refers to an interlocking set of national and local policies, programs, and projects, implemented in the vast majority of American cities between 1949 and 1973. These typically entailed major redevelopment of existing urban areas with a view to the modernization of housing, highway infrastructure, commercial and business districts, as well as other large-scale constructions. Reformers from the Progressive Era through the Great Society strove to ameliorate the conditions of poverty and inequality in American cities by focusing primarily on physical transformation of the urban built environment. Citing antecedents such as the reconstruction of Second Empire Paris, imported via the City Beautiful movement, and then updated with midcentury modernism, US urban planners envisioned a radical reorganization of city life. In practice, federal programs and local public authorities targeted the eradication of areas deemed slums or blighted—often as much to socially sanitize neighborhoods inhabited by racial minorities and other marginalized groups as to address deteriorating physical conditions. And while federal funding became available for public works projects in declining central cities under the auspices of improving living conditions for the poor—including providing public housing—urban renewal programs consistently destroyed more affordable housing than they created, over more than three decades. By the end of the 1960s, urban residents and policymakers across the political spectrum concluded that such programs were usually doing more harm than good, and most ended during the Nixon administration. Yet large-scale reminders of urban renewal can still be found in most large US communities, whether in the form of mid-20th-century public housing blocks, transportation projects, stadiums, convention centers, university and hospital expansions, or a variety of public-private redevelopment initiatives. But perhaps the most fundamental legacies of all were the institutionalization of the comprehensive zoning and master planning process in cities nationwide, on the one hand, and the countervailing mobilization of defensively oriented (NIMBY) neighborhood politics, on the other.
Nicolas G. Rosenthal
An important relationship has existed between Native Americans and cities from pre-Columbian times to the early 21st century. Long before Europeans arrived in the Americas, indigenous peoples developed societies characterized by dense populations, large-scale agriculture, monumental architecture, and complex social hierarchies. Following European and American conquest and colonization, Native Americans played a crucial role in the development of towns and cities throughout North America, often on the site of former indigenous settlements.
In the early 20th century, Native Americans began migrating from reservations to U.S. cities in large numbers and formed new intertribal communities. By 1970, the majority of the Native American population lived in cities, and the number of urban American Indians has been growing ever since. Indian Country in the early 21st century continues to be influenced by the complex and evolving ties between Native Americans and cities.
Wendy L. Wall
The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state.
The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while disproportionately benefiting white men. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy.
Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”
In late 19th- and early 20th-century America, a new image of womanhood emerged that began to shape public views and understandings of women’s role in society.
Whether identified by contemporaries as a Gibson Girl, a suffragist, a Progressive reformer, a bohemian feminist, a college girl, a bicyclist, a flapper, a working-class militant, or a Hollywood vamp, these images all came to epitomize the New Woman, an umbrella term for modern understandings of femininity. Referring both to real, flesh-and-blood women and to an abstract idea or visual archetype, the New Woman represented a generation of women who came of age between 1890 and 1920 and challenged gender norms and structures by asserting a new public presence through work, education, entertainment, and politics, while also denoting a distinctly modern appearance that contrasted with Victorian ideals. The New Woman became associated with the rise of feminism and the campaign for women’s suffrage, as well as with the rise of consumerism, mass culture, and freer expressions of sexuality that defined the first decades of the 20th century. Emphasizing youth, mobility, freedom, and modernity, the image of the New Woman varied by age, class, race, ethnicity, and geographical region, offering a spectrum of behaviors and appearances with which different women could identify. At times controversial, the New Woman image provided women with opportunities to negotiate new social roles and to promote ideas of equality and freedom that would later become mainstream.
Luke A. Nichter
Assessments of President Richard Nixon’s foreign policy continue to evolve as scholars tap new possibilities for research. Due to the long wait before national security records are declassified by the National Archives and made available to researchers and the public, only in recent decades has the Nixon administration’s engagement with the world become well documented. As more records are released by the National Archives (including potentially 700 hours of Nixon’s secret White House tapes that remain closed), scholarly understanding of the Nixon presidency is likely to continue changing. Thus far, historians have pointed to four major legacies of Nixon’s foreign policy: tendencies to use American muscle abroad on a more realistic scale, to reorient the focus of American foreign policy to the Pacific, to reduce the chance that the Cold War could turn hot, and, inadvertently, to contribute to the later rise of Ronald Reagan and the Republican right wing—many of whom had been part of Nixon’s “silent majority.” While earlier works focused primarily on subjects like Vietnam, China, and the Soviet Union, the historiography today is much more diverse: there is now at least one work covering most major aspects of Nixon’s foreign policy.
Nicole Etcheson and Cortney Cantrell
During the Civil War, the entire North constituted the homefront, an area largely removed from the din and horror of combat. Apart from a few raids and battles such as Gettysburg, civilians in the North experienced the war indirectly. The people on the homefront mobilized for war, sent their menfolk off to fight, supplied the soldiers and the army, coped without their breadwinners, and suffered the loss or maiming of men they loved. All the while, however, the homefront was crucially important to the course of the war. The mobilization of northern resources—not just men, but the manufacture of the arms and supplies needed to fight a war—enabled the North to conduct what some have called a total war, one on which the Union expended money and manpower at unprecedented levels. Confederate strategists hoped to break the will of the northern homefront to secure southern independence. Despite the hardships endured in the North, this strategy failed.
On the homefront, women struggled to provide for their families as well as to serve soldiers and the army by sending care packages and doing war work. Family letters reveal the impact of the war on children who lost their fathers either temporarily or permanently. Communities rallied to aid soldiers’ families but were riven by dissension over issues such as conscription and emancipation. Immigrants and African Americans sought a new place in U.S. society by exploiting the opportunities the war offered to prove their worth. Service in the Union army certainly advanced the status of some groups, but was not the only means to that end. Nuns who nursed the wounded improved the reputation of the Catholic Church, and northern African Americans used the increasingly emancipationist war goals to improve their legal status in the North. The Civil War altered race relations most radically, but change came to everyone on the northern homefront.
The development of military arms harnessing nuclear energy for mass destruction has inspired continual efforts to control them. Since 1945, the United States, the Soviet Union, the United Kingdom, France, the People’s Republic of China (PRC), Israel, India, Pakistan, North Korea, and South Africa have acquired control over these powerful weapons, though Pretoria dismantled its small cache in 1989 and Russia inherited the Soviet arsenal in 1996. Throughout this period, Washington sought to limit its nuclear forces in tandem with those of Moscow, prevent new states from fielding them, discourage their military use, and even permit their eventual abolition.
Scholars disagree about what explains the United States’ distinct approach to nuclear arms control. The history of U.S. nuclear policy treats intellectual theories and cultural attitudes alongside technical advances and strategic implications. The central debate is one of structure versus agency: whether the weapons’ sheer power, or historical actors’ attitudes toward that power, drove nuclear arms control. Among those who emphasize political responsibility, there are two further disagreements: (1) the relative influence of domestic protest, culture, and politics; and (2) whether U.S. nuclear arms control aimed first at securing the peace by regulating global nuclear forces or at bolstering American influence in the world.
The intensity of nuclear arms control efforts tended to rise or fall with the likelihood of nuclear war. Harry Truman’s faith in the country’s monopoly on nuclear weapons caused him to sabotage early initiatives, while Dwight Eisenhower’s belief in nuclear deterrence led in a similar direction. Fears of a U.S.-Soviet thermonuclear exchange mounted in the late 1950s, stoked by atmospheric nuclear testing and widespread radioactive fallout, which stirred protest movements and diplomatic initiatives. The spread of nuclear weapons to new states motivated U.S. presidents (John Kennedy in the vanguard) to mount a concerted campaign against “proliferation,” climaxing with the 1968 Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Richard Nixon was exceptional. His reasons for signing the Strategic Arms Limitation Treaty (SALT I) and Anti-Ballistic Missile Treaty (ABM) with Moscow in 1972 were strategic: to buttress the country’s geopolitical position as U.S. armed forces withdrew from Southeast Asia. The rise of protest movements and Soviet economic difficulties after Ronald Reagan entered the Oval Office brought about two more landmark U.S.-Soviet accords—the 1987 Intermediate-Range Nuclear Forces Treaty (INF) and the 1991 Strategic Arms Reduction Treaty (START)—the first occasions on which the superpowers eliminated nuclear weapons through treaty. The country’s attention swung to proliferation after the Soviet collapse in December 1991, as failed states, regional disputes, and non-state actors grew more prominent. Although controversies over Iraq’s, North Korea’s, and Iran’s nuclear programs have since erupted, Washington and Moscow continued to reduce their arsenals and refine their nuclear doctrines even as President Barack Obama proclaimed his support for a nuclear-free world.