Article

Latinx Business and Entrepreneurship  

Pedro A. Regalado

Entrepreneurship has been a basic element of Latinx life in the United States since long before the nation’s founding, varying in scale and cutting across race, class, and gender to different degrees. Indigenous forms of commerce pre-dated Spanish contact in the Americas and continued thereafter. Beginning in the 16th century, the raising, trading, and production of cattle and cattle-related products became foundational to Spanish, Mexican, and later American Southwest society and culture. By the 19th century, Latinxs in US metropolitan areas began to establish enterprises in the form of storefronts, warehouses, and factories, as well as smaller ventures such as peddling. At times, they succeeded previous ethnic owners; at other moments, they established new businesses that shaped the everyday life and politics of their respective communities. Whatever the scale of their ventures, Latinx business owners continued to capitalize on the migration of Latinx people to the United States from Latin America and the Caribbean during the 20th century. These entrepreneurs entered business for different reasons, often responding to restricted or constrained labor options, though many sought the flexibility that entrepreneurship offered. Despite an increasing association between Latinx people and entrepreneurship, profits from Latinx ventures produced uneven results during the second half of the 20th century. For some, finance and business ownership have generated immense wealth and political influence. For others at the margins of society, they have remained tools for achieving sustenance amid the variability of a racially stratified labor market. No monolithic account can wholly capture the vastness and complexity of Latinx economic activity. Latinx business and entrepreneurship remains a vital piece of the place-making and politics of the US Latinx population. This article provides an overview of major trends and pivotal moments in its rich history.

Article

LGBTQ Politics in America since 1945  

Emily K. Hobson

Since World War II, the United States has witnessed major changes in lesbian, gay, bisexual, transgender, and queer (LGBTQ) politics. Indeed, because the history of LGBTQ activism is almost entirely concentrated in the postwar years, the LGBTQ movement is typically said to have achieved rapid change in a short period of time. But if popular accounts characterize LGBTQ history as a straightforward narrative of progress, the reality is more complex. Postwar LGBTQ politics has been both diverse and divided, marked by differences of identity and ideology. At the same time, LGBTQ politics has been embedded in the contexts of state-building and the Cold War, the New Left and the New Right, the growth of neoliberalism, and the HIV/AIDS epidemic. As the field of LGBTQ history has grown, scholars have increasingly been able to place analyses of state regulation into conversation with community-based histories. Moving between such outside and inside perspectives helps to reveal how multiple modes of LGBTQ politics have shaped one another and how they have been interwoven with broader social change. Looking from the outside, it is apparent that LGBTQ politics has been catalyzed by exclusions from citizenship; from the inside, we can see that activists have responded to such exclusions in different ways, both by seeking social inclusion and by rejecting assimilationist terms. Court rulings and the administration of law have run alongside the debates inside activist communities. Competing visions for LGBTQ politics have centered on both leftist and liberal agendas, as well as viewpoints shaped by race, gender, gender expression, and class.

Article

Liberalism from the Fair Deal to the Great Society  

Jonathan Bell

In 1944 President Franklin D. Roosevelt’s State of the Union address set out what he termed an “economic Bill of Rights” that would act as a manifesto of liberal policies after World War II. Politically, however, the United States was a different place from the country that had faced the ravages of the Great Depression of the 1930s and ushered in Roosevelt’s New Deal to transform the relationship between government and the people. Key legacies of the New Deal, such as Social Security, remained and were gradually expanded, but opponents of governmental regulation of the economy launched a bitter campaign after the war to roll back labor union rights and dismantle the New Deal state. Liberal heirs to FDR in the 1950s, represented by figures like two-time presidential candidate Adlai Stevenson, struggled to rework liberalism to tackle the realities of a more prosperous age. The long shadow of the U.S. Cold War with the Soviet Union also set up new challenges for liberal politicians trying to juggle domestic and international priorities in an era of superpower rivalry and American global dominance. The election of John F. Kennedy as president in November 1960 seemed to represent a narrow victory for Cold War liberalism, and his election coincided with the intensification of the struggle for racial equality in the United States that would do much to shape liberal politics in the 1960s. After Kennedy’s assassination in 1963, President Lyndon Johnson launched his “Great Society,” a commitment to eradicate poverty and to provide greater economic security for Americans through policies such as Medicare. But his administration’s deepening involvement in the Vietnam War and its mixed record on alleviating poverty did much to taint the positive connotations of “liberalism” that had dominated politics during the New Deal era.

Article

Lobbying and Business Associations  

Benjamin C. Waterhouse

Political lobbying has always played a key role in American governance, but the concept of paid influence peddling has been marked by a persistent tension throughout the country’s history. On the one hand, lobbying represents a democratic process by which citizens maintain open access to government. On the other, the outsized clout of certain groups engenders corruption and perpetuates inequality. The practice of lobbying itself has reflected broader social, political, and economic changes, particularly in the scope of state power and the scale of business organization. During the Gilded Age, associational activity flourished and lobbying became increasingly the province of organized trade associations. By the early 20th century, a wide range of political reforms worked to counter the political influence of corporations. Even after the Great Depression and New Deal recast the administrative and regulatory role of the federal government, business associations remained the primary vehicle through which corporations and their designated lobbyists influenced government policy. By the 1970s, corporate lobbyists had become more effective and better organized, and trade associations spurred a broad-based political mobilization of business. Business lobbying expanded in the latter decades of the 20th century; while the number of companies with a lobbying presence leveled off in the 1980s and 1990s, the number of lobbyists per company increased steadily and corporate lobbyists grew increasingly professionalized. A series of high-profile political scandals involving lobbyists in 2005 and 2006 sparked another effort at regulation. Yet despite popular disapproval of lobbying and distaste for politicians, efforts to substantially curtail the activities of lobbyists and trade associations did not achieve significant success.

Article

McCarthyism and the Second Red Scare  

Landon R. Y. Storrs

The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States. The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.

Article

Memorializing Incarceration: The Japanese American Experience in World War II and Beyond  

Franklin Odo

On February 19, 1942, President Franklin Delano Roosevelt signed Executive Order 9066 authorizing the incarceration of 120,000 Japanese Americans, living primarily on the West Coast of the continental United States. On August 10, 1988, President Ronald Reagan signed legislation authorizing formal apologies and checks for $20,000 to those still alive who had been unjustly imprisoned during WWII. In the interim period, nearly a half century, there were enormous shifts in memories of the events, mainstream accounts, and internal ethnic accountabilities. To be sure, there were significant acts of resistance, from the beginning of mass forced removal to the Supreme Court decisions toward the end of the war. But for a quarter of a century, between 1945 and approximately 1970, there was little to threaten a master narrative that posited Japanese Americans, led by the Japanese American Citizens League (JACL), as a once-embattled ethnic/racial minority that had transcended its victimized past to become America’s treasured model minority. The fact that the Japanese American community began effective mobilization for government apology and reparations in the 1970s only confirmed its emergence as a bona fide part of the American body politic. But where the earlier narrative extolled the memories of Japanese American war heroes and leaders of the JACL, memory making changed dramatically in the 1990s and 2000s. In the years since Reagan’s affirmation that “here we admit a wrong,” Japanese Americans have unleashed a torrent of memorials, museums, and monuments honoring those who fought the injustices and who swore they would resist current or future attempts to scapegoat other groups in the name of national security.

Article

Mexican Americans in the United States  

Iliana Yamileth Rodriguez

Mexican American history in the United States spans centuries. In the 16th and 17th centuries, the Spanish Empire colonized North American territories. Though challenged by colonial rivalries in the southeast, Spanish control remained strong in the US southwest through the 19th century. The mid-1800s were an era of power struggles over territory and the construction of borders, which greatly impacted ethnic Mexicans living in the US-Mexico borderlands. After the Mexican-American War (1846–1848), the Treaty of Guadalupe Hidalgo allowed the United States to take all or parts of California, Arizona, Nevada, Utah, Colorado, and New Mexico. Ethnic Mexicans living in newly incorporated regions in the mid- through late 19th century witnessed the radical restructuring of their lives along legal, economic, political, and cultural lines. The early 20th century saw the rise of anti-Mexican sentiment and violence. As ethnic Mexican communities came under attack, Mexican Americans took leadership roles in institutions, labor unions, and community groups to fight for equality. Both tensions and coalition-building efforts between Mexican Americans and Mexican migrants animated the mid-20th century, as did questions about wartime identity, citizenship, and belonging. By the late 20th century, Chicana/o politics took center stage and brought forth a radical politics informed by the Mexican American experience. Finally, the late 20th through early 21st centuries saw further geographic diversification of Mexican American communities outside of the southwest.

Article

Military Assistance Programs since 1945  

Jeremy Kuzmarov

Military assistance programs have been crucial instruments of American foreign policy since World War II, valued by policymakers for combating internal subversion in the “free world,” deterring aggression, and protecting overseas interests. The 1958 Draper Committee, consisting of eight members of the Senate Foreign Relations Committee, concluded that economic and military assistance were interchangeable; as the committee put it, without internal security and the “feeling of confidence engendered by adequate military forces, there is little hope for economic progress.” Less explicitly, military assistance was also designed to uphold the U.S. global system of military bases established after World War II, ensure access to raw materials, and help recruit intelligence assets while keeping a light American footprint. Police and military aid was often invited and welcomed by government elites in so-called free world nations because it enhanced domestic security or enabled the swift repression of political opponents. It sometimes coincided with an influx of economic aid, as under the Marshall Plan and the Alliance for Progress. In cases like Vietnam, the programs contributed to stark human rights abuses owing to political circumstances and the prioritization of national security over civil liberties.

Article

The Movement for Japanese American Redress  

Megan Asaka

The Japanese American Redress Movement refers to the various efforts of Japanese Americans from the 1940s to the 1980s to obtain restitution for their removal and confinement during World War II. This included judicial and legislative campaigns at local, state, and federal levels for recognition of government wrongdoing and compensation for losses, both material and immaterial. The push for redress originated in the late 1940s as the Cold War opened up opportunities for Japanese Americans to demand concessions from the government. During the 1960s and 1970s, Japanese Americans began to connect the struggle for redress with anti-racist and anti-imperialist movements of the time. Despite their growing political divisions, Japanese Americans came together to launch several successful campaigns that laid the groundwork for redress. During the early 1980s, the government increased its involvement in redress by forming a congressional commission to conduct an official review of the World War II incarceration. The commission’s recommendations of monetary payments and an official apology paved the way for the passage of the Civil Liberties Act of 1988 and other redress actions. Beyond its legislative and judicial victories, the redress movement also created a space for collective healing and generated new forms of activism that continue into the present.

Article

Municipal Housing in America  

Margaret Garb

Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people. Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness. Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.

Article

The National Parks  

Donald Worster

The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist and Yosemite enthusiast and himself a proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.

Article

Native People and American Film and TV  

Liza Black

Native people have appeared as characters in film and television in America since the inception of each medium. Throughout the 20th century, Native actors, writers, directors, and producers worked in the film and television industry. In terms of characterization, Native employment sits uncomfortably beside racist depictions of Native people. From the 1950s to the present, revisionist westerns have come into being, giving the viewer a moral tale in which Native people are depicted with sympathy and white Americans are seen as aggressors. Today, a small but important group of Native actors in film and television work in limiting roles but turn in outstanding performances. Native directors, writers, and documentarians from the 1990s to the early 21st century have created critical interventions into media representations, telling stories from Indigenous viewpoints and bringing Native voices to the fore. The 2021 television show Rutherford Falls stands out as an example of Native writers gaining entry into the television studio system. Additionally, several Native film festivals have emerged in the early 21st century, and this trend continues to grow.

Article

NATO-US Relations  

Susan Colbourn

On April 4, 1949, twelve nations signed the North Atlantic Treaty: the United States, Canada, Iceland, the United Kingdom, Belgium, the Netherlands, Luxembourg, France, Portugal, Italy, Norway, and Denmark. For the United States, the North Atlantic Treaty signaled a major shift in foreign policy. Gone was the traditional aversion to “entangling alliances,” dating back to George Washington’s farewell address. The United States had entered into a collective security arrangement designed to preserve peace in Europe. With the creation of the North Atlantic Treaty Organization (NATO), the United States took on a clear leadership role on the European continent. Allied defense depended on US military power, most notably the nuclear umbrella. Reliance on the United States unsurprisingly created problems. Doubts about the strength of the transatlantic partnership and rumors of a NATO in shambles were (and are) commonplace, as were anxieties about the West’s strength in comparison to NATO’s Eastern counterpart, the Warsaw Pact. NATO, it turned out, was more than a Cold War institution. After the fall of the Berlin Wall and the collapse of the Soviet Union, the Alliance remained vital to US foreign policy objectives. The only invocation of Article V, the North Atlantic Treaty’s collective defense clause, came in the wake of the September 11, 2001, terrorist attacks. Over the last seven decades, NATO has symbolized both US power and its challenges.

Article

Neutrality/Nonalignment and the United States  

Robert Rakove

For almost a century and a half, successive American governments adopted a general policy of neutrality on the world stage, eschewing involvement in European conflicts and, after the Quasi War with France, alliances with European powers. Neutrality, enshrined as a core principle of American foreign relations by the outgoing President George Washington in 1796, remained such for more than a century. Finally, in the 20th century, the United States emerged as a world power and a belligerent in the two world wars and the Cold War. This article explores the modern conflict between traditional American attitudes toward neutrality and the global agenda embraced by successive U.S. governments, beginning with entry into the First World War. With the United States immersed in these titanic struggles, the traditional U.S. support for neutrality eroded considerably. During the First World War, the United States showed some sympathy for the predicaments of the remaining neutral powers. In the Second World War it applied considerable pressure to those states still trading with Germany. During the Cold War, the United States was sometimes impatient with the choices of states to remain uncommitted in the global struggle, while at times it showed understanding for neutrality and pursued constructive relations with neutral states. The wide varieties of neutrality in each of these conflicts complicated the choices of U.S. policy makers. Americans remained torn between memory of their own long history of neutrality and a capacity to understand its potential value, on one hand, and a predilection to approach conflicts as moral struggles, on the other.

Article

The Nixon Administration and American Foreign Relations  

Luke A. Nichter

Assessments of President Richard Nixon’s foreign policy continue to evolve as scholars tap new possibilities for research. Because of the long wait before national security records are declassified by the National Archives and made available to researchers and the public, only in recent decades has the Nixon administration’s engagement with the world become well documented. As more records are released by the National Archives (including potentially 700 hours of Nixon’s secret White House tapes that remain closed), scholarly understanding of the Nixon presidency is likely to continue changing. Thus far, historians have pointed to four major legacies of Nixon’s foreign policy: tendencies to use American muscle abroad on a more realistic scale, to reorient the focus of American foreign policy to the Pacific, to reduce the chance that the Cold War could turn hot, and, inadvertently, to contribute to the later rise of Ronald Reagan and the Republican right wing—many of whom had been part of Nixon’s “silent majority.” While earlier works focused primarily on subjects like Vietnam, China, and the Soviet Union, the historiography today is much more diverse; there is now at least one work covering most major aspects of Nixon’s foreign policy.

Article

Nuclear Arms Control in US Foreign Policy  

Jonathan Hunt

The development of military arms harnessing nuclear energy for mass destruction has inspired continual efforts to control them. Since 1945, the United States, the Soviet Union, the United Kingdom, France, the People’s Republic of China (PRC), Israel, India, Pakistan, North Korea, and South Africa have acquired control over these powerful weapons, though Pretoria dismantled its small cache in 1989 and Russia inherited the Soviet arsenal in 1996. Throughout this period, Washington sought to limit its nuclear forces in tandem with those of Moscow, prevent new states from fielding them, discourage their military use, and even permit their eventual abolition. Scholars disagree about what explains the United States’ distinct approach to nuclear arms control. The history of U.S. nuclear policy treats intellectual theories and cultural attitudes alongside technical advances and strategic implications. The central debate is one of structure versus agency: whether the weapons’ sheer power, or historical actors’ attitudes toward that power, drove nuclear arms control. Among those who emphasize political responsibility, there are two further disagreements: (1) the relative influence of domestic protest, culture, and politics; and (2) whether U.S. nuclear arms control aimed first at securing the peace by regulating global nuclear forces or at bolstering American influence in the world. The intensity of nuclear arms control efforts tended to rise or fall with the likelihood of nuclear war. Harry Truman’s faith in the country’s monopoly on nuclear weapons caused him to sabotage early initiatives, while Dwight Eisenhower’s belief in nuclear deterrence led in a similar direction. Fears of a U.S.-Soviet thermonuclear exchange mounted in the late 1950s, stoked by atmospheric nuclear testing and widespread radioactive fallout, which stirred protest movements and diplomatic initiatives. The spread of nuclear weapons to new states motivated U.S. presidents (John Kennedy in the vanguard) to mount a concerted campaign against “proliferation,” climaxing with the 1968 Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Richard Nixon was exceptional. His reasons for signing the Strategic Arms Limitation Treaty (SALT I) and Anti-Ballistic Missile Treaty (ABM) with Moscow in 1972 were strategic: to buttress the country’s geopolitical position as U.S. armed forces withdrew from Southeast Asia. The rise of protest movements and Soviet economic difficulties after Ronald Reagan entered the Oval Office brought about two more landmark U.S.-Soviet accords—the 1987 Intermediate-Range Nuclear Forces Treaty (INF) and the 1991 Strategic Arms Reduction Treaty (START)—the first occasions on which the superpowers eliminated nuclear weapons through treaty. The country’s attention swung to proliferation after the Soviet collapse in December 1991, as failed states, regional disputes, and non-state actors grew more prominent. Although controversies over Iraq, North Korea, and Iran’s nuclear programs have since erupted, Washington and Moscow continued to reduce their arsenals and refine their nuclear doctrines even as President Barack Obama proclaimed his support for a nuclear-free world.

Article

OPEC, International Oil, and the United States  

Gregory Brew

After World War II, the United States backed multinational private oil companies known as the “Seven Sisters”—five American companies (including Standard Oil of New Jersey and Texaco), one British (British Petroleum), and one Anglo-Dutch (Shell)—in their efforts to control Middle East oil and feed rising demand for oil products in the West. In 1960 oil-producing states in Latin America and the Middle East formed the Organization of the Petroleum Exporting Countries (OPEC) to protest what they regarded as the inequitable dominance of the private oil companies. Between 1969 and 1973 changing geopolitical and economic conditions shifted the balance of power from the Seven Sisters to OPEC. Following the first “oil shock” of 1973–1974, OPEC assumed control over the production and price of oil, ending the rule of the companies and humbling the United States, which suddenly found itself dependent upon OPEC for its energy security. Yet this dependence was complicated by a close relationship between the United States and major oil producers such as Saudi Arabia, which continued to adopt pro-US strategic positions even as they squeezed out the companies. Following the Iranian Revolution (1978–1979), the Iran–Iraq War (1980–1988), and the First Iraq War (1990–1991), the antagonism that colored US relations with OPEC evolved into a more comfortable, if wary, recognition of the new normal, where OPEC supplied the United States with crude oil while acknowledging the United States’ role in maintaining the security of the international energy system.

Article

Organized Labor and the Civil Rights Movement  

Alan Draper

The relationship between organized labor and the civil rights movement proceeded along two tracks. At work, the two groups were adversaries, as civil rights groups criticized employment discrimination by the unions. But in politics, they allied. Unions and civil rights organizations partnered to support liberal legislation and to oppose conservative southern Democrats, who were as militant in opposing unions as they were fervent in supporting white supremacy. At work, unions dithered in their efforts to root out employment discrimination. Their initial enthusiasm for Title VII of the 1964 Civil Rights Act, which outlawed employment discrimination, waned the more the new law violated foundational union practices by infringing on the principle of seniority, emphasizing the rights of the individual over the group, and inserting the courts into the workplace. The two souls of postwar liberalism—labor solidarity represented by unions and racial justice represented by the civil rights movement—were in conflict at work. Although the unions and civil rights activists were adversaries over employment discrimination, they united in trying to register southern blacks to vote. Black enfranchisement would end the South’s exceptionalism and the veto it exercised over liberal legislation in Congress. But the two souls of liberalism that were at odds over the meaning of fairness at work would also diverge at the ballot box. As white workers began to defect from the Democratic Party, the political coalition of black and white workers that union leaders had hoped to build was undermined from below. The divergence between the two souls of liberalism in the 1960s—economic justice represented by unions and racial justice represented by civil rights—helps explain the resurgence of conservatism that followed.

Article

Origins of the Vietnam War  

Jessica M. Chapman

The origins of the Vietnam War can be traced to France’s colonization of Indochina in the late 1880s. The Viet Minh, led by Ho Chi Minh, emerged as the dominant anti-colonial movement by the end of World War II, though Viet Minh leaders encountered difficulties as they tried to consolidate their power on the eve of the First Indochina War against France. While that war was, initially, a war of decolonization, it became a central battleground of the Cold War by 1950. The lines of future conflict were drawn that year when the People’s Republic of China and the Soviet Union recognized and provided aid to the Democratic Republic of Vietnam in Hanoi, followed almost immediately by Washington’s recognition of the State of Vietnam in Saigon. From that point on, American involvement in Vietnam was most often explained in terms of the Domino Theory, articulated by President Dwight D. Eisenhower on the eve of the Geneva Conference of 1954. The Franco-Viet Minh ceasefire reached at Geneva divided Vietnam in two at the 17th parallel, with countrywide reunification elections slated for the summer of 1956. However, the United States and its client, Ngo Dinh Diem, refused to participate in talks preparatory to those elections, preferring instead to build South Vietnam as a non-communist bastion. While the Vietnamese communist party, known as the Vietnam Workers’ Party in Hanoi, initially hoped to reunify the country by peaceful means, it reached the conclusion by 1959 that violent revolution would be necessary to bring down the “American imperialists and their lackeys.” In 1960, the party formed the National Liberation Front for Vietnam and, following Diem’s assassination in 1963, passed a resolution to wage all-out war in the south in an effort to claim victory before the United States committed combat troops. After President John F. Kennedy took office in 1961, he responded to deteriorating conditions in South Vietnam by militarizing the American commitment, though he stopped short of introducing dedicated ground troops. After Diem and Kennedy were assassinated in quick succession in November 1963, Lyndon Baines Johnson took office determined to avoid defeat in Vietnam, but hoping to prevent the issue from interfering with his domestic political agenda. As the situation in South Vietnam became more dire, LBJ found himself unable to maintain the middle-of-the-road approach that Kennedy had pursued. Forced to choose between escalation and withdrawal, he chose the former in March 1965 by launching a sustained campaign of aerial bombardment, coupled with the introduction of the first officially designated U.S. combat forces to Vietnam.

Article

The Peace Movement since 1945  

Leilah Danielson

Peace activism in the United States between 1945 and the 2010s focused mostly on opposition to U.S. foreign policy, efforts to strengthen and foster international cooperation, and support for nuclear nonproliferation and arms control. The onset of the Cold War between the United States and the Soviet Union marginalized a reviving postwar American peace movement, one that had emerged from concerns about atomic and nuclear power and about worldwide nationalist politics that seemed everywhere to foster conflict, not peace. Still, peace activism continued to evolve in dynamic ways and to influence domestic politics and international relations. Most significantly, peace activists pioneered the use of Gandhian nonviolence in the United States and provided critical assistance to the African American civil rights movement, led the postwar antinuclear campaign, played a major role in the movement against the war in Vietnam, helped to move the liberal establishment (briefly) toward a more dovish foreign policy in the early 1970s, and helped to shape the political culture of American radicalism. Despite these achievements, the peace movement never regained the political legitimacy and prestige it held in the years before World War II, and it struggled with internal divisions over ideology, priorities, and tactics. Histories of peace activism written in the 20th century tended to emphasize organizational or biographical approaches that sometimes carried hagiographic overtones. More recently, historians have applied the methods of cultural history, examining the role of religion, gender, and race in structuring peace activism. The transnational and global turn in the historical discipline has also begun to make inroads in peace scholarship. These are promising new directions because they situate peace activism within larger historical and cultural developments and relate peace history to broader historiographical debates and trends.