Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have moved from independent companies to divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films.
Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed for the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media. Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.
Early 20th century American labor and working-class history is a subfield of American social history that focuses attention on the complex lives of working people in a rapidly changing global political and economic system. Once focused closely on institutional dynamics in the workplace and electoral politics, labor history has expanded and refined its approach to include questions about the families, communities, identities, and cultures workers have developed over time. With a critical eye on the limits of liberal capitalism and democracy for workers’ welfare, labor historians explore individual and collective struggles against exclusion from opportunity, as well as accommodation to political and economic contexts defined by rapid and volatile growth and deep inequality.
Particularly important are the ways that workers both defined and were defined by differences of race, gender, ethnicity, class, and place. Individual workers and organized groups of working Americans both transformed and were transformed by the main struggles of the industrial era, including conflicts over the place of former slaves and their descendants in the United States, mass immigration and migrations, technological change, new management and business models, the development of a consumer economy, the rise of a more active federal government, and the evolution of popular culture.
The period between 1896 and 1945 saw a crucial transition in the labor and working-class history of the United States. At its outset, Americans were working many more hours a day than the eight for which they had fought hard in the late 19th century. On average, Americans labored fifty-four to sixty-three hours per week in dangerous working conditions (approximately 35,000 workers died in accidents annually at the turn of the century). By 1920, half of all Americans lived in growing urban neighborhoods, and for many of them chronic unemployment, poverty, and deep social divides had become a regular part of life. Workers had little power in either the Democratic or Republican party. They faced a legal system that gave them no rights at work but the right to quit, judges who took the side of employers in the labor market by issuing thousands of injunctions against even nonviolent workers’ organizing, and vigilantes and police forces that did not hesitate to repress dissent violently. The ranks of organized labor were shrinking in the years before the economy began to recover in 1897. Dreams of a more democratic alternative to wage labor and corporate-dominated capitalism had been all but destroyed. Workers struggled to find their place in an emerging consumer-oriented culture that assumed everyone ought to strive for the often unattainable, and not necessarily desirable, marks of middle-class respectability.
Yet American labor emerged from World War II with the main sectors of the industrial economy organized, with greater earning potential than any previous generation of American workers, and with unprecedented power as an organized interest group that could appeal to the federal government to promote its welfare. Though American workers as a whole had made no grand challenge to the nation’s basic corporate-centered political economy in the preceding four and one-half decades, they entered the postwar world with a greater level of power, and a bigger share in the proceeds of a booming economy, than anyone could have imagined in 1896. The labor and working-class history of the United States between 1900 and 1945, then, is the story of how working-class individuals, families, and communities—members of an extremely diverse American working class—managed to carve out positions of political, economic, and cultural influence, even as they remained divided among themselves, dependent upon corporate power, and increasingly invested in an individualistic, competitive, acquisitive culture.
On January 5, 2014—the fiftieth anniversary of President Lyndon Johnson’s launch of the War on Poverty—the New York Times asked a panel of opinion leaders a simple question: “Does the U.S. Need Another War on Poverty?” While the answers varied, all the invited debaters accepted the martial premise of the question—that a war on poverty had been fought and that eliminating poverty was, without a doubt, a “fight,” or a “battle.”
Yet the debate over the manner—martial or not—by which the federal government and public policy have dealt with the issue of poverty in the United States remains very much an open one.
The evolution and development of the postwar American welfare state is a story not only of a number of “wars,” or individual political initiatives, against poverty, but also about the growth of institutions within and outside government that seek to address, alleviate, and eliminate poverty and its concomitant social ills. It is a complex and at times messy story, interwoven with the wider historical trajectory of this period: civil rights, the rise and fall of a “Cold War consensus,” the emergence of a counterculture, the Vietnam War, the credibility gap, the rise of conservatism, the end of “welfare,” and the emergence of compassionate conservatism. Mirroring the broader organization of the American political system, with a relatively weak center of power and delegated authority and decision-making in fifty states, the welfare model has developed and grown over decades. Policies viewed in one era as unmitigated failures have instead over time evolved and become part of the fabric of the welfare state.
Joshua L. Rosenbloom
The United States economy underwent major transformations between American independence and the Civil War through rapid population growth, the development of manufacturing, the onset of modern economic growth, increasing urbanization, the rapid spread of settlement into the trans-Appalachian west, and the rise of European immigration. These decades were also characterized by an increasing sectional conflict between free and slave states that culminated in 1861 in Southern secession from the Union and a bloody and destructive Civil War. Labor markets were central to each of these developments, directing the reallocation of labor between sectors and regions, channeling a growing population into productive employment, and shaping the growing North–South division within the country. Put differently, labor markets influenced the pace and character of economic development in the antebellum United States. On the one hand, the responsiveness of labor markets to economic shocks helped promote economic growth; on the other, imperfections in labor market responses to these shocks significantly affected the character and development of the national economy.
Stacy D. Fahrenthold
Between 1880 and 1924, an estimated half million Arab migrants left the Ottoman Empire to live and work in the Americas. Responding to new economic forces linking the Mediterranean and Atlantic capitalist economies to one another, Arab migrants entered the manufacturing industries of the settler societies they inhabited, including industrial textiles, small-scale commerce (peddling), heavy machining, and migrant services associated with continued immigration from the Middle East. The Ottoman Empire enacted few policies to halt emigration from Syria, Mount Lebanon, and Palestine, instead facilitating a remittance economy that enhanced the emerging cash economies of the Arab world. After 1920, the French Mandate in Syria and Lebanon moved to limit new migration to the Americas, working together with increasingly restrictive immigration regimes in the United States, Argentina, and Brazil to halt Arab labor immigration. Using informal archives, the Arab American press, and the records of diasporic mutual aid and philanthropic societies, new research in Arab American migration illustrates how migrants managed a transnational labor economy and confronted challenges presented by American nativism, travel restriction, and interwar deportations.
Since the introduction of “Fordism” in the early 1910s, which emphasized technological improvements and maximizing productive efficiency, US autoworkers have struggled with repetitive, exhausting, often dangerous jobs. Yet beginning with Ford’s Five Dollar Day, introduced in 1914, auto jobs have also provided higher pay than most other wage work, attracting hundreds of thousands of people, especially to Detroit, Michigan, through the 1920s, and again from World War II until the mid-1950s. Successful unionization campaigns by the United Auto Workers (UAW) in the 1930s and early 1940s resulted in contracts that guaranteed particular wage increases, reduced the power of foremen, and created a process for resolving workplace conflicts. In the late 1940s and early 1950s UAW president Walter Reuther negotiated generous medical benefits and pensions for autoworkers. The volatility of the auto industry, however, often brought layoffs that undermined economic security. By the 1950s overproduction and automation contributed heavily to instability for autoworkers. The UAW officially supported racial and gender equality, but realities in auto plants and the makeup of union leadership often belied those principles. Beginning in the 1970s US autoworkers faced disruptions caused by high oil prices, foreign competition, and outsourcing to Mexico. Contract concessions at unionized plants began in the late 1970s and continued into the 2000s. By the end of the 20th century, many American autoworkers did not belong to the UAW because they were employed by foreign automakers, who built factories in the United States and successfully opposed unionization. For good reason, autoworkers who survived the industry’s turbulence and were able to retire with guaranteed pensions and medical care look back fondly on all that they gained from working in the industry under UAW contracts. 
Countless others left auto work permanently and often reluctantly in periodic massive layoffs and the continuous loss of jobs from automation.
Ana Elizabeth Rosas
On August 4, 1942, the Mexican and U.S. governments launched the bi-national guest worker program most commonly known as the Bracero Program. An estimated five million Mexican men between the ages of 19 and 45 separated from their families for contract cycles of three to nine months at a time, in anticipation of earning the prevailing U.S. wage the program had promised them. They labored in U.S. agriculture, railroad construction, and forestry, with hardly any employment protections or rights in place to support themselves and the families they had left behind in Mexico. The inhumane configuration and implementation of the program prevented most of these men and their families from attaining the earnings it had promised. Instead, the labor exploitation and alienation that characterized this guest worker program paved the way for, at best, fragile family relationships. The program lasted twenty-two years and grew in scope despite its negative consequences; Mexican men and their families could not afford to remain unemployed in Mexico, nor could they pass up U.S. employment opportunities of any sort. The Mexican and U.S. governments’ persistently negligent management of the Bracero Program, coupled with their conveniently selective acknowledgment of the severity of the plight of Mexican women and men, consistently forced Mexican men and their families to shoulder the full extent of the program’s exploitative conditions and terms.
In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.
James R. Barrett
The largest and most important revolutionary socialist organization in US history, the Communist Party USA was always a minority influence. It reached considerable size and influence, however, during the Great Depression and World War II years, when it followed the more open line associated with the term “Popular Front.” In these years communists were much more flexible in their strategies and relations with other groups, though the party remained a hierarchical vanguard organization. It grew from a largely isolated sect dominated by unskilled and unemployed immigrant men in the 1920s to a socially diverse movement of nearly 100,000 based heavily on American-born men and women from the working and professional classes by the late 1930s and during World War II, exerting considerable influence in the labor movement and American cultural life. In these years, the Communist Party helped to build the industrial union movement, advanced the cause of African American civil rights, and laid the foundation for the postwar feminist movement. But the party was always prone to abrupt changes in line and vulnerable to attack as a sinister outside force because of its close adherence to Soviet policies and goals. Several factors contributed to its catastrophic decline in the 1950s: the increasingly antagonistic Cold War struggle between the Soviet Union and the United States; an unprecedented attack from employers and government at various levels—criminal cases and imprisonment, deportation, and blacklisting; and within the party itself, a turn back toward a more dogmatic version of Marxism-Leninism and a heightened atmosphere of factional conflict and purges.
The history of dockworkers in America is as fascinating and important as it is unfamiliar. Those who worked along the shore loading and unloading ships played an invaluable role in an industry central to both the U.S. and global economies as well as the making of the nation. For centuries, their work remained largely the same, involving brute manual labor in gangs; starting in the 1960s, however, their work was entirely remade by technological transformation. Dockworkers possess a long history of militancy, resulting in dramatic improvements in their economic and workplace conditions. Today, nearly all are unionists, but dockworkers in ports along the Atlantic and Gulf coasts belong to the International Longshoremen’s Association (ILA), while the International Longshore and Warehouse Union (ILWU) represents them in Pacific Coast ports as well as in Hawaii and Alaska (along with British Columbia and Panama). In the mid-1930s, the ILA and ILWU became bitter rivals and remain so. This feud, which has cooled slightly since its outset, can be explained by differences in leadership, ideology, and tactics, with the ILA more craft-based, “patriotic,” and mainstream and the ILWU quite left wing, especially during its first few decades, and committed to fighting for racial equality. The existence of two unions complicates this story; in most countries, dockworkers belong to a single union. Similarly, America’s massive economy and physical size mean that there are literally dozens of ports (again, unlike many other countries), making generalizations harder. Unfortunately, popular culture depictions of dockworkers inculcate unfair and incorrect notions that all dockworkers are involved with organized crime.
Nevertheless, due to decades of militancy, strikes, and unionism, dockworkers in 21st-century America are—while far fewer in number—very well paid and still do important work, literally making world trade possible in an era when 90 percent of goods move by ship for at least part of their journey to market.
Domestic work was, until 1940, the largest category of women’s paid labor. Despite the number of women who performed domestic labor for pay, the wages and working conditions were often poor. Workers labored long hours for low pay and were largely left out of state labor regulations. The association of domestic work with women’s traditional household labor, defined as a “labor of love” rather than as real work, and its centrality to southern slavery, have contributed to its low status. As a result, domestic work has long been structured by class, racial, and gendered hierarchies. Nevertheless, domestic workers have time and again done their best to resist these conditions. Although traditional collective bargaining techniques did not always translate to the domestic labor market, workers found various collective and individual methods to insist on higher wages and demand occupational respect, ranging from quitting to “pan-toting” to forming unions.
Employers began organizing with one another to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops, workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the 20th century’s first three decades, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was mostly successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right-to-work,” insisting that individual laborers must enjoy freedom from so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right-to-work” laws, which made union organizing more difficult because workers were not obligated to join unions or pay their “fair share” of dues to them. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.
Adam J. Hodges
The first Red Scare, which occurred in 1919–1920, emerged out of longer clashes in the United States over the processes of industrialization, immigration, and urbanization as well as escalating conflict over the development of a labor movement challenging elite control of the economy. More immediately, the suppression of dissent during World War I and shock over a revolution in Russia that energized anti-capitalist radicals spurred further confrontations during an ill-planned postwar demobilization of the armed forces and economy.
A general strike in Seattle in February 1919 that grew out of wartime grievances among shipbuilders raised the specter of Bolshevik insurrection in the United States. National press attention fanned the flames and continued to do so throughout the year. In fact, 1919 became a record strike year. Massive coal and steel walkouts in the fall shook the industrial economy, while a work stoppage by Boston police became a national sensation and spread fears of a revolutionary breakdown in public order. Ultimately, however, much of the union militancy of the war era was crushed by the end of 1919 and the labor movement entered a period of retrenchment after 1922 that lasted until the 1930s.
Fall 1919 witnessed the creation of two competing Communist parties in the United States after months of press focus on bombs, riots, and strikes. Federal anti-radical investigative operations, which had grown enormously during World War I and continued into 1919, peaked in the so-called “Palmer Raids” of November 1919 and January 1920, named for US Attorney General A. Mitchell Palmer, who authorized them. The excesses of the Department of Justice and the decline of labor militancy caused a shift in press and public attention in 1920, though another Red Scare would escalate after World War II, with important continuities between the two.
Throughout American history, gender, meaning notions of essential differences between women and men, has shaped how Americans have defined and engaged in productive activity. Work has been a key site where gendered inequalities have been produced, but work has also been a crucible for rights claims that have challenged those inequalities. Federal and state governments long played a central role in generating and upholding gendered policy. Workers and advocates have debated whether to advance laboring women’s cause by demanding equality with men or different treatment that accounted for women’s distinct responsibilities and disadvantages.
Beginning in the colonial period, constructions of dependence and independence derived from the heterosexual nuclear family underscored a gendered division of labor that assigned distinct tasks to the sexes, albeit varied by race and class. In the 19th century, gendered expectations shaped all workers’ experiences of the Industrial Revolution, slavery and its abolition, and the ideology of free labor. Early 20th-century reform movements sought to beat back the excesses of industrial capitalism by defining the sexes against each other, demanding protective labor laws for white women while framing work done by women of color and men as properly unregulated. Policymakers reinforced this framework in the 1930s as they built a welfare state that was rooted in gendered and racialized constructions of citizenship.
In the second half of the 20th century, labor rights claims that reasoned from the sexes’ distinctiveness increasingly gave way to assertions of sex equality, even as the meaning of that equality was contested. As the sex equality paradigm triumphed in the late 20th and early 21st centuries, seismic economic shifts and a conservative business climate narrowed the potential of sex equality laws to deliver substantive changes to workers.
Betsy A. Beasley
American cities have been transnational in nature since the first urban spaces emerged during the colonial period. Yet the specific shape of the relationship between American cities and the rest of the world has changed dramatically in the intervening years. In the mid-20th century, the increasing integration of the global economy within the American economy began to reshape US cities. In the Northeast and Midwest, the once robust manufacturing centers and factories that had sustained their residents—and their tax bases—left, first for the South and West, and then for cities and towns outside the United States, as capital grew more mobile and businesses sought lower wages and tax incentives elsewhere. That same global capital, combined with federal subsidies, created boomtowns in the once-rural South and West. Nationwide, city boosters began to pursue alternatives to heavy industry, once understood to be the undisputed guarantor of a healthy urban economy. Increasingly, US cities organized themselves around the service economy, both in high-end, white-collar sectors like finance, consulting, and education, and in low-end pink-collar and no-collar sectors like food service, hospitality, and health care. A new legal infrastructure related to immigration made US cities more racially, ethnically, and linguistically diverse than ever before.
At the same time, some US cities were agents of economic globalization themselves. Dubbed “global cities” by celebrants and critics of the new economy alike, these cities achieved power and prestige in the late 20th century not only because they had survived the ruptures of globalization but because they helped to determine its shape. By the end of the 20th century, cities that were not routinely listed among the “global city” elite jockeyed to claim “world-class” status, investing in high-end art, entertainment, technology, education, and health care amenities to attract and retain the high-income white-collar workers understood to be the last hope for cities hollowed out by deindustrialization and global competition. Today, the extreme differences between “global cities” and the rest of US cities, and the extreme socioeconomic stratification seen in cities of all stripes, are a key concern of urbanists.
Erik Gellman and Margaret Rung
From the late 1920s through the 1930s, countries on every inhabited continent suffered through a dramatic and wrenching economic contraction termed the Great Depression, an economic collapse that has come to represent the nadir of modern economic history. With national unemployment reaching well into double digits for over a decade, productivity levels falling by half, prices severely depressed, and millions of Americans without adequate food, shelter or clothing, the United States experienced some of the Great Depression’s severest consequences. The crisis left deep physical, psychological, political, social, and cultural impressions on the national landscape. It encouraged political reform and reaction, renewed labor activism, spurred migration, unleashed grass-roots movements, inspired cultural experimentation, and challenged family structures and gender roles.
The Haymarket Riot and Conspiracy of 1886 is a landmark in American social and political history. On May 4, 1886, during an open-air meeting near Haymarket Square in Chicago, someone threw a dynamite bomb into a squad of police, sparking a riot that resulted in the deaths of seven police officers and at least four rioters. Eight anarchists were brought to trial. Though the bomb-thrower was never apprehended, the eight radical leaders were charged as accessories before the fact for conspiring to murder the police. After the longest criminal trial in Illinois history up to that time, seven men were convicted and condemned to death and one to a long prison term. After all appeals were exhausted, four were executed, one cheated the hangman with a jail cell suicide, and the death sentences of two others were commuted to life imprisonment (all three incarcerated men were later pardoned by Governor John Peter Altgeld in 1893).
The Haymarket bombing and trial marked a pivotal moment in the history of American social movements. It sparked the nation’s first red scare, whose fury disrupted even moderately leftist movements for a generation. It drove the nation’s labor unions onto a more conservative path than the one they had been following before the bombing. The worldwide labor campaign for clemency for the convicted men became the foundation for the institution of International Workers’ Day on May 1, a holiday ironically observed in most countries except the United States. It also began a tradition within the American left of memorializing the Haymarket defendants as the first martyrs to their cause.
Between the 1790s and the 1990s, the Irish American population grew from some 500,000 to nearly 40 million. Part of this growth was due to immigration, especially in the years of the Great Irish Famine, though significant emigration from Ireland both preceded and followed the famine decade of 1846–1855. For much of this 200-year period, Irish-born men and women and their descendants were heavily concentrated in working-class occupations and urban communities. Especially in the years around the opening of the 20th century, Irish Catholic immigrants and their descendants put a distinctive stamp on both the American labor movement and urban working-class culture and politics as a whole. Their outsized influence diminished somewhat over the course of the 20th century, but the American Irish continued to occupy key leadership positions in the U.S. labor movement, the Democratic Party, and the American Catholic Church, even as the working-class members or constituents of these institutions became increasingly ethnically diverse. The experience of Irish American working people thus constitutes an important dimension of a larger story—that of the American working class as a whole.
The US Catholic Church was for most of its history—and, in many places, still is—a working-class church. The church of choice for successive waves of immigrants, from the Irish to the Polish to the Mexican, it welcomed “these strangers in a strange land” once it had created an institutional presence. These immigrants played a major role in creating and sustaining parishes that served both as a soul-sustaining refuge and, in many cases, as a way station to the outside world. James Cardinal Gibbons, having learned from the central role that Irish workers played in the Knights of Labor and from protests against the excommunication of the radical New York priest Edward McGlynn, persuaded the Vatican to take a relatively liberal stance toward the “social question” in the United States. Rerum Novarum, the 1891 papal encyclical, condemned socialism and competitive capitalism, but more significantly asserted the “natural” right of workers to form unions as well as to earn a living wage. It was within this religious legitimation of unionism that Irish Catholics came to prominence in the American Federation of Labor, that Monsignor John A. Ryan created a US Catholic social justice intellectual tradition, and that US bishops adopted the 1919 Program for Social Reconstruction. The Catholic labor moment came when the Church, led by the National Catholic Welfare Conference’s Social Action Department, midwestern bishops, and labor priests, not only supported the Congress of Industrial Organizations (CIO) but consistently pushed the New Deal to implement the 1919 program. Philip Murray, the CIO’s Catholic president, led the expulsion of the Communist-led unions when the Communist Party, through the Wallace campaign, threatened both the country and everything the CIO had built. On the one hand, this Catholic labor moment dissolved in an overdetermined mixture of complacency, capitalist growth, and anti-Communism.
On the other, a direct line can be traced from California's labor priests to the Spanish Mission Band to Cesar Chavez and the formation of the United Farm Workers. It took time for the official Church to support the farm workers, but once it did, the Church was all in: the support it gave them, at every level, far exceeded anything it had previously done to implement Rerum Novarum.
Landon R. Y. Storrs
The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.
The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private "patriotic" organizations. With capitalism's crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt's New Deal program expanded the federal government's role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy's Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, was soon imitated by local and state governments as well as private employers. As the Soviets' development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent.
Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.