1–20 of 33 Results for:

  • Labor and Working Class History
  • 20th Century: Post-1945

Article

American Film since 1945  

Joshua Gleich

Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have moved from independent companies to divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films. Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed for the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media. 
Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.

Article

America’s Wars on Poverty and the Building of the Welfare State  

David Torstensson

On January 5, 2014—the fiftieth anniversary of President Lyndon Johnson’s launch of the War on Poverty—the New York Times asked a panel of opinion leaders a simple question: “Does the U.S. Need Another War on Poverty?” While the answers varied, all the invited debaters accepted the martial premise of the question—that a war on poverty had been fought and that eliminating poverty was, without a doubt, a “fight” or a “battle.” Yet the debate over the manner—martial or not—in which the federal government and public policy have dealt with the issue of poverty in the United States is still very much an open-ended one. The evolution and development of the postwar American welfare state is a story not only of a number of “wars,” or individual political initiatives, against poverty, but also of the growth of institutions within and outside government that seek to address, alleviate, and eliminate poverty and its concomitant social ills. It is a complex and at times messy story, interwoven with the wider historical trajectory of this period: civil rights, the rise and fall of a “Cold War consensus,” the emergence of a counterculture, the Vietnam War, the credibility gap, the rise of conservatism, the end of “welfare,” and the emergence of compassionate conservatism. Mirroring the broader organization of the American political system, with a relatively weak center of power and with authority and decision-making delegated to fifty states, the welfare model has developed and grown over decades. Policies viewed in one era as unmitigated failures have instead over time evolved and become part of the fabric of the welfare state.

Article

Appalachian War on Poverty and the Working Class  

Jessica Wilkerson

In 1964, President Lyndon B. Johnson announced an unconditional “war on poverty.” On one of his first publicity tours promoting his antipoverty legislation, he traveled to cities and towns in Appalachia, which would become crucial areas for promoting and implementing the legislation. Johnson soon signed the Economic Opportunity Act, a piece of legislation that provided a structure for communities to institute antipoverty programs, from vocational services to early childhood education programs, and encouraged the creation of new initiatives. In 1965, Johnson signed the Appalachian Regional Development Act, making Appalachia the only region targeted by federal antipoverty legislation, through the creation of the Appalachian Regional Commission. The Appalachian War on Poverty was a set of policies created by governmental agencies, but crucial to it as well was a series of community movements and campaigns, led by working-class people, that responded to those policies. When the War on Poverty began, the language of policymakers suggested that people living below the poverty line would be served by the programs. But as the antipoverty programs expanded and more local people became involved, they spoke openly and in political terms about poverty as a working-class issue. They drew attention to the politics of class in the region, where elites and absentee landowners became wealthy on the backs of working people. They demanded meaningful participation in shaping the War on Poverty in their communities, and, increasingly, when they used the term “poor people,” they did so as a collective class identity—working people who were poor due to a rigged economy. While many public officials focused on economic development policies, men and women living in the region began organizing around issues ranging from surface mining to labor rights and responding to poor living and working conditions. 
Taking advantage of federal antipoverty resources and the spirit of change that animated the 1960s, working-class Appalachians would help to shape the antipoverty programs at the local and regional level, creating a movement in the process. They did so as they organized around issues—including the environment, occupational safety, health, and welfare rights—and as they used antipoverty programs as a platform to address the systemic inequalities that plagued many of their communities.

Article

Asian American Activism  

Vivian Truong

Activism is a defining element of Asian American history. Throughout most of their presence in the United States, Asian Americans have engaged in organized resistance even in the face of violent exclusion and repression. These long histories of activism challenge prevailing notions of the political silence of Asian Americans, which have persisted since the rise of the model minority narrative in the mid-20th century. Examining Asian American history through the lens of activism shows how Asian Americans were not simply acted upon, but were agents in forging their own histories. In the century after the first substantial waves of migration in the 1850s, Asian Americans protested labor conditions, fought for full citizenship rights, and led efforts to liberate their homelands from colonial rule. Activism has been a key part of determining who Asian Americans are—indeed, the term “Asian American” itself was coined in the 1960s as a radical political identity in a movement against racism and imperialism. In the decades since the Asian American movement, “Asian America” has become larger and more diverse. Contemporary Asian American activism reflects the expansiveness and heterogeneity of Asian American communities.

Article

The Bracero Program/“Guest Worker” Programs  

Ana Elizabeth Rosas

On August 4, 1942, the Mexican and US governments launched the binational guest worker program most commonly known as the Bracero Program. An estimated 5 million Mexican men between the ages of nineteen and forty-five separated from their families for three- to nine-month cycles at a time, depending on the duration of their labor contracts, in anticipation of earning the prevailing US wage the program had promised them. They labored in US agriculture, railroad construction, and forestry with hardly any employment protections or rights in place to support themselves or the families they had left behind in Mexico. The inhumane configuration and implementation of the program prevented most of these men and their families from achieving that goal. Instead, the labor exploitation and alienation that characterized the program paved the way for fragile transnational family relationships. The Bracero Program grew over the course of its twenty-two-year existence; despite its negative consequences, Mexican men and their families could not afford to settle for unemployment in Mexico or pass up US employment opportunities of any sort. The Mexican and US governments’ persistently negligent management of the program, coupled with their conveniently selective acknowledgment of the severity of the plight of Mexican women and men, consistently required Mexican men and their families to shoulder the full extent of the program’s exploitative conditions and terms.

Article

Cesar Chavez and the United Farm Workers Movement  

Matt Garcia

In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.

Article

Child Migrants in 20th-Century America  

Ivón Padilla-Rodríguez

Child migration has garnered widespread media coverage in the 21st century, becoming a central topic of national political discourse and immigration policymaking. Contemporary surges of child migrants are part of a much longer history of migration to the United States. In the first half of the 20th century, millions of European and Asian child migrants passed through immigration inspection stations in New York Harbor and San Francisco Bay. Even though some accompanied and unaccompanied European child migrants experienced detention at Ellis Island, most were processed and admitted into the United States fairly quickly in the early 20th century. Few of the European child migrants were deported from Ellis Island. Predominantly accompanied Chinese and Japanese child migrants, however, like Latin American and Caribbean migrants in recent years, were more frequently subjected to family separation, abuse, detention, and deportation at Angel Island. Once inside the United States, both European and Asian children struggled to overcome poverty, labor exploitation, educational inequity, the attitudes of hostile officials, and public health problems. After World War II, Korean refugee “orphans” came to the United States under the Refugee Relief Act of 1953 and the Immigration and Nationality Act. European, Cuban, and Indochinese refugee children were admitted into the United States through a series of ad hoc programs and temporary legislation until the 1980 Refugee Act created a permanent mechanism for the admission of refugee and unaccompanied children. Exclusionary immigration laws, the hardening of US international boundaries, and the United States’ preference for refugees who fled Communist regimes made unlawful entry the only option for thousands of accompanied and unaccompanied Mexican, Central American, and Haitian children in the second half of the 20th century. 
Black and brown migrant and asylum-seeking children were forced to endure educational deprivation, labor trafficking, mandatory detention, deportation, and deadly abuse by US authorities and employers at US borders and inside the country.

Article

Cold War in the American Working Class  

Rosemary Feurer

The US working class and the institutional labor movement were shaped by anticommunism. Anticommunism preceded the founding of the Soviet Union and the Cold War, and this early history affected the later experience. It reinforced conservative positions on union issues even in the period before the Cold War and forged the alliances that influenced the labor movement’s direction, including the campaign to organize the South, the methods and structures of unions, and US labor’s foreign policy positions. While the Communist Party of the USA (CP) was a hierarchical organization straitjacketed by an allegiance to the Soviet Union, the unions it fostered cultivated radical democratic methods, whereas anticommunism often justified opposition to militancy and obstructed progressive policies. In the hottest moments of the postwar development of domestic anticommunism, unions and their members were vilified and purged from the labor movement, forced to take loyalty oaths, and fired for their association with the CP. The Cold War in the working class removed critical perspectives on capitalism, reinforced a moderate and conservative labor officialdom, and led to conformity with the state on foreign policy issues.

Article

Dallas  

Patricia Evridge Hill

From its origins in the 1840s, Dallas developed quickly into a prosperous market town. After acquiring two railroads in the 1870s, the city became the commercial and financial center of North Central Texas. Early urban development featured competition and cooperation between the city’s business leadership, women’s groups, and coalitions formed by Populists, socialists, and organized labor. Notably, the city’s African Americans were marginalized economically and excluded from civic affairs. By the end of the 1930s, city building became more exclusive even for the white population. Threatened by disputes over Progressive Era social reforms and city planning, the revival of the Ku Klux Klan, and attempts to organize industrial workers, a new generation of business leaders used its control of local media, at-large elections, and repression to dominate civic affairs until the 1970s.

Article

The Department Store  

Traci Parker

Department stores were the epicenter of American consumption and modernity from the late 19th century through the 20th. Between 1846 and 1860, store merchants and commercial impresarios remade dry goods stores and small apparel shops into department stores—downtown emporiums that departmentalized their vast inventories and offered copious services and amenities. Their ascendance corresponded with increased urbanization, immigration, industrialization, and the mass production of machine-made wares. Urbanization and industrialization also helped to birth a new White middle class whose members were eager to spend their money on material comforts and leisure activities. Department stores provided them with a place where they could do so. Stores sold shoppers an astounding array of high-quality, stylish merchandise including clothing, furniture, radios, sporting equipment, musical instruments, luggage, silverware, china, and books. They also provided an array of services and amenities, including public telephones, postal services, shopping assistance, free delivery, telephone-order and mail-order departments, barber shops, hair salons, hospitals and dental offices, radio departments, shoe-shining stands, wedding gift registries and wedding secretary services, tearooms, and restaurants. Stores enthroned consumption as the route to democracy and citizenship, inviting everybody—regardless of race, gender, age, and class—to enter, browse, and purchase material goods. They were major employers of white-collar workers and functioned as a new public space for women as workers and consumers. The 20th century brought rapid and significant changes and challenges. Department stores weathered economic crises; two world wars; new and intense competition from neighborhood, chain, and discount stores; and labor and civil rights protests that threatened to damage their image and displace them as the nation’s top retailers. 
They experienced cutbacks, consolidated services, and declining sales during the Great Depression, played an essential role in the war effort, and contended with the Office of Price Administration’s Emergency Price Control Act during the Second World War. In the postwar era, they opened branch locations in suburban neighborhoods where their preferred clientele—the White middle class—now resided and shaped the development and proliferation of shopping centers. They hastened the decline of downtown shopping as a result. The last three decades of the 20th century witnessed a wave of department store closures, mergers, and acquisitions because of changing consumer behaviors, shifts in the retail landscape, and evolving market dynamics. Department stores would continue to suffer into the 21st century as online retailing exploded.

Article

Employers’ Associations and Open Shops in the United States  

Chad Pearson

Employers began organizing with one another to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops, workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the 20th century’s first three decades, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was mostly successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right-to-work,” insisting that individual laborers must enjoy freedom from so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right-to-work” laws, which made union organizing more difficult because workers were not obligated to join unions or pay their “fair share” of dues to them. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.

Article

Gender Rights and American Employment  

Katherine Turk

Throughout American history, gender, meaning notions of essential differences between women and men, has shaped how Americans have defined and engaged in productive activity. Work has been a key site where gendered inequalities have been produced, but work has also been a crucible for rights claims that have challenged those inequalities. Federal and state governments long played a central role in generating and upholding gendered policy. Workers and advocates have debated whether to advance laboring women’s cause by demanding equality with men or different treatment that accounted for women’s distinct responsibilities and disadvantages. Beginning in the colonial period, constructions of dependence and independence derived from the heterosexual nuclear family underscored a gendered division of labor that assigned distinct tasks to the sexes, albeit varied by race and class. In the 19th century, gendered expectations shaped all workers’ experiences of the Industrial Revolution, slavery and its abolition, and the ideology of free labor. Early 20th-century reform movements sought to beat back the excesses of industrial capitalism by defining the sexes against each other, demanding protective labor laws for white women while framing work done by women of color and men as properly unregulated. Policymakers reinforced this framework in the 1930s as they built a welfare state that was rooted in gendered and racialized constructions of citizenship. In the second half of the 20th century, labor rights claims that reasoned from the sexes’ distinctiveness increasingly gave way to assertions of sex equality, even as the meaning of that equality was contested. As the sex equality paradigm triumphed in the late 20th and early 21st centuries, seismic economic shifts and a conservative business climate narrowed the potential of sex equality laws to deliver substantive changes to workers.

Article

Industrial Workers of the World  

Peter Cole

Perhaps the most important radical labor union in U.S. history, the Industrial Workers of the World (IWW) continues to attract workers, in and beyond the United States. The IWW was founded in 1905 in Chicago—at that time, the greatest industrial city in a country that had become the world’s mightiest economy. Due to the nature of industrial capitalism in what had already become a global economy, the IWW and its ideals quickly became a worldwide phenomenon. The Wobblies, as members were and still are affectionately known, never were as numerically large as mainstream unions, but their influence, particularly from 1905 into the 1920s, was enormous. The IWW captured the imaginations of countless rebellious workers with its fiery rhetoric, daring tactics, and commitment to revolutionary industrial unionism. The IWW pledged to replace the “bread and butter” craft unionism of the larger, more mainstream American Federation of Labor (AFL) with massive industrial unions strong enough to take on ever-larger corporations and, ultimately, to overthrow capitalism, replacing it with a society based upon people rather than profit. In the United States, the union grew in numbers and reputation, before and during World War I, by organizing workers neglected by other unions—immigrant factory workers in the Northeast and Midwest, migratory farmworkers in the Great Plains, and mine, timber, and harvest workers out West. Unlike most other unions of that era, the IWW welcomed immigrants, women, and people of color; indeed, most U.S. institutions excluded African Americans and darker-skinned immigrants as well as women, making the IWW among the most radically inclusive institutions in the country and the world. Wobbly ideas, members, and publications soon spread beyond the United States—first to Mexico and Canada, then into the Caribbean and Latin America, and to Europe, southern Africa, and Australasia in rapid succession. 
The expansion of the IWW and its ideals across the world in under a decade is a testament to the passionate commitment of its members. It also speaks to the immense popularity of anticapitalist tendencies that shared more in common with anarchism than social democracy. However, the IWW’s revolutionary program and class-war rhetoric yielded more enemies than allies, including governments, which proved devastating during and after World War I, though the union soldiered on. Even in 2020, the ideals the IWW espoused continued to resonate among a small but growing and vibrant group of workers, worldwide.

Article

Labor and Black Power  

Austin McCoy

From the early 1960s through the 1970s, Black workers in various economic sectors organized and were inspired by Black Power principles such as community control, self-determination, and racial solidarity. This Black Power unionism utilized an array of strategies and tactics, ranging from direct action and radical class struggle to negotiation and lawsuits, to combat racial discrimination in employment. Black workers in sectors such as construction and the auto and steel industries also utilized strikes, shutdowns, and other forms of protest to combat the intransigence of labor unions that failed to address segregation at the workplace, poor treatment of Black workers, and seniority policies that made work more precarious for them. While Black Power unionism enjoyed some successes—albeit often incomplete—its efforts to enact “affirmative action from below” encountered stiff opposition from employers and unions in the context of the economic and political crises of the 1970s. Ultimately, Black Power unionism exposed the limits of post-Jim Crow desegregation policy in US racial capitalism. Black Power unionism was a political movement that was as salient for Black workers as the Black Panther Party. Although its achievements were limited, its influence far outlived the Black Panther Party itself.

Article

Labor and Unions since 1960  

Erik Loomis

The American labor movement has declined significantly since 1960. Once a powerful part of American life, bringing economic democracy to the nation, organized labor has become a shell of itself, with numbers far lower than a half-century ago. The 1960s began with a powerful movement divided on race but also deeply influenced by the civil rights movement. Deindustrialization and capital mobility cut into labor’s power after 1965 as factories closed. The rise of public sector unionism in the 1970s briefly gave labor new power, but private sector unions faced enormous internal dissension throughout that decade. The Reagan administration ushered in a new era of warfare against organized labor when the president fired the striking air traffic controllers in 1981. Soon, private sector employers engaged in brutal anti-union campaigns. Reforms within labor in the 1990s sought to renew the movement’s long tradition of organizing, but with mixed success at best. Since the 1980s, organized labor has faced mounting attacks, especially Republican-led campaigns against public sector union rights beginning in 2011 that culminated in the 2018 Supreme Court ruling that declared required dues for non-union members unconstitutional. Labor’s decline has led to a new era of income inequality but also brought a stronger class-centric politics back into American life as everyday people seek new answers to the tenuousness of their economic lives.

Article

Labor Day and the American Working Class  

Donna T. Haverty-Stacke

The first Labor Day parade was held on September 5, 1882, in New York City. It, and the annual holiday demonstrations that followed in that decade and the next, resulted from the growth of the modern organized labor movement that took place in the context of the second industrial revolution. These first Labor Day celebrations also became part of the then-ongoing ideological and tactical divisions within that movement. By the early 1900s, workers’ desire to enjoy the fruits of their labor by participating in popular leisure pursuits came to characterize the day. But union leaders, who considered such leisure pursuits a distraction from displays of union solidarity, continued to encourage the organization of parades. With the protections afforded to organized labor by the New Deal, and with the gains made during and after World War II (particularly among unionized white, male, industrial laborers), Labor Day parades declined further after 1945 as workers enjoyed access to mass cultural pursuits, increasingly in suburban settings. This decline was indicative of a broader loss of union movement culture that had served to build solidarity within unions, display working-class militancy to employers, and communicate the legitimacy of organized labor to the American public. From time to time since the late 1970s, unions have attempted to reclaim Labor Day as an occasion for making concerted demands through displays of workers’ united power; but for most Americans the holiday has become part of a three-day weekend devoted to shopping or leisure that marks the end of the summer season.

Article

Latino Labor in the US Food Industry, 1880–2020  

Lori A. Flores

If one considers all the links in the food chain—from crop cultivation to harvesting to processing to transportation to provision and service—millions of workers are required to get food from fields and farms to our grocery stores, restaurants, and kitchen tables. One out of every seven workers in the United States performs a job related in some way to food, whether it is in direct on-farm employment, in stores, in eating/drinking establishments, or in other agriculture-related sectors. According to demographic breakdowns of US food labor, people of color and immigrants (of varying legal and citizenship statuses) hold the majority of low-wage jobs in the US food system. Since the late 19th century Latinos (people of Latin American descent living in the United States) have played a tremendous role in powering the nation’s food industry. In the Southwest, Mexicans and Mexican Americans have historically worked as farmworkers, street vendors, restaurateurs, and employees in food factories. The Bracero Program (1942–1964) only strengthened the pattern of hiring Latinos as food workers by importing a steady stream of Mexican guest workers into fields, orchards, and vineyards across all regions of the United States. Meanwhile, mid-20th-century Puerto Rican agricultural guest workers served the farms and food processing factories of the Midwest and East Coast. In the late 20th and early 21st centuries, Central American food labor has become more noticeable in restaurants, the meat and seafood industries, and street food vending. It is deeply ironic, then, that the workers who help to nourish us and get our food to us go so unnourished themselves. Across the board, food laborers lack many privileges and basic rights. There is still no federal minimum wage for the almost three million farmworkers who labor in the nation’s fruit orchards, vineyards, and vegetable fields. 
Farmworkers (who are overwhelmingly Latino and undocumented) earn very low wages and face various health risks from pesticide exposure, extreme weather, a lack of nutritious, affordable food and potable water, substandard and unsanitary housing conditions, workplace abuse, unsafe transportation, and sexual harassment and assault. Other kinds of food workers—such as restaurant workers and street vendors—experience similar economic precarity and physical/social invisibility. While many of these substandard conditions exist because of employer decisions about costs and the treatment of their workers, American consumers seeking the lowest prices for food are also caught up in this cycle of exploitation. To stay competitive and profitable in supplying grocery stores, restaurants, and the American public, farmers and food distributors trim costs wherever they can, which often negatively impacts the wages and conditions of those working the hardest at the bottom of the national food chain. To push back against these forms of exploitation, food entrepreneurs, worker unions, and other advocates have vocally supported Latinos in the US food industry and tried to address problems ranging from xenophobia to human trafficking.

Article

Latinx Business and Entrepreneurship  

Pedro A. Regalado

Entrepreneurship has been a basic element of Latinx life in the United States since long before the nation’s founding, varying in scale and cutting across race, class, and gender to different degrees. Indigenous forms of commerce pre-dated Spanish contact in the Americas and continued thereafter. Beginning in the 16th century, the raising, trading, and production of cattle and cattle-related products became foundational to Spanish, Mexican, and later American Southwest society and culture. By the 19th century, Latinxs in US metropolitan areas began to establish enterprises in the form of storefronts, warehouses, factories, as well as smaller ventures including peddling. At times, they succeeded previous ethnic owners; in other moments, they established new businesses that shaped everyday life and politics of their respective communities. Whatever the scale of their ventures, Latinx business owners continued to capitalize on the migration of Latinx people to the United States from Latin America and the Caribbean during the 20th century. These entrepreneurs entered business for different reasons, often responding to restricted or constrained labor options, though many sought the flexibility that entrepreneurship offered. Despite an increasing association between Latinx people and entrepreneurship, profits from Latinx ventures produced uneven results during the second half of the 20th century. For some, finance and business ownership has generated immense wealth and political influence. For others at the margins of society, it has remained a tool for achieving sustenance amid the variability of a racially stratified labor market. No monolithic account can wholly capture the vastness and complexity of Latinx economic activity. Latinx business and entrepreneurship remains a vital piece of the place-making and politics of the US Latinx population. This article provides an overview of major trends and pivotal moments in its rich history.

Article

McCarthyism and the Second Red Scare  

Landon R. Y. Storrs

The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States. The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. 
The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.

Article

Municipal Housing in America  

Margaret Garb

Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people. Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness. Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether a single-family or multifamily dwelling—is typically the largest investment made by an American family. 
Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.