During the Holocene, the present geological epoch, an increasing portion of humans began to manipulate the reproduction of plants and animals in a series of environmental practices known as agriculture. No other ecological relationship sustains as many humans as farming; no other has transformed the landscape to the same extent. The domestication of plants by American Indians followed the end of the last glacial maximum (the Ice Age). About eight thousand years ago, the first domesticated maize and squash arrived from central Mexico, spreading to every region and as far north as the subarctic boreal forest. The incursion of Europeans into North America set off widespread deforestation, soil depletion, and the spread of settlement, followed by the introduction of industrial machines and chemicals. A series of institutions sponsored publicly funded research into fertilizers and insecticides. By the late 19th century, writers and activists criticized the technological transformation of farming as destructive to the environment and rural society. During the 20th century, wind erosion contributed to the depopulation of much of the Great Plains. Vast projects in environmental engineering transformed deserts into highly productive regions of intensive fruit and vegetable production. Throughout much of the 19th and 20th centuries, access to land remained limited to whites, with American Indians, African Americans, Latinas/os, Chinese, and peoples of other ethnicities attempting to gain farms or hold on to the land they owned.
Two broad periods describe the history of agriculture and the environment in that portion of North America that became the United States. In the first, the environment dominated, forcing humans to adapt during the end of thousands of years of extreme climate variability. In the second, institutional and technological change became more significant, though the environment remained a constant factor against which American agriculture took shape. A related historical pattern within this shift was the capitalist transformation of the United States. For thousands of years, households sustained themselves and exchanged some of what they produced for money. But during the 19th century, commodity production became the central purpose of agriculture for a majority of American farmers, transforming environments to reflect commercial opportunity.
Michael C. C. Adams
On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. Unemployment was absorbed by the armed services and war work. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history.
Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the work force, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.
The first forty years of cinema in the United States, from the development and commercialization of modern motion picture technology in the mid-1890s to the full blossoming of sound-era Hollywood during the early 1930s, represents one of the most consequential periods in the history of the medium. It was a time of tremendous artistic and economic transformation, including but not limited to the storied transition from silent motion pictures to “the talkies” in the late 1920s.
Though the nomenclature of the silent era implies a relatively unified period in film history, the years before the transition to sound saw a succession of important changes in film artistry and its means of production, and film historians generally regard the epoch as divided into at least three distinct periods. During the period of early cinema, which lasted about a decade from the medium’s emergence in the mid-1890s through the middle years of the new century’s first decade, motion pictures existed primarily as a novelty amusement presented in vaudeville theatres and carnival fairgrounds. Film historians Tom Gunning and André Gaudreault have famously defined the aesthetic of this period as a “cinema of attractions,” in which the technology of recording and reproducing the world, along with the new ways in which it could frame, orient, and manipulate time and space, marked the primary concerns of the medium’s artists and spectators.
A transitional period followed from around 1907 to the later 1910s when changes in the distribution model for motion pictures enabled the development of purpose-built exhibition halls and led to a marked increase in demand for the entertainment. On a formal and artistic level, the period saw a rise in the prominence of the story film and widespread experimentation with new techniques of cinematography and editing, many of which would become foundational to later cinematic style. The era also witnessed the introduction and growing prominence of feature-length filmmaking over narrative shorts. The production side was marked by intensifying competition between the original American motion picture studios based in and around New York City, several of which attempted to cement their influence by forming an oligopolistic trust, and a number of upstart “independent” West Coast studios located around Los Angeles.
Both the artistic and production trends of the transitional period came to a head during the classical era that followed, when the visual experimentation of the previous years consolidated into the “classical style” favored by the major studios, and the competition between East Coast and West Coast studios resolved definitively in favor of the latter. This was the era of Hollywood’s ascendance over domestic filmmaking in the United States and its growing influence over worldwide film markets, due in part to the decimation of the European film industry during World War I. After nearly a decade of dominance, the Hollywood studio system was so refined that the advent of marketable synchronized sound technology around 1927 produced relatively few upheavals among the coterie of top studios. Rather, the American film industry managed to reorient itself around the production of talking motion pictures so swiftly that silent film production in the United States had effectively ceased at any appreciable scale by 1929.
Artistically, the early years of “the talkies” proved challenging, as filmmakers struggled with the imperfections of early recording technology and the limitations they imposed on filmmaking practice. But filmgoing remained popular in the United States even during the depths of the Great Depression, and by the early 1930s a combination of improved technology and artistic adaptation led to such a marked increase in quality that many film historians regard the period as the beginning of Hollywood’s Golden Era. With a new voluntary production code put in place to respond to criticism of immorality in Hollywood fare, the American film industry was poised by the early 1930s to solidify its prominent position in American cultural life.
Helen Zoe Veit
The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease.
American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to shop for food less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change probably magnified the sense of transformation for the many Americans who experienced it.
Early 20th century American labor and working-class history is a subfield of American social history that focuses attention on the complex lives of working people in a rapidly changing global political and economic system. Once focused closely on institutional dynamics in the workplace and electoral politics, labor history has expanded and refined its approach to include questions about the families, communities, identities, and cultures workers have developed over time. With a critical eye on the limits of liberal capitalism and democracy for workers’ welfare, labor historians explore individual and collective struggles against exclusion from opportunity, as well as accommodation to political and economic contexts defined by rapid and volatile growth and deep inequality.
Particularly important are the ways that workers both defined and were defined by differences of race, gender, ethnicity, class, and place. Individual workers and organized groups of working Americans both transformed and were transformed by the main struggles of the industrial era, including conflicts over the place of former slaves and their descendants in the United States, mass immigration and migrations, technological change, new management and business models, the development of a consumer economy, the rise of a more active federal government, and the evolution of popular culture.
The period between 1896 and 1945 saw a crucial transition in the labor and working-class history of the United States. At its outset, Americans were working many more hours a day than the eight for which they had fought hard in the late 19th century. On average, Americans labored fifty-four to sixty-three hours per week in dangerous working conditions (approximately 35,000 workers died in accidents annually at the turn of the century). By 1920, half of all Americans lived in growing urban neighborhoods, and for many of them chronic unemployment, poverty, and deep social divides had become a regular part of life. Workers had little power in either the Democratic or Republican party. They faced a legal system that gave them no rights at work but the right to quit, judges who took the side of employers in the labor market by issuing thousands of injunctions against even nonviolent workers’ organizing, and vigilantes and police forces that did not hesitate to repress dissent violently. The ranks of organized labor were shrinking in the years before the economy began to recover in 1897. Dreams of a more democratic alternative to wage labor and corporate-dominated capitalism had been all but destroyed. Workers struggled to find their place in an emerging consumer-oriented culture that assumed everyone ought to strive for the often unattainable, and not necessarily desirable, marks of middle-class respectability.
Yet American labor emerged from World War II with the main sectors of the industrial economy organized, with greater earning potential than any previous generation of American workers, and with unprecedented power as an organized interest group that could appeal to the federal government to promote its welfare. Though American workers as a whole had made no grand challenge to the nation’s basic corporate-centered political economy in the preceding four and one-half decades, they entered the postwar world with a greater level of power, and a bigger share in the proceeds of a booming economy, than anyone could have imagined in 1896. The labor and working-class history of the United States between 1900 and 1945, then, is the story of how working-class individuals, families, and communities—members of an extremely diverse American working class—managed to carve out positions of political, economic, and cultural influence, even as they remained divided among themselves, dependent upon corporate power, and increasingly invested in an individualistic, competitive, acquisitive culture.
The story of mass culture from 1900 to 1945 is the story of its growth and increasing centrality to American life. Sparked by the development of such new media as radios, phonographs, and cinema that required less literacy and formal education, and the commodification of leisure pursuits, mass culture extended its purview to nearly the entire nation by the end of the Second World War. In the process, it became one way in which immigrant and second-generation Americans could learn about the United States and stake a claim to participation in civic and social life. Mass culture characteristically consisted of artifacts that stressed pleasure, sensation, and glamor rather than, as had previously been the case, eternal and ethereal beauty, moral propriety, and personal transcendence. It had the power to determine acceptable values and beliefs and define qualities and characteristics of social groups. The constant and graphic stimulation that mass culture provided led many custodians of culture to worry that a breakdown in social morality would surely follow. As a result, they formed regulatory agencies and watchdogs to monitor the mass culture available on the market. Other critics charged the regime of mass culture with inducing homogenization of belief and practice and contributing to passive acceptance of the status quo. The spread of mass culture did not terminate regional, class, or racial cultures; indeed, mass culture artifacts often borrowed them. Nor did marginalized groups accept stereotypical portrayals; rather, they worked to expand the possibilities of prevailing ones and to provide alternatives.
Michael A. Krysko
Radio debuted as a wireless alternative to telegraphy in the late 19th century. At its inception, wireless technology could only transmit signals and was incapable of broadcasting actual voices. During the 1920s, however, it transformed into a medium primarily identified as one used for entertainment and informational broadcasting. The commercialization of American broadcasting, which included the establishment of national networks and reliance on advertising to generate revenue, became the so-called American system of broadcasting. This transformation demonstrates how technology is shaped by the dynamic forces of the society in which it is embedded. Broadcasting’s aural attributes also engaged listeners in a way that distinguished it from other forms of mass media. Cognitive processes triggered by the disembodied voices and sounds emanating from radio’s loudspeakers illustrate how listeners, grounded in particular social, cultural, economic, and political contexts, made sense of and understood the content with which they were engaged. Through the 1940s, difficulties in expanding the international radio presence of the United States further highlight the significance of surrounding contexts in shaping the technology and in promoting (or discouraging) listener engagement with programming content.
As places of dense habitation, cities have always required coordination and planning. City planning has involved the design and construction of large-scale infrastructure projects to provide basic necessities such as a water supply and drainage. By the 1850s, immigration and industrialization were fueling the rise of big cities, creating immense, collective problems of epidemics, slums, pollution, gridlock, and crime. From the 1850s to the 1900s, both local governments and utility companies responded to this explosive physical and demographic growth by constructing a “networked city” of modern technologies such as gaslight, telephones, and electricity. Building the urban environment also became a wellspring of innovation in science, medicine, and administration. In 1909–1910, a revolutionary idea—comprehensive city planning—opened a new era of professionalization and institutionalization in the planning departments of city halls and universities. Over the next thirty-five years, however, wars and depression limited their influence.
The period from 1945 to 1965, in contrast, represents the golden age of formal planning. During this unprecedented period of peace and prosperity, academically trained experts played central roles in the modernization of the inner cities and the sprawl of the suburbs. But the planners’ clean-sweep approach to urban renewal and the massive destruction caused by highway construction provoked a revolt of the grassroots. Beginning in the Watts district of Los Angeles in 1965, mass uprisings escalated over the next three years into a national crisis of social disorder, racial and ethnic inequality, and environmental injustice. The postwar consensus of theory and practice was shattered, replaced by a fragmented profession ranging from defenders of top-down systems of computer-generated simulations to proponents of advocacy planning from the bottom up. Since the late 1980s, the ascendancy of public-private partnerships in building the urban environment has favored the planners promoting systems approaches, who promise a future of high-tech “smart cities” under their complete control.
Judy Yung and Erika Lee
The Angel Island Immigration Station (1910–1940), located in San Francisco Bay, was one of twenty-four ports of entry established by the U.S. government to process and detain immigrants entering and leaving the country. Although popularly called the “Ellis Island of the West,” the Angel Island station was in fact quite different from its counterpart in New York. Ellis Island was built in 1892 to welcome European immigrants and to enforce immigration laws that restricted but did not exclude European immigrants. In contrast, as the primary gateway for Chinese and other Asian immigrants, the Angel Island station was built in 1910 to better enforce discriminatory immigration policies that targeted Asians for exclusion. Chinese immigrants, in particular, were subjected to longer physical exams, interrogations, and detentions than any other immigrant group. Out of frustration, anger, and despair, many of them wrote and carved Chinese poems into the barrack walls. In 1940, a fire destroyed the administration building, and the immigration station was moved back to San Francisco. In 1963, the abandoned site became part of the state park system, and the remaining buildings were slated for demolition. Thanks to the collective efforts of Asian American activists and descendants of former detainees, the U.S. Immigration Station at Angel Island was designated a National Historic Landmark in 1997, and the immigration site, including the Chinese poetry on the barrack walls, was preserved and transformed into a museum of Pacific immigration for visitors.
Anna May Wong (January 3, 1905–February 3, 1961) was the first Chinese American movie star and the first Asian American actress to gain international recognition. Wong broke the codes of yellowface in both American and European cinema to become one of the major global actresses of Asian descent between the world wars. She made close to sixty films that circulated around the world and in 1951 starred in her own television show, The Gallery of Madame Liu-Tsong, produced by the now-defunct DuMont Network. Examining Wong’s career is particularly fruitful because of race’s centrality to the motion pictures’ construction of the modern American nation-state, as well as its significance within the global circulation of moving images.
Born near Los Angeles’s Chinatown, Wong began acting in films at an early age. During the silent era, she starred in films such as The Toll of the Sea (1922), one of the first two-color Technicolor films, and The Thief of Bagdad (1924). Frustrated by Hollywood roles, Wong left for Europe in the late 1920s, where she starred in several films and plays, including Piccadilly (1929) and A Circle of Chalk (1929) opposite Laurence Olivier. Wong traveled between the United States and Europe for film and stage work. In 1935 she protested Metro-Goldwyn-Mayer’s refusal to consider her for the leading role of O-Lan in the Academy Award–winning film The Good Earth (1937). Wong then paid her one and only visit to China. In the late 1930s, she starred in several B films such as King of Chinatown (1939), graced the cover of the mass-circulating American magazine Look, and traveled to Australia. In 1961, Wong died of Laennec’s cirrhosis, a disease typically stemming from alcoholism. Yet, as her legacy shows, for a brief moment a glamorous Chinese American woman occupied a position of transnational importance.
Stacy D. Fahrenthold
Between 1880 and 1924, an estimated half million Arab migrants left the Ottoman Empire to live and work in the Americas. Responding to new economic forces linking the Mediterranean and Atlantic capitalist economies to one another, Arab migrants entered the manufacturing industries of the settler societies they inhabited, including industrial textiles, small-scale commerce (peddling), heavy machining, and migrant services associated with continued immigration from the Middle East. The Ottoman Empire enacted few policies to halt emigration from Syria, Mount Lebanon, and Palestine, instead facilitating a remittance economy that enhanced the emerging cash economies of the Arab world. After 1920, the French Mandate in Syria and Lebanon moved to limit new migration to the Americas, working together with increasingly restrictive immigration regimes in the United States, Argentina, and Brazil to halt Arab labor immigration. Using informal archives, the Arab American press, and the records of diasporic mutual aid and philanthropic societies, new research in Arab American migration illustrates how migrants managed a transnational labor economy and confronted challenges presented by American nativism, travel restriction, and interwar deportations.
Akram Fouad Khater
Between 1880 and 1940, more than 130,000 Arabs immigrated to the United States as part of the Great Migration of the long 19th century. They lived and worked across the breadth of the United States, fought its many wars, and were engaged in the transformative debates about labor, race, gender, and citizenship that raged throughout this time period. As they struggled to carve out a place in “Amirka” they encountered and fought efforts to racialize them as the uncivilized and undesirable “Other.” Their struggles not only contributed to shaping the United States and its immigration policies, but also confronted them with the conundrum of how to belong: to accept and seek admission into the existing system delineated by race, gender, and class, or to challenge the premises of that system. While there was not a singular response from this diverse community, the majority opted to fight for a place in “white” America even if in return this rendered them a liminal ethnicity.
Asian women, the immigrant generation, entered Hawai’i, when it was a kingdom and subsequently a US territory, and the Western US continent, from the 1840s to the 1930s as part of a global movement of people escaping imperial wars, colonialism, and homeland disorder. Most were wives or picture brides from China, Japan, Korea, the Philippines, and South Asia, joining menfolk who worked overseas to escape poverty and strife. Women also arrived independently, some settling on the East Coast. US immigration laws restricting the entry of Asian male laborers also limited Asian women.
Asian women were critical for establishing Asian American families and ensuring such households’ survival and social mobility. They worked on plantations, in agricultural fields and canneries, as domestics and seamstresses, and helped operate family businesses, while doing housework, raising children, and navigating cultural differences. Their activities gave women more power in their families than tradition had allowed and shifted gender roles toward more egalitarian households. Women’s organizations, and women’s leadership, ideas, and skills contributed to ethnic community formation.
Second-generation (US-born) Asian American women grew up in the late 19th and early 20th centuries and negotiated generational as well as cultural differences. Some were of mixed race, whether biracial or multiracial. Denied participation in many aspects of American youth culture, they formed ethnic-based clubs and organizations and held social activities that mirrored mainstream society. Some attended college. A few broke new ground professionally.
Asian and Asian American women were diverse in national origin, class, and location. Both generations faced race and gender boundaries in education, employment, and public spaces, and they were active in civic affairs to improve their lives and their communities’ well-being. Across America, they marched, made speeches, and raised funds to free their homelands from foreign occupation and fought for racial and gender equality in the courts, workplaces, and elsewhere.
Brandon R. Byrd
Black internationalism describes the political culture and intellectual practice forged in response to slavery, colonialism, and white imperialism. It is a historical and ongoing collective struggle against racial oppression rooted in global consciousness. While the expression of black internationalism has certainly changed across time and place, black liberation through collaboration has been and remains its ultimate goal.
Since the emergence of black internationalism as a result of the transatlantic slave trade and during the Age of Revolutions, black women such as the poet Phillis Wheatley and evangelist Rebecca Protten have been at its forefront. Their writings and activism espoused an Afro-diasporic, global consciousness and promoted the cause of universal emancipation. During the 19th century, black women internationalists included abolitionists, missionaries, and clubwomen. They built on the work of their predecessors while laying the foundations for succeeding black women internationalists in the early 20th century. By World War I, a new generation of black women activists and intellectuals remained crucial parts of the International Council of Women, an organization founded by white suffragists from the United States, and the Universal Negro Improvement Association, a global organization formally led by Jamaican pan-Africanist Marcus Garvey. But they also formed an independent organization, the International Council of Women of the Darker Races (ICWDR).
Within and outside of the ICWDR, black women from Africa and the African Diaspora faced and challenged discrimination on the basis of their sex and race. Their activism and intellectual work set a powerful precedent for a subsequent wave of black internationalism shaped by self-avowed black feminists.
Ana Elizabeth Rosas
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History.
On August 4, 1942, the Mexican and U.S. governments launched the bi-national guest worker program most commonly known as the Bracero Program. An estimated five million Mexican men between the ages of 19 and 45 separated from their families for three-to-nine-month contract cycles at a time, in anticipation of earning the prevailing U.S. wage this program had promised them. They labored in U.S. agriculture, railroad construction, and forestry, with hardly any employment protections or rights in place to support themselves and the families they had left behind in Mexico. The inhumane configuration and implementation of this program prevented most of these men and their families from meeting such goals. Instead, the labor exploitation and alienation that characterized this guest worker program and their program participation paved the way for, at best, fragile family relationships. The program lasted twenty-two years and grew in scope despite its negative consequences; Mexican men and their families could not afford to settle for being unemployed in Mexico, nor could they pass up U.S. employment opportunities of any sort. The Mexican and U.S. governments’ persistently negligent management of the Bracero Program, coupled with their conveniently selective acknowledgement of the severity of the plight of Mexican women and men, consistently cornered Mexican men and their families into shouldering the full extent of the Bracero Program’s exploitative conditions and terms.
The relationship between the car and the city remains complex and involves numerous private and public forces, innovations in technology, global economic fluctuations, and shifting cultural attitudes that only rarely consider the efficiency of the automobile as a long-term solution to urban transit. The advantages of privacy, speed, ease of access, and personal enjoyment that led many to first embrace the automobile were soon shared and accentuated by transit planners as the surest means to realize the long-held ideals of urban beautification, efficiency, and accessible suburbanization. The remarkable gains in productivity provided by industrial capitalism brought these dreams within reach and individual car ownership became the norm for most American families by the middle of the 20th century. Ironically, the success in creating such a “car country” produced the conditions that again congested traffic, raised questions about the quality of urban (and now suburban) living, and further distanced the nation from alternative transit options. The “hidden costs” of postwar automotive dependency in the United States became more apparent in the late 1960s, leading to federal legislation compelling manufacturers and transit professionals to address the long-standing inefficiencies of the car. This most recent phase coincides with a broader reappraisal of life in the city and a growing recognition of the material limits to mass automobility.
John D. Fairfield
The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at the World’s Columbian Exposition in Chicago in 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession.
Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.
Communist activists took a strong interest in American trade unions from the 1920s through the 1950s and played an important role in shaping the nature of the American union movement. Initial communist trade union activism drew upon radical labor traditions that preceded the formation of the American Communist Party (CPUSA). Early communist trade unionists experimented with different types of structures to organize unorganized workers. They also struggled with international communist factionalism. Communist trade unionists were most effective during the Great Depression and World War II. In those years, communist activists helped build the Congress of Industrial Organizations (CIO) and bring industrial unionism to previously unorganized workers. Throughout the history of communist involvement in the US labor movement, international communist policy guided general organizing strategies. Shifts in international policy, such as the announcement of a Soviet non-aggression pact with Germany, proved politically difficult to navigate on the local level. Yet, Left-led unions proved to be more democratically run and focused on racial and gender equality than many of those without communist influence. Their leadership supported social justice and militant action. The Cold War years witnessed CIO purges of Left-led unions and federal investigations and arrests of communist trade unionists. Repression from both within and without the labor movement as well as the CPUSA’s own internal policy battles ultimately ended communist trade unionists’ widespread influence on American trade unions.
James R. Barrett
The largest and most important revolutionary socialist organization in US history, the Communist Party USA was always a minority influence. It reached considerable size and influence, however, during the Great Depression and World War II years, when it followed the more open line associated with the term “Popular Front.” In these years communists were much more flexible in their strategies and relations with other groups, though the party remained a hierarchical vanguard organization. It grew from a largely isolated sect dominated by unskilled and unemployed immigrant men in the 1920s to a socially diverse movement of nearly 100,000 by the late 1930s and during World War II, based heavily on American-born men and women from the working and professional classes and exerting considerable influence in the labor movement and American cultural life. In these years, the Communist Party helped to build the industrial union movement, advanced the cause of African American civil rights, and laid the foundation for the postwar feminist movement. But the party was always prone to abrupt changes in line and vulnerable to attack as a sinister outside force because of its close adherence to Soviet policies and goals. Several factors contributed to its catastrophic decline in the 1950s: the increasingly antagonistic Cold War struggle between the Soviet Union and the United States; an unprecedented attack from employers and government at various levels—criminal cases and imprisonment, deportation, and blacklisting; and, within the party itself, a turn back toward a more dogmatic version of Marxism-Leninism and a heightened atmosphere of factional conflict and purges.
Distinctive patterns of daily life defined the Jim Crow South. Contrary to many observers’ emphasis on de jure segregation—meaning racial separation demanded by law—neither law nor the physical separation of blacks and whites was at the center of the early 20th-century South’s social system. Instead, separation, whether by law or custom, was one of multiple tools whites used to subordinate and exclude blacks and to maintain notions of white racial purity. In turn, these notions themselves varied over time and across jurisdictions, at least in their details, as elites tried repeatedly to establish who was “white,” who was “black,” and how the legal fictions they created would apply to Native Americans and others who fit neither category.
Within this complex multiracial world of the South, whites’ fundamental commitment to keeping blacks “in their place” manifested most routinely in day-to-day social dramas, often described in terms of racial “etiquette.” The black “place” in question was socially but not always physically distant from whites, and the increasing number of separate, racially marked spaces and actual Jim Crow laws was a development over time that became most pronounced in urban areas. It was a development that reveals blacks’ determination to resist racial oppression and whites’ perceived need to shore up a supposedly natural order that had, in fact, always been enforced by violence as well as political and economic power. Black resistance took many forms, from individual, covert acts of defiance to organized political movements. Whether in response to African Americans’ continued efforts to vote or their early 20th-century boycotts of segregated streetcars or World War I-era patterns of migration that threatened to deplete the agricultural labor force, whites found ways to counter blacks’ demands for equal citizenship and economic opportunity whenever and wherever they appeared.
In the rural South, where the majority of black Southerners remained economically dependent on white landowners, a “culture of personalism” characterized daily life within a paternalistic model of white supremacy that was markedly different from urban—and largely national, not merely southern—racial patterns. Thus, distinctions between rural and urban areas and issues of age and gender are critical to understanding the Jim Crow South. Although schools were rigorously segregated, preadolescent children could be allowed greater interracial intimacy in less official settings. Puberty became a break point after which close contact, especially between black males and white females, was prohibited. All told, Jim Crow was an inconsistent and uneven system of racial distinction and separation whose great reach shaped the South’s landscape and the lives of all Southerners, including those who were neither black nor white.