
Article

Early 20th century American labor and working-class history is a subfield of American social history that focuses attention on the complex lives of working people in a rapidly changing global political and economic system. Once focused closely on institutional dynamics in the workplace and electoral politics, labor history has expanded and refined its approach to include questions about the families, communities, identities, and cultures workers have developed over time. With a critical eye on the limits of liberal capitalism and democracy for workers’ welfare, labor historians explore individual and collective struggles against exclusion from opportunity, as well as accommodation to political and economic contexts defined by rapid and volatile growth and deep inequality. Particularly important are the ways that workers both defined and were defined by differences of race, gender, ethnicity, class, and place. Individual workers and organized groups of working Americans both transformed and were transformed by the main struggles of the industrial era, including conflicts over the place of former slaves and their descendants in the United States, mass immigration and migrations, technological change, new management and business models, the development of a consumer economy, the rise of a more active federal government, and the evolution of popular culture. The period between 1896 and 1945 saw a crucial transition in the labor and working-class history of the United States. At its outset, Americans were working many more hours a day than the eight for which they had fought hard in the late 19th century. On average, Americans labored fifty-four to sixty-three hours per week in dangerous working conditions (approximately 35,000 workers died in accidents annually at the turn of the century). By 1920, half of all Americans lived in growing urban neighborhoods, and for many of them chronic unemployment, poverty, and deep social divides had become a regular part of life. 
Workers had little power in either the Democratic or Republican party. They faced a legal system that gave them no rights at work but the right to quit, judges who took the side of employers in the labor market by issuing thousands of injunctions against even nonviolent workers’ organizing, and vigilantes and police forces that did not hesitate to repress dissent violently. The ranks of organized labor were shrinking in the years before the economy began to recover in 1897. Dreams of a more democratic alternative to wage labor and corporate-dominated capitalism had been all but destroyed. Workers struggled to find their place in an emerging consumer-oriented culture that assumed everyone ought to strive for the often unattainable, and not necessarily desirable, marks of middle-class respectability. Yet American labor emerged from World War II with the main sectors of the industrial economy organized, with greater earning potential than any previous generation of American workers, and with unprecedented power as an organized interest group that could appeal to the federal government to promote its welfare. Though American workers as a whole had made no grand challenge to the nation’s basic corporate-centered political economy in the preceding four and one-half decades, they entered the postwar world with a greater level of power, and a bigger share in the proceeds of a booming economy, than anyone could have imagined in 1896. The labor and working-class history of the United States between 1900 and 1945, then, is the story of how working-class individuals, families, and communities—members of an extremely diverse American working class—managed to carve out positions of political, economic, and cultural influence, even as they remained divided among themselves, dependent upon corporate power, and increasingly invested in an individualistic, competitive, acquisitive culture.

Article

The story of mass culture from 1900 to 1945 is the story of its growth and increasing centrality to American life. Sparked by the development of such new media as radio, the phonograph, and cinema, which required less literacy and formal education, and by the commodification of leisure pursuits, mass culture extended its purview to nearly the entire nation by the end of the Second World War. In the process, it became one way in which immigrant and second-generation Americans could learn about the United States and stake a claim to participation in civic and social life. Mass culture characteristically consisted of artifacts that stressed pleasure, sensation, and glamor rather than, as had previously been the case, eternal and ethereal beauty, moral propriety, and personal transcendence. It had the power to determine acceptable values and beliefs and define qualities and characteristics of social groups. The constant and graphic stimulation that mass culture provided led many custodians of culture to worry that a breakdown in social morality would surely follow. As a result, they formed regulatory agencies and watchdogs to monitor the mass culture available on the market. Other critics charged the regime of mass culture with inducing homogenization of belief and practice and contributing to passive acceptance of the status quo. The spread of mass culture did not terminate regional, class, or racial cultures; indeed, mass culture artifacts often borrowed from them. Nor did marginalized groups accept stereotypical portrayals; rather, they worked to expand the possibilities of prevailing ones and to provide alternatives.

Article

American activists who challenged South African apartheid during the Cold War era extended their opposition to racial discrimination in the United States into world politics. US antiapartheid organizations worked in solidarity with forces struggling against the racist regime in South Africa and played a significant role in the global antiapartheid movement. More than four decades of organizing preceded the legislative showdown of 1986, when a bipartisan coalition in Congress overrode President Ronald Reagan’s veto to enact economic sanctions against the apartheid regime in South Africa. Adoption of sanctions by the United States, along with transnational solidarity with the resistance to apartheid by South Africans, helped prompt the apartheid regime to relinquish power and allow the democratic elections that brought Nelson Mandela and the African National Congress to power in 1994. Drawing on the tactics, strategies, and moral authority of the civil rights movement, antiapartheid campaigners mobilized public opinion while increasing African American influence in the formulation of US foreign policy. Long-lasting organizations such as the American Committee on Africa and TransAfrica called for boycotts and divestment while lobbying for economic sanctions. Utilizing tactics such as rallies, demonstrations, and nonviolent civil disobedience, antiapartheid activists made their voices heard on college campuses, in corporate boardrooms, in municipal and state governments, and in the halls of Congress. Cultural expressions of criticism and resistance served to reinforce public sentiment against apartheid. Novels, plays, movies, and music provided a way for Americans to connect to the struggles of those suffering under apartheid.
By extending the moral logic of the movement for African American civil rights, American antiapartheid activists created a multicultural coalition that brought about institutional and governmental divestment from apartheid, prompted Congress to impose economic sanctions on South Africa, and increased the influence of African Americans regarding issues of race and American foreign policy.

Article

Radio debuted as a wireless alternative to telegraphy in the late 19th century. At its inception, wireless technology could only transmit signals and was incapable of broadcasting actual voices. During the 1920s, however, it was transformed into a medium identified primarily with entertainment and informational broadcasting. The commercialization of American broadcasting, which included the establishment of national networks and reliance on advertising to generate revenue, became the so-called American system of broadcasting. This transformation demonstrates how technology is shaped by the dynamic forces of the society in which it is embedded. Broadcasting’s aural attributes also engaged listeners in a way that distinguished it from other forms of mass media. Cognitive processes triggered by the disembodied voices and sounds emanating from radio’s loudspeakers illustrate how listeners, grounded in particular social, cultural, economic, and political contexts, made sense of and understood the content with which they were engaged. Through the 1940s, difficulties in expanding the international radio presence of the United States further highlight the significance of surrounding contexts in shaping the technology and in promoting (or discouraging) listener engagement with programming content.

Article

As places of dense habitation, cities have always required coordination and planning. City planning has involved the design and construction of large-scale infrastructure projects to provide basic necessities such as a water supply and drainage. By the 1850s, immigration and industrialization were fueling the rise of big cities, creating immense, collective problems of epidemics, slums, pollution, gridlock, and crime. From the 1850s to the 1900s, both local governments and utility companies responded to this explosive physical and demographic growth by constructing a “networked city” of modern technologies such as gaslight, telephones, and electricity. Building the urban environment also became a wellspring of innovation in science, medicine, and administration. In 1909–1910, a revolutionary idea—comprehensive city planning—opened a new era of professionalization and institutionalization in the planning departments of city halls and universities. Over the next thirty-five years, however, wars and depression limited their influence. The period from 1945 to 1965, in contrast, represents the golden age of formal planning. During this unprecedented period of peace and prosperity, academically trained experts played central roles in the modernization of the inner cities and the sprawl of the suburbs. But the planners’ clean-sweep approach to urban renewal and the massive destruction caused by highway construction provoked a revolt of the grassroots. Beginning in the Watts district of Los Angeles in 1965, mass uprisings escalated over the next three years into a national crisis of social disorder, racial and ethnic inequality, and environmental injustice. The postwar consensus of theory and practice was shattered, replaced by a fragmented profession ranging from defenders of top-down systems of computer-generated simulations to proponents of advocacy planning from the bottom up.
Since the late 1980s, the ascendancy of public-private partnerships in building the urban environment has favored the planners promoting systems approaches, who promise a future of high-tech “smart cities” under their complete control.

Article

The American War for Independence lasted eight years. It was one of the longest and bloodiest wars in America’s history, and yet it was not such a protracted conflict merely because the might of the British armed forces was brought to bear on the hapless colonials. The many divisions among Americans themselves over whether to fight, what to fight for, and who would do the fighting often had tragic and violent consequences. The Revolutionary War was by any measure the first American civil war. Yet national narratives of the Revolution and even much of the scholarship on the era focus more on simple stories of a contest between the Patriots and the British. Loyalists and other opponents of the Patriots are routinely left out of these narratives, or given short shrift. So, too, are the tens of thousands of ordinary colonists—perhaps a majority of the population—who were disaffected or alienated from either side or who tried to tack between the two main antagonists to make the best of a bad situation. Historians now estimate that as many as three-fifths of the colonial population were neither active Loyalists nor Patriots. When we take the war seriously and begin to think about narratives that capture the experience of the many, rather than the few, an illuminating picture emerges. The remarkably wide scope of the activities of the disaffected during the war—ranging from nonpayment of taxes to draft dodging and even to armed resistance to protect their neutrality—has to be integrated with older stories of militant Patriots and timid Loyalists. Only then can we understand the profound consequences of disaffection—particularly in creating divisions within the states, increasing levels of violence, prolonging the war, and changing the nature of the political settlements in each state. Indeed, the very divisions among diverse Americans that made the War for Independence so long, bitter, and bloody also explain much of the Revolutionary energy of the period.
Though it is not as seamless as traditional narratives of the Revolution would suggest, a more complicated story also helps better explain the many problems the new states and eventually the new nation would face. In making this argument, we may finally suggest ways we can overcome what John Shy long ago noted as the tendency of scholars to separate the “destructive” War for Independence from the “constructive” political Revolution.

Article

On January 5, 2014—the fiftieth anniversary of President Lyndon Johnson’s launch of the War on Poverty—the New York Times asked a panel of opinion leaders a simple question: “Does the U.S. Need Another War on Poverty?” While the answers varied, all the invited debaters accepted the martial premise of the question—that a war on poverty had been fought and that eliminating poverty was, without a doubt, a “fight,” or a “battle.” Yet the debate over the manner—martial or not—by which the federal government and public policy have dealt with the issue of poverty in the United States is still very much an open one. The evolution and development of the postwar American welfare state is a story not only of a number of “wars,” or individual political initiatives, against poverty, but also of the growth of institutions within and outside government that seek to address, alleviate, and eliminate poverty and its concomitant social ills. It is a complex and at times messy story, interwoven with the wider historical trajectory of this period: civil rights, the rise and fall of a “Cold War consensus,” the emergence of a counterculture, the Vietnam War, the credibility gap, the rise of conservatism, the end of “welfare,” and the emergence of compassionate conservatism. Mirroring the broader organization of the American political system, with a relatively weak center of power and delegated authority and decision-making in fifty states, the welfare model has developed and grown over decades. Policies viewed in one era as unmitigated failures have instead over time evolved and become part of the fabric of the welfare state.

Article

The foreign relations of the Jacksonian age reflected Andrew Jackson’s own sense of the American “nation” as long victimized by non-white enemies and weak politicians. His goal as president from 1829 to 1837 was to restore white Americans’ “sovereignty,” to empower them against other nations both within and beyond US territory. Three priorities emerged from this conviction. First, Jackson was determined to deport the roughly 50,000 Creeks, Cherokees, Choctaws, Chickasaws, and Seminoles living in southern states and territories. He saw them as hostile nations who threatened American safety and checked American prosperity. Far from a domestic issue, Indian Removal was an imperial project that set the stage for later expansion over continental and oceanic frontiers. Second and somewhat paradoxically, Jackson sought better relations with Great Britain. These were necessary because the British Empire was both the main threat to US expansion and the biggest market for slave-grown exports from former Indian lands. Anglo-American détente changed investment patterns and economic development throughout the Western Hemisphere, encouraging American leaders to appease London even when patriotic passions argued otherwise. Third, Jackson wanted to open markets and secure property rights around the globe, by treaty if possible but by force when necessary. He called for a larger navy, pressed countries from France to Mexico for outstanding debts, and embraced retaliatory strikes on “savages” and “pirates” as far away as Sumatra. Indeed, the Jacksonian age brought a new American presence in the Pacific. By the mid-1840s the United States was the dominant power in the Hawaiian Islands and a growing force in China. The Mexican War that followed made the Union a two-ocean colossus—and pushed its regional tensions to the breaking point.

Article

Perhaps no other American leader has experienced so precipitous a fall from grace as Andrew Johnson, seventeenth president of the United States (1865–1869). During the Civil War, Johnson was the preeminent symbol of Southern Unionism—and thus, the ideal running mate for Abraham Lincoln in the 1864 election, on the Union Party ticket. Four years later, as president, he was widely viewed as a traitor to his political allies and even to the Union, and barely escaped removal from office after his impeachment. In modern scholarship, the image persists of Johnson as politically inept and willfully self-destructive—driven by visceral emotions, particularly by implacable racism, and lacking in Lincoln’s skill for reading and molding public opinion. But such an image fails to capture fully the scope of his political influence. As president, Johnson staked claims that shaped the course of Reconstruction—that emancipation signified nothing but freedom; that the immediate aftermath of the Civil War was a golden moment of reconciliation, which Radicals squandered by pushing for black suffrage; that the Congressional program of Reconstruction inaugurated a period of so-called “black rule,” during which former Confederates were victimized and disfranchised. Such propaganda was designed to deem the Republican experiment in black citizenship a failure before it even began. Johnson’s term offers an example as striking as any in U.S. history of the power of presidents to set the terms of political debates—and of the power of their words to do lasting harm.

Article

Judy Yung and Erika Lee

The Angel Island Immigration Station (1910–1940), located in San Francisco Bay, was one of twenty-four ports of entry established by the U.S. government to process and detain immigrants entering and leaving the country. Although popularly called the “Ellis Island of the West,” the Angel Island station was in fact quite different from its counterpart in New York. Ellis Island was built in 1892 to welcome European immigrants and to enforce immigration laws that restricted but did not exclude European immigrants. In contrast, as the primary gateway for Chinese and other Asian immigrants, the Angel Island station was built in 1910 to better enforce discriminatory immigration policies that targeted Asians for exclusion. Chinese immigrants, in particular, were subjected to longer physical exams, interrogations, and detentions than any other immigrant group. Out of frustration, anger, and despair, many of them wrote and carved Chinese poems into the barrack walls. In 1940, a fire destroyed the administration building, and the immigration station was moved back to San Francisco. In 1963, the abandoned site became part of the state park system, and the remaining buildings were slated for demolition. Thanks to the collective efforts of Asian American activists and descendants of former detainees, the U.S. Immigration Station at Angel Island was designated a National Historic Landmark in 1997, and the immigration site, including the Chinese poetry on the barrack walls, was preserved and transformed into a museum of Pacific immigration for visitors.

Article

Episcopalians have built, reimagined, and rebuilt their church at least three different times over the course of 400 years in America. From scattered colonial beginnings, where laity both took major roles in running Church of England parishes and practiced a faith that was focused on worship, pastoral care, and good works, Anglicans created a church that blended hierarchy, democracy, and autonomy. It took time after the disruptions of the American Revolution for Episcopalians to find their place among the many competing denominations of the new nation. In the process women found new roles for themselves. Episcopalians continued to have a large impact on American society even as other denominations outpaced them in membership. As individuals they shaped American culture and became prominent advocates for the social gospel. Distracted at times as they tried to balance the catholic and the Protestant in their thought and worship, they built a church that included both religious orders and revival gatherings. Although it was perceived as a church of the elite, its members included African Americans, Asians, Native Americans, and union members. Episcopalians struggled with issues of race, class, and gender throughout their history. After World War II, their understandings of the teachings of Jesus pulled a majority of Episcopalians toward more liberal social positions and created a traditionalist revolt that eventually resulted in a schism requiring new rebuilding efforts in parts of America.

Article

Anna May Wong (January 3, 1905–February 3, 1961) was the first Chinese American movie star and the first Asian American actress to gain international recognition. Wong broke the codes of yellowface in both American and European cinema to become one of the major global actresses of Asian descent between the world wars. She made close to sixty films that circulated around the world and in 1951 starred in her own television show, The Gallery of Madame Liu-Tsong, produced by the now-defunct DuMont Network. Examining Wong’s career is particularly fruitful because of the centrality of race to motion pictures’ construction of the modern American nation-state, as well as its significance within the global circulation of moving images. Born near Los Angeles’s Chinatown, Wong began acting in films at an early age. During the silent era, she starred in films such as The Toll of the Sea (1922), one of the first two-tone Technicolor films, and The Thief of Bagdad (1924). Frustrated by Hollywood roles, Wong left for Europe in the late 1920s, where she starred in several films and plays, including Piccadilly (1929) and A Circle of Chalk (1929) opposite Laurence Olivier. Wong traveled between the United States and Europe for film and stage work. In 1935 she protested Metro-Goldwyn-Mayer’s refusal to consider her for the leading role of O-Lan in the Academy Award–winning film The Good Earth (1937). Wong then paid her one and only visit to China. In the late 1930s, she starred in several B films such as King of Chinatown (1939), graced the cover of the mass-circulating American magazine Look, and traveled to Australia. In 1961, Wong died of Laennec’s cirrhosis, a disease typically stemming from alcoholism. Yet, as her legacy shows, for a brief moment a glamorous Chinese American woman occupied a position of transnational importance.

Article

Lynn Westerkamp

Anne Hutchinson engaged a diverse group of powerful men as well as the disenfranchised during the mid-1630s in Boston’s so-called Antinomian Controversy, the name given to the theological battle between John Cotton, who emphasized free grace, and other clerics who focused upon preparation for those seeking salvation. Hutchinson followed Cotton’s position, presented his theology in meetings in her home, and inspired her followers, male and female, to reject pastors opposing Cotton’s position. Hutchinson’s followers included leading men who opposed John Winthrop’s leadership of Massachusetts Bay Colony; this dispute also became an arena where Winthrop reasserted his power. Hutchinson represents the Puritans’ drive for spiritual development within, including her claim of revelation. She is best understood within a transatlantic framework illustrating both the tools of patriarchal oppression and, more importantly, the appeal of Puritan spirituality for women.

Article

Joshua L. Rosenbloom

The United States economy underwent major transformations between American independence and the Civil War through rapid population growth, the development of manufacturing, the onset of modern economic growth, increasing urbanization, the rapid spread of settlement into the trans-Appalachian west, and the rise of European immigration. These decades were also characterized by an increasing sectional conflict between free and slave states that culminated in 1861 in Southern secession from the Union and a bloody and destructive Civil War. Labor markets were central to each of these developments, directing the reallocation of labor between sectors and regions, channeling a growing population into productive employment, and shaping the growing North–South division within the country. Put differently, labor markets influenced the pace and character of economic development in the antebellum United States. On the one hand, the responsiveness of labor markets to economic shocks helped promote economic growth; on the other, imperfections in labor market responses to these shocks significantly affected the character and development of the national economy.

Article

Utopia—the term derived from Thomas More’s 1516 volume by that name—always suggested a place that was both non-existent, a product of the imagination usually depicted fictionally as far distant in time or space, and better than the real and familiar world. In modern times, it has served as a mode of anti-capitalist critique and also, despite its supposed “unreality,” as a disposition joined to actual social movements for dramatic reform. Utopian alternatives to American capitalism, both in the sense of literary works projecting visions of ideal social relations and in real efforts to establish viable communitarian settlements, have long been a significant part of the nation’s cultural and political history. In the 1840s, American followers of the French “utopian socialist” Charles Fourier established dozens of communities based at least in part on Fourier’s principles, and those principles filtered down to the world’s most influential modern utopian novel, Edward Bellamy’s Looking Backward of 1888. Utopian community-building and the writing of anti-capitalist utopian texts surged and declined in successive waves from the 19th to the 21st century, and while the recent surges have never equaled the impact of Fourierism or Bellamy, the appeal of the utopian imagination has surfaced again since the Great Recession of 2008 provoked new doubts about the viability or justice of capitalist economic and social relations.

Article

Mark S. Massa, S.J.

Historian John Higham once referred to anti-Catholicism as “by far the oldest, and the most powerful of anti-foreign traditions” in North American intellectual and cultural history. But Higham’s famous observation actually elided three different types of anti-Catholic nativism that have enjoyed a long and quite vibrant life in North America: a cultural distrust of Catholics, based on an understanding of North American public culture rooted in a profoundly British and Protestant ordering of human society; an intellectual distrust of Catholics, based on a set of epistemological and philosophical ideas first elucidated in the English (Lockean) and Scottish (“Common Sense Realist”) Enlightenments and the British Whig tradition of political thought; and a nativist distrust of Catholics as deviant members of American society, a perception central to the Protestant mainstream’s duty of “boundary maintenance” (to utilize Emile Durkheim’s reading of how “outsiders” help “insiders” maintain social control). An examination of the long history of anti-Catholicism in the United States can be divided into three parts: first, an overview of the types of anti-Catholic animus utilizing the typology adumbrated above; second, a narrative history of the most important anti-Catholic events in U.S. culture (e.g., Harvard’s Dudleian Lectures, the Suffolk Resolves, the burning of the Charlestown convent, Maria Monk’s Awful Disclosures); and finally, a discussion of American Catholic efforts to address the animus.

Article

Long regarded as a violent outburst significant mainly for California history, the 1871 Los Angeles anti-Chinese massacre raises themes central to America’s Civil War Reconstruction era between 1865 and 1877, namely, the resort to threats and violence to preserve traditionally conceived social and political authority and power. Although the events occurred far from the American South, the Los Angeles anti-Chinese massacre paralleled the anti-black violence that arose in the South during Reconstruction. Although the immediate causes of the violence in the post–Civil War South and California were far different, they shared one key characteristic: both employed racial disciplining to preserve traditional social orders that old elites saw as threatened by changing times and circumstances.

Article

Robert David Johnson

The birth of the United States through a successful colonial revolution created a unique nation-state in which anti-imperialist sentiment existed from the nation’s founding. Three broad points are essential in understanding the relationship between anti-imperialism and U.S. foreign relations. First, the United States obviously has had more than its share of imperialist ventures over the course of its history. Perhaps the better way to address the matter is to remark on—at least in comparison to other major powers—how intense a commitment to anti-imperialism has remained among some quarters of the American public and government. Second, the strength of anti-imperialist sentiment has varied widely and often has depended upon domestic developments, such as the emergence of abolitionism before the Civil War or the changing nature of the Progressive movement following World War I. Third, anti-imperialist policy alternatives have enjoyed considerably more support in Congress than in the executive branch.

Article

Antimonopoly, meaning opposition to the exclusive or near-exclusive control of an industry or business by one or a very few businesses, played a relatively muted role in the history of the post-1945 era, certainly compared to some earlier periods in American history. However, the subject of antimonopoly is important because it sheds light on changing attitudes toward concentrated power, corporations, and the federal government in the United States after World War II. Paradoxically, as antimonopoly declined as a grass-roots force in American politics, the technical, expert-driven field of antitrust enjoyed a golden age. From the 1940s to the 1960s, antitrust operated on principles that were broadly in line with those that inspired its creation in the late 19th and early 20th century, acknowledging the special contribution small-business owners made to US democratic culture. In these years, antimonopoly remained sufficiently potent as a political force to sustain the careers of national-level politicians such as congressmen Wright Patman and Estes Kefauver and to inform the opinions of Supreme Court justices such as Hugo Black and William O. Douglas. Antimonopoly and consumer politics overlapped in this period. From the mid-1960s onward, Ralph Nader repeatedly tapped antimonopoly ideas in his writings and consumer activism, skillfully exploiting popular anxieties about concentrated economic power. At the same time, as part of the United States’ rise to global hegemony, officials in the federal government’s Antitrust Division exported antitrust overseas, building it into the political, economic, and legal architecture of the postwar world. Beginning in the 1940s, conservative lawyers and economists launched a counterattack against the conception of antitrust elaborated in the progressive era. 
By making consumer welfare—understood in terms of low prices and market efficiency—the determining factor in antitrust cases, they made a major intellectual and political contribution to the rightward thrust of US politics in the 1970s and 1980s. Robert Bork’s The Antitrust Paradox, published in 1978, popularized and signaled the ascendancy of this new approach. In the 1980s and 1990s, antimonopoly drifted to the margin of political debate. Fear of big government now loomed larger in US politics than the specter of monopoly or of corporate domination. In the late 20th century, Americans, more often than not, directed their antipathy toward concentrated power in its public, rather than its private, forms. This fundamental shift in the political landscape accounts in large part for the overall decline of antimonopoly—a venerable American political tradition—in the period 1945 to 2000.

Article

In 1964, President Lyndon B. Johnson announced an unconditional “war on poverty.” On one of his first publicity tours promoting his antipoverty legislation, he traveled to cities and towns in Appalachia, which would become crucial areas for promoting and implementing the legislation. Johnson soon signed the Economic Opportunity Act, a piece of legislation that provided a structure for communities to institute antipoverty programs, from vocational services to early childhood education programs, and encouraged the creation of new initiatives. In 1965, Johnson signed the Appalachian Regional Development Act, making Appalachia the only region targeted by federal antipoverty legislation, through the creation of the Appalachian Regional Commission. The Appalachian War on Poverty can be described as a set of policies created by governmental agencies, but equally crucial to it were the community movements and campaigns, led by working-class people, that responded to antipoverty policies. When the War on Poverty began, the language of policymakers suggested that people living below the poverty line would be served by the programs. But as the antipoverty programs expanded and more local people became involved, they spoke openly and in political terms about poverty as a working-class issue. They drew attention to the politics of class in the region, where elites and absentee landowners became wealthy on the backs of working people. They demanded meaningful participation in shaping the War on Poverty in their communities, and, increasingly, when they used the term “poor people,” they did so as a collective class identity—working people who were poor due to a rigged economy. While many public officials focused on economic development policies, men and women living in the region began organizing around issues ranging from surface mining to labor rights and responding to poor living and working conditions.
Taking advantage of federal antipoverty resources and the spirit of change that animated the 1960s, working-class Appalachians would help to shape the antipoverty programs at the local and regional level, creating a movement in the process. They did so as they organized around issues—including the environment, occupational safety, health, and welfare rights—and as they used antipoverty programs as a platform to address the systemic inequalities that plagued many of their communities.