It is virtually impossible to understand the history of the American experience without Protestantism. The theological and religious descendants of the Protestant Reformation arrived in the United States in the early 17th century, shaped American culture in the 18th century, grew dramatically in the 19th century, and continued to be the guardians of American religious life in the 20th century. Protestantism, of course, is not monolithic. In fact, the very idea at the heart of Protestantism—the translation of the Bible into vernacular languages so it can be read and interpreted by all men and women—has resulted in thousands of different denominations, all claiming to be true to the teachings of scripture.
Protestantism, with its emphasis on the belief that human beings can access God as individuals, flourished in a nation that celebrated democracy and freedom. During the period of British colonization, especially following the so-called Glorious Revolution of 1688, Protestantism went hand in hand with British concepts of political liberty. As the British people celebrated their rights-oriented philosophy of government and compared their freedoms with the tyranny of France and other absolute monarchies in Europe, they also extolled the religious freedom that they had to read and interpret the Bible for themselves. Following the American Revolution, this historic connection between political liberty and Protestant liberty proved to be compatible with the kind of democratic individualism that emerged in the decades preceding the Civil War and, in many respects, continues to define American political culture.
Protestantism, of course, is first and foremost a religious movement. The proliferation of Protestant denominations provides the best support for G. K. Chesterton’s quip that “America is a nation with the soul of a church.” Spiritual individualism, a commitment to the authority of an inspired Bible, and the idea that faith in the Christian gospel is all that is needed to be saved from eternal punishment have transformed the lives of millions and millions of ordinary Americans over the course of the last four hundred years.
Public authorities are agencies created by governments to engage directly in the economy for public purposes. They differ from standard agencies in that they operate outside the administrative framework of democratically accountable government. Since they generate their own operating income by charging users for goods and services and borrow for capital expenses based on projections of future revenues, they can avoid both voter input and the regulations that control public agencies funded by tax revenues.
Institutions built on the public authority model exist at all levels of government and in every state. A few of these enterprises, such as the Tennessee Valley Authority and the Port Authority of New York and New Jersey, are well known. Thousands more toil in relative obscurity, operating toll roads and bridges, airports, transit systems, cargo ports, entertainment venues, sewer and water systems, and even parking garages. Despite their ubiquity, these agencies are not well understood. Many release little information about their internal operations. It is not even possible to say conclusively how many exist, since experts disagree about how to define them, and states do not systematically track them.
One thing we do know about public authorities is that, over the course of the 20th century, these institutions have become a major component of American governance. Immediately following the Second World War, they played a minor role in public finance. But by the early 21st century, borrowing by authorities constituted well over half of all public borrowing at the sub-federal level. This change means that the leaders of these entities, rather than elected officials, increasingly make key decisions about where and how to build public infrastructure and steer economic development in the United States.
D. Bradford Hunt
Public housing emerged during the New Deal as a progressive effort to end the scourge of dilapidated housing in American cities. Reformers argued that the private market had failed to provide decent, safe, and affordable housing, and they convinced Congress to provide deep subsidies to local housing authorities to build and manage modern, low-cost housing projects for the working poor. Well-intentioned but ultimately misguided policy decisions encouraged large-scale developments, concentrated poverty and youth, and starved public housing of needed resources. Further, the antipathy of private interests to public competition and the visceral resistance of white Americans to racial integration saddled public housing with many enemies and few friends. While residents often formed tight communities and fought for improvements, stigmatization and neglect undermined the success of many projects; a sizable fraction became disgraceful and tangible symbols of systemic racism toward the nation’s African American poor. Federal policy had few answers and retreated in the 1960s, eventually making a neoliberal turn to embrace public-private partnerships for delivering affordable housing. Housing vouchers and tax credits effectively displaced the federal public housing program. In the 1990s, the Clinton administration encouraged the demolition and rebuilding of troubled projects using vernacular “New Urbanist” designs to house “mixed-income” populations. Policy problems, political weakness, and an ideology of homeownership in the United States meant that a robust, public-centered program of housing for use rather than profit could not be sustained.
Adam M. Sowards
For more than a century after the republic’s founding in the 1780s, American law reflected the ideal that the commons—the public domain—should be turned into private property. As Americans became concerned about resource scarcity, waste, and monopolies at the end of the 19th century, reform-minded bureaucrats and scientists convinced Congress to maintain in perpetuity some of the nation’s land as public. This shift offered a measure of protection and an alternative to private property regimes. The federal agencies that primarily manage these lands today—U.S. Forest Service (USFS), National Park Service (NPS), U.S. Fish and Wildlife Service (USFWS), and Bureau of Land Management (BLM)—have worked since their origins in the early decades of the 20th century to fulfill their diverse, competing, evolving missions. Meanwhile, the public and Congress have continually demanded new and different goals as the land itself has functioned and responded in interdependent ways. In the mid-20th century, the agencies intensified their management, hoping they could satisfy the rising—and often conflicting—demands American citizens placed on the public lands. This intensification often worsened public lands’ ecology and increased political conflict, resulting in a series of new laws in the 1960s and 1970s. Those laws strengthened the role of science and the public in influencing agency practices while providing more opportunities for litigation. Predictably, since the late 1970s, these developments have polarized public lands’ politics. The economies, but also the identities, of many Americans remain entwined with the public lands, making political standoffs—over endangered species, oil production, privatizing land, and more—common and increasingly intractable. 
Because the public lands are national in scope but used by local people for all manner of economic and recreational activities, they have been and remain microcosms of the federal democratic system and all its conflicted nature.
Nicholas J. Cull
Public opinion has been part of US foreign relations in two key ways. As one would expect in a democracy, the American public has shaped the foreign policy of its government. No less significantly, the United States has sought to influence foreign public opinion as a tool of its diplomacy, now known as public diplomacy. The US public has also been a target of foreign attempts at influence with varying degrees of success. While analysis across the span of US history reveals a continuity of issues and approaches, issues of public opinion gained unprecedented salience in the second decade of the 21st century. This salience was not matched by scholarship.
Joseph E. Hower
Government employees are an essential part of the early-21st-century labor movement in the United States. Teachers, firefighters, and police officers are among the most heavily unionized occupations in America, but public-sector union members also include street cleaners and nurses, janitors and librarians, zookeepers and engineers. Despite cultural stereotypes that continue to associate unions with steel or auto workers, public employees are five times more likely to be members of unions than workers in private industry. Today, nearly half of all union members work for federal, state, or local governments.
It was not always so. Despite a long, rich history of workplace and ballot box activism, government workers were marginal to the broader labor movement until the second half of the 20th century. Excluded from the legal breakthroughs that reshaped American industry in the 1930s, government workers lacked the basic organizing and bargaining rights extended to their private-sector counterparts. A complicated, and sometimes convoluted, combination of discourse and doctrine held that government employees were, as union leader Jerry Wurf later put it, a “servant to a master” rather than “a worker with a boss.” Inspired by the material success of workers in mass industry and moved by the moral clarity of the Black Freedom struggle, government workers demanded an end to their second-class status through one of the most consequential, and least recognized, social movements of the late 20th century. Yet their success at improving the pay, benefits, and conditions of government work also increased the cost of government services, imposing new obligations at a time of dramatic change in the global economy. In the resulting crunch, unionized public workers came under political pressure, particularly from fiscal conservatives who charged that their bargaining rights and political power were incompatible with a new age of austerity and limits.
Jessica Ellen Sewell
From 1800 to 2000, cities grew enormously and saw an expansion of public spaces to serve the varied needs of a diverse population living in ever more cramped and urban circumstances. While a wide range of commercial semipublic spaces became common in the late 19th century, parks and streets were the best examples of truly public spaces with full freedom of access. Changes in the design and management of streets, sidewalks, squares, parks, and plazas during this period reflect changing ideas about the purpose of public space and how it should be used.
Streets shifted from being used for a wide range of activities, including vending, playing games, and storing goods, to becoming increasingly specialized spaces of movement, designed and managed by the early 20th century for automobile traffic. Sidewalks, which in the early 19th century were paid for and liberally used by adjacent businesses, were similarly specialized as spaces of pedestrian movement. However, the tradition of using streets and sidewalks as a space of public celebration and public speech remained strong throughout the period. During parades and protests, streets and sidewalks were temporarily remade as spaces of the performance of the public, and the daily activities of circulation and commerce were set aside.
In 1800, the main open public spaces in cities were public squares or commons, often used for militia training and public celebration. In the second half of the 19th century, these were augmented by large picturesque parks. Designed as an antidote to urbanity, these parks served the public as a place for leisure, redefining public space as a polite leisure amenity, rather than a place for people to congregate as a public. The addition of playgrounds, recreational spaces, and public plazas in the 20th century served both the physical and mental health of the public. In the late 20th century, responding to neoliberal ideas and urban fiscal crises, the ownership and management of public parks and plazas was increasingly privatized, further challenging public accessibility.
Puerto Rican migrants have resided in the United States since before the Spanish-Cuban-American War of 1898, when the United States took possession of the island of Puerto Rico as part of the Treaty of Paris. After the war, groups of Puerto Ricans began migrating to the United States as contract laborers, first to sugarcane plantations in Hawaii, and then to other destinations on the mainland. After the Jones Act of 1917 extended U.S. citizenship to islanders, Puerto Ricans migrated to the United States in larger numbers, establishing their largest base in New York City. Over the course of the 1920s and 1930s, a vibrant and heterogeneous colonia developed there, and Puerto Ricans participated actively both in local politics and in the increasingly contentious politics of their homeland, whose status was indeterminate until it became a commonwealth in 1952. The Puerto Rican community in New York changed dramatically after World War II, accommodating up to fifty thousand new migrants per year during the peak of the “great migration” from the island. Newcomers faced intense discrimination and marginalization in this era, defined by both a Cold War ethos and liberal social scientists’ interest in the “Puerto Rican problem.”
Puerto Rican migrant communities in the 1950s and 1960s—now rapidly expanding into the Midwest, especially Chicago, and into New Jersey, Connecticut, and Philadelphia—struggled with inadequate housing and discrimination in the job market. In local schools, Puerto Rican children often faced a lack of accommodation of their need for English language instruction. Most catastrophic for Puerto Rican communities, on the East Coast particularly, was the deindustrialization of the labor market over the course of the 1960s. By the late 1960s, in response to these conditions and spurred by the civil rights, Black Power, and other social movements, young Puerto Ricans began organizing and protesting in large numbers. Their activism combined a radical approach to community organizing with Puerto Rican nationalism and international anti-imperialism. The youth were not the only activists in this era. Parents in New York had initiated, together with their African American neighbors, a “community control” movement that spanned the late 1960s and early 1970s; and many other adult activists pushed the politics of the urban social service sector—the primary institutions in many impoverished Puerto Rican communities—further to the left.
By the mid-1970s, urban fiscal crises and the rising conservative backlash in national politics dealt another blow to many Puerto Rican communities in the United States. The Puerto Rican population as a whole was now widely considered part of a national “underclass,” and much of the political energy of Puerto Rican leaders focused on addressing the paucity of both basic material stability and social equality in their communities. Since the 1980s, however, Puerto Ricans have achieved some economic gains, and a growing college-educated middle class has managed to gain more control over the cultural representations of their communities. More recently, the political salience of Puerto Ricans as a group has begun to shift. For the better part of the 20th century, Puerto Ricans in the United States were considered numerically insignificant or politically impotent (or both); but in the last two presidential elections (2008 and 2012), their growing populations in the South, especially in Florida, have drawn attention to their demographic significance and their political sensibilities.
The Puritans were a group of people loosely defined through their shared adherence to the reformed theological tradition, largely following the work of John Calvin. Beginning in the 16th century, the Puritan movement took root in specific regional locales throughout Germany, Scotland, the Low Countries, and England. Following Queen Elizabeth’s settlement of 1559, which mandated conformity with the Church of England, the church’s authority splintered further as Protestants clashed with the episcopal polity, or church hierarchy. Religious conflict intensified from the 1580s through the end of James I’s reign, as reformers made repeated appeals to antiquity and patristics (writings from early Christian fathers) in their pleas for further reform. Religious tension and persecution under the repressive regime of Archbishop Laud caused Puritans to leave England in search of new lands and communities.
When the Pilgrims and Puritans migrated to North America in 1620 and 1630, respectively, they did so with the intention of contesting the power of the crown to mandate religious uniformity. They believed in a Calvinist-based religion that espoused a separation of church and state, but that also privileged the spiritual authority of the individual to such a degree as to leave no clear signposts about how the disparate individuals practicing these faiths should form communities. Puritan congregations in New England allowed laymen as well as women new forms of spiritual self-discovery as they orally translated the evidence of grace recorded upon their souls into communal knowledge and a corporate identity that fashioned itself as a spiritual beacon to the world. Missionary encounters soon redefined Puritan faith, theology, and pious practices. Puritan identity in 17th-century North America reconstituted itself through a particular confluence of interaction with foreign landscapes, native tribes, Africans, and new models of community and social interaction.
“Twenty and odd” Africans arrived in Virginia aboard a Dutch vessel in 1619 shortly after permanent colonization of the English Americas began. There has been significant academic debate about whether the enslavement of peoples of African descent in England’s early 17th-century colonies was an inevitable or “unthinking decision” and about the nature and degree of anti-black racism during the 17th century. The legal and social status of African peoples was more flexible at first in the English colonies than it later became. Some Africans managed to escape permanent enslavement, and a few Africans, such as Anthony Johnson, even owned servants of their own. There was no legal basis for enslavement in the British Americas for the first several decades of settlement, and slave and servant codes emerged only gradually. Labor systems operated by custom rather than through any legal mechanisms of coercion. Most workers in the Americas experienced degrees of coercion. In the earliest years of plantation production, peoples from Africa, Europe, and the Americas often toiled alongside each other in the fields. Large numbers of Native Americans were captured and forced to work on plantations in the English Americas, and many whites worked in agricultural fields as indentured and convict laborers. There were a wide variety of different kinds of coerced labor beyond enslavement in the 17th century, and ideas about racial difference had yet to become as determinative as they would later be. As the staple crop plantation system matured and became entrenched on the North American mainland in the late 17th and early 18th centuries and planters required a large and regular supply of slaves, African laborers became synonymous with large-scale plantation production.
The permeable boundaries between slavery and freedom disappeared, dehumanizing racism became more entrenched, and U.S.-based planters developed slave codes premised on racial distinctions and legal mechanisms of coercion that were modeled on Caribbean precedents.
Courtney Q. Shah
A concerted movement to promote sex education in America emerged in the early 20th century as part of a larger public health movement that also responded to the previous century’s concerns about venereal disease, prostitution, “seduction,” and “white slavery.” Sex education, therefore, offered a way to protect people (especially privileged women) from sexual activity of all kinds—consensual and coerced. Widespread introduction of sex education into public schools did not occur until after World War I. Sex education programs in schools tended to focus on training for heterosexual marriage at a time when high school attendance spiked in urban and suburban areas. Teachers often segregated male and female students.
Beyond teaching boys about male anatomy and girls about female anatomy, reformers and educators often conveyed different messages and used different materials, depending on the race of their students. Erratic desegregation efforts during the Civil Rights movement renewed a crisis in sex education programs. Parents and administrators considered sexuality education even more dangerous in the context of a racially integrated classroom. The backlash against sex education in the schools kept pace with the backlash against integration, with each often used to bolster the other. Opponents of integration and sex education, for example, often used racial language to scare parents about what kids were learning, and with whom.
In the 1980s and 1990s, the political power of the evangelical movement in the United States attracted support for “abstinence-only” curricula that relied on scare tactics and traditional assumptions about gender and sexuality. The ever-expanding acceptance (both legal and social) of lesbian, gay, bisexual, or transgender identity directly challenged the conservative turn of abstinence-until-marriage sex education programs. The politics of gender, race, class, and sexual orientation have consistently shaped and limited sex education.
Susanah Shaw Romney
On the mid-Atlantic coast between 1624 and 1664, the Dutch developed a successful and expansive colony, one that depended on particular interactions among women and men from American, European, and African backgrounds. Unlike some other colonial efforts, such as Jamestown, New Netherland had white women colonists from its inception. In contrast to Plymouth and other English settler colonies, a population of African men and women did the crucial work of establishing the colony’s initial infrastructure in its first years. What is more, a thriving cross-cultural trade between Netherlanders and Munsee, Mahican, and Mohawk residents of the region nurtured the development of the infant colony. Looking at the colony’s establishment and growth reveals that complex interactions among ethnically distinct families gave New Netherland its particular form and character. As European and African populations took root, many households engaged in the frontier trading economy, creating a web of connections reaching into multiple indigenous villages. Women and men cooperated to sustain this trade over long distances by relying on marriage and the economic unit of the household to organize production and exchange. In addition, the colonial government used these households to stake claims to the ground and to define Dutch jurisdiction, just as it recognized that residence by Indian or English households determined where Dutch power ended. Thus ethnic and gender relations shaped not only the colony’s internal hierarchies, but also its economy and its very boundaries.
Despite its cultivated reputation as the nation’s “white spot” in the early 20th century, Southern California was in fact home to diverse and numerous communities of color, some composed of relatively new immigrants and some long predating the era of Anglo settlement and conquest. In the years following World War II, the region engaged in suburban home construction on a mass scale and became a global symbol of what Dolores Hayden called the economically democratic but racially exclusive “sitcom suburb,” from the tax-lowering mechanism of its “Lakewood plan” to the car-friendly “Googie” architecture of the San Fernando Valley. Existing suburban communities of color, such as the colonias of agricultural laborers, were engulfed by new settlements, while upwardly mobile African Americans, Latinas/Latinos, and Asian Americans sought access to the expanding suburban dream of homeownership, with varying degrees of success. The political responses to suburban diversity in metropolitan Los Angeles ranged from Anglo resistance and flight to multiracial political coalitions and the incorporation of people of color at multiple levels of local government. The ascent by a number of suburbanites of color to positions of local and regional political power from the 1960s through the 1980s sometimes exposed intra-ethnic discord and sometimes the fragility of cross-race coalition as multiple actors sought to protect property values and to pursue economic security within the competitive constraints of shrinking municipal resources, aging infrastructure, and a receding suburban fringe. As a result, political conflicts over crime, immigration, education, and inequality emerged in many Los Angeles County suburbs by the 1970s and later in the more distant corporate suburbs of Orange, Ventura, Riverside, and San Bernardino Counties. 
The suburbanization of poverty, the role of suburbs as immigrant gateways, and the emergence of “majority-minority” suburbs—all national trends by the late 1990s and the first decade of the 21st century—were evident far earlier in the Los Angeles metropolitan region, where diverse suburbanites negotiated social and economic crises and innovated political responses.
Radicalism in the United States since 1945 has been varied, complex, and often fragmented, making it difficult to analyze as a coherent movement. Communist and pro-Soviet organizations remained active after World War II, but a proliferation of noncommunist groups in the 1940s and 1950s, formed by those disillusioned by Marxist theory or the Soviet Union, began to chart a new course for the American Left. Eschewing much of the previous focus on labor, the proletariat, and Marxist doctrine, American postwar radical organizations realigned around humanist values, moral action, democracy, and even religion, with tenuous connections to Marxism, if any. The parameters of postwar radical moral theory were not always clearly defined, and questions of strategy and vision caused frequent divisions among activists. Nonetheless, claims of individual dignity and freedom continued to frame left radicalism into the late 20th century, emphasizing identity politics, community-building initiatives, and cultural expression in the streets of U.S. cities and the halls of academia. The presidential campaign of Bernie Sanders in 2016 helped revitalize leftist rhetoric on the national stage with its calls for racial and economic equality on moral terms.
Since the early 1800s railroads have served as a critical element of the transportation infrastructure in the United States and have generated profound changes in technology, finance, business-government relations, and labor policy. By the 1850s railroads, at least in the northern states, had evolved into the nation’s first big businesses, replete with managerial hierarchies that in many respects resembled the structure of the US Army. After the Civil War ended, the railroad network grew rapidly, with lines extending into the Midwest and ultimately, with the completion of the first transcontinental railroad in 1869, to the Pacific Coast. The last third of the 19th century was characterized by increased militancy among railroad workers, as well as by the growing danger that railroading posed to employees and passengers. Intense competition among railroad companies led to rate wars and discriminatory pricing. The presence of rebates and long-haul/short-haul price differentials led to the federal regulation of the railroads in 1887. The Progressive Era generated additional regulation that reduced profitability and discouraged additional investment in the railroads. As a result, the carriers were often unprepared for the traffic demands associated with World War I, leading to government operation of the railroads between 1917 and 1920. Highway competition during the 1920s and the economic crises of the 1930s provided further challenges for the railroads. The nation’s railroads performed well during World War II but declined steadily in the years that followed. High labor costs, excessive regulatory oversight, and the loss of freight and passenger traffic to cars, trucks, and airplanes ensured that by the 1960s many once-profitable companies were on the verge of bankruptcy. A wave of mergers failed to halt the downward slide. The bankruptcy of Penn Central in 1970 increased public awareness of the dire circumstances and led to calls for regulatory reform. 
The 1980 Staggers Act abolished most of the restrictions on operations and pricing, thus revitalizing the railroads.
Rap, the musical practice of hip hop culture in which vocalists, or MCs, recite lyrics over an instrumental beat, emerged out of the political and economic transformations of New York City after the 1960s. Black and Latinx youth, many of them Caribbean immigrants, created this new cultural form in response to racism, poverty, urban renewal, deindustrialization, and inner-city violence. These new cultural forms eventually spread beyond New York to all regions of the United States as artists from Los Angeles, New Orleans, Miami, and Chicago began releasing rap music with their own distinct sounds. Despite efforts to demonize and censor rap music and hip hop culture, rap music has served as a pathway for social mobility for many black and Latinx youth. Many artists have enjoyed crossover success in acting, advertising, and business. Rap music has also sparked new conversations about various issues such as electoral politics, gender and sexuality, crime, policing, and mass incarceration, as well as technology.
From the founding of the American republic through the 19th century, the nation’s environmental policy mostly centered on promoting American settlers’ conquest of the frontier. Early federal interventions, whether railroad and canal subsidies or land grant acts, led to rapid transformations of the natural environment that inspired a conservation movement by the end of the 19th century. Led by activists and policymakers, this movement sought to protect America’s resources now jeopardized by expansive industrial infrastructure. During the Gilded Age, the federal government established the world’s first national parks, and in the Progressive Era, politicians such as President Theodore Roosevelt called for the federal government to play a central role in ensuring the efficient utilization of the nation’s ecological bounty. By the early 1900s, conservationists established new government agencies, such as the U.S. Forest Service and the Bureau of Reclamation, to regulate the consumption of trees, water, and other valuable natural assets. Wise-use was the watchword of the day, with environmental managers in DC’s bureaucracy focused mainly on protecting the economic value latent in America’s ecosystems. However, other groups, such as the Wilderness Society, proved successful at redirecting policy prescriptions toward preserving beautiful and wild spaces, not just conserving resources central to capitalist enterprise. In the 1960s and 1970s, suburban and urban environmental activists attracted federal regulators’ attention to contaminated soil and water under their feet. The era of ecology had arrived, and the federal government now had broad powers through the Environmental Protection Agency (EPA) to manage ecosystems that stretched across the continent. But from the 1980s to the 2010s, the federal government’s authority to regulate the environment waxed and waned as economic crises, often exacerbated by oil shortages, brought environmental agencies under fire. 
The Rooseveltian logic of the Progressive Era, which said that America’s economic growth depended on federal oversight of the environment, came under assault from neoliberal disciples of Ronald Reagan, who argued that environmental regulations were in fact the root cause of economic stagnation in America, not a powerful prescription against it. What the country needed, according to the reformers of the New Right, was unregulated expansion into new frontiers. By the 2010s, the contours of these new frontiers were clear: deep-water oil drilling, Bakken shale exploration, and tar-sand excavation in Alberta, Canada. In many ways, the frontier conquest doctrine of colonial Americans found new life in deregulatory U.S. environmental policy pitched by conservatives in the wake of the Reagan Revolution. Never wholly dominant, this ethos carried on into the era of Donald Trump’s presidency.
America’s tremendous diversity of faith, region, and ethnicity complicates efforts to generalize about relationships between religious groups and the labor movement. Americans’ historic and widely shared commitment to Christianity masks deep divisions: between white Christians and black Christians, between Catholics and Protestants, between northern Protestants and southern Protestants, and between “modernist” Protestants (who view the Bible in metaphorical terms as a source of ethical guidance and emphasize social justice) and “fundamentalist” Protestants (who read the Bible literally and eschew social activism in favor of individual evangelizing). Work, class, and the role of the labor movement add extra dimensions to these complexities, which are multiplied when considering non-Christian traditions such as Judaism or the other world religious communities that have grown in the United States since the immigration reforms of 1965.
Nevertheless, scholars accept a general narrative that delineates key periods, themes, and players over the course of the twentieth century. From the turn of the 20th century until the 1930s, the relationship between religion and labor was shaped by the centrality of the American Federation of Labor (AFL) in the labor movement, the development of a “social gospel” among northern mainline Protestants, and the massive immigration from southern and eastern Europe that brought millions of Catholic and Jewish workers into the United States before it largely ended in the 1920s. These developments were sometimes in tension. The AFL favored craft unionism and placed a premium on organizing skilled male workers; it therefore left out many of the unskilled new arrivals (as well as African Americans and most women). Consequently, the shape of “religion and labor” formed primarily around the dynamic between the AFL and Protestant social reformers, without much regard to the large masses of unorganized Catholic, Jewish, and African American workers.
These dynamics shifted in the Great Depression. The Congress of Industrial Organizations (CIO), begun as a committee within the AFL in 1935, sought to organize entire industries, skilled and unskilled workers alike, and ethnic Catholics and Jews became unionized in large numbers. Even traditional racial barriers in the labor movement began crumbling in some industries. And the labor movement expanded its geographical ambition, pushing aggressively into the South. In turn, the religious voices associated with the labor movement broadened and deepened. Labor’s new alliances with Catholics, Jews, African Americans, and southern evangelicals helped push the ranks of organized workers to historic highs in the 1950s.
This coalition has faced divisive, even disastrous headwinds since the 1960s. The strength of anticommunism, especially within religious groups, caused some religious workers to retreat from the reformist ambitions of the labor movement and sparked a conservative religious movement deeply opposed to labor and liberalism. Race became an ever-hotter flashpoint. Although religiously affiliated civil rights reformers often forged alliances with unions, the backlash and resistance to civil rights among portions of the white working class undermined the efficacy of labor unions as sources of social cohesion. Perhaps most profoundly, the economy as a whole transformed from an urban-industrial to a post-urban service model. Organized labor has floundered in the wake of these changes, and the concomitant resurgence of a traditionalist, individualistic, and therapeutic religious culture has offered the remains of the labor movement little to partner with.
Emily Suzanne Clark
Religion and race provide rich categories of analysis for American history. Neither category is stable. They change, shift, and develop in light of historical and cultural contexts. Religion has played a vital role in the construction, deconstruction, and transgression of racial identities and boundaries.
Race is a social concept and a means of classifying people. The “natural” and “inherent” differences between races are human constructs, social taxonomies created by cultures. In American history, the construction of racial identities and racial differences began with the initial encounters between Europeans, Native Americans, and Africans. Access to and use of religious and political power has shaped how race has been conceived in American history. Racial categories and religious affiliations influenced how groups regarded each other throughout American history, with developments in the colonial period offering prime examples. The enslavement of Africans and their descendants, as well as of conquered Native Americans, displayed the power of white Protestants. Even 19th-century American anti-Catholicism and anti-Mormonism intersected with racial identifications. At the same time, just as religion has supported racial domination in American history, it also has inspired calls for self-determination among racial minorities, most notably in the 20th century.
With the long shadow of slavery, the power of white supremacy, the emphasis on Native sovereignty, and the civil rights movement, much of the story of religion and race in American history focuses on Americans white, black, and Native. However, this is not the whole story. Mexican Americans and Latinx immigrants bring Catholic and transnational connections, but their presence has prompted xenophobia. Additionally, white Americans sought to restrict the arrival of Asian immigrants both legally and culturally. With the passage of the Immigration and Nationality Act of 1965, the religious, racial, and ethnic diversity of the United States increased further. This religious and racial pluralism in many ways reflects the diversity of America, as does the conflict that comes with it.
The Great Depression of 1929–1941 not only brought economic and social crisis but also forced families, churches, and religious organizations to reckon with individual and social suffering in ways that they had not done in the United States since the Civil War. This reckoning introduced a period of both theological and institutional transformation. Theologians wrestled not only with the domestic depression but also with international instability as they faced questions about pacifism, economic and racial justice, and religious persecution. Ordinary people prayed for rain and revival. Many turned to their religious communities to wrestle together with the troubles they faced, or turned from those communities in disappointment and despair.
During the decades before the Great Depression, religious institutions across the United States had expanded their charitable efforts and their social reform campaigns, but the Depression wiped out the support for that work just as Americans needed it most. The New Deal brought a new set of questions about the relative roles of church and state in welfare and reform and introduced a period of religious ferment and church–state realignment. At the same time, the discontent and dislocation that the Great Depression wrought on local communities meant that individuals, families, and communities wrestled with deep theological questions together, often in ways that fractured old religious alliances and forged new ones. For American Jews and some Catholics, events in Europe proved even more troubling than those at home, and local communities reorganized around international activism and engagement.