Jessica Ellen Sewell
From 1800 to 2000, cities grew enormously and saw an expansion of public spaces to serve the varied needs of a diverse population living in increasingly crowded urban circumstances. While a wide range of commercial semipublic spaces became common in the late 19th century, parks and streets were the best examples of truly public spaces with full freedom of access. Changes in the design and management of streets, sidewalks, squares, parks, and plazas during this period reflect changing ideas about the purpose of public space and how it should be used.
Streets shifted from hosting a wide range of activities, including vending, playing games, and storing goods, to becoming increasingly specialized spaces of movement, designed and managed by the early 20th century for automobile traffic. Sidewalks, which in the early 19th century were paid for and liberally used by adjacent businesses, were similarly specialized as spaces of pedestrian movement. However, the tradition of using streets and sidewalks as spaces of public celebration and public speech remained strong throughout the period. During parades and protests, streets and sidewalks were temporarily remade as stages for the performance of the public, and the daily activities of circulation and commerce were set aside.
In 1800, the main open public spaces in cities were public squares or commons, often used for militia training and public celebration. In the second half of the 19th century, these were augmented by large picturesque parks. Designed as an antidote to urbanity, these parks served the public as a place for leisure, redefining public space as a polite leisure amenity, rather than a place for people to congregate as a public. The addition of playgrounds, recreational spaces, and public plazas in the 20th century served both the physical and mental health of the public. In the late 20th century, responding to neoliberal ideas and urban fiscal crises, the ownership and management of public parks and plazas were increasingly privatized, further challenging public accessibility.
Puerto Rican migrants have resided in the United States since before the Spanish-Cuban-American War of 1898, when the United States took possession of the island of Puerto Rico as part of the Treaty of Paris. After the war, groups of Puerto Ricans began migrating to the United States as contract laborers, first to sugarcane plantations in Hawaii, and then to other destinations on the mainland. After the Jones Act of 1917 extended U.S. citizenship to islanders, Puerto Ricans migrated to the United States in larger numbers, establishing their largest base in New York City. Over the course of the 1920s and 1930s, a vibrant and heterogeneous colonia developed there, and Puerto Ricans participated actively both in local politics and in the increasingly contentious politics of their homeland, whose status was indeterminate until it became a commonwealth in 1952. The Puerto Rican community in New York changed dramatically after World War II, accommodating up to fifty thousand new migrants per year during the peak of the “great migration” from the island. Newcomers faced intense discrimination and marginalization in this era, defined by both a Cold War ethos and liberal social scientists’ interest in the “Puerto Rican problem.”
Puerto Rican migrant communities in the 1950s and 1960s—now rapidly expanding into the Midwest, especially Chicago, and into New Jersey, Connecticut, and Philadelphia—struggled with inadequate housing and discrimination in the job market. In local schools, Puerto Rican children often faced a lack of accommodation of their need for English language instruction. Most catastrophic for Puerto Rican communities, on the East Coast particularly, was the deindustrialization of the labor market over the course of the 1960s. By the late 1960s, in response to these conditions and spurred by the civil rights, Black Power, and other social movements, young Puerto Ricans began organizing and protesting in large numbers. Their activism combined a radical approach to community organizing with Puerto Rican nationalism and international anti-imperialism. The youth were not the only activists in this era. Parents in New York had initiated, together with their African American neighbors, a “community control” movement that spanned the late 1960s and early 1970s; and many other adult activists pushed the politics of the urban social service sector—the primary institutions in many impoverished Puerto Rican communities—further to the left.
By the mid-1970s, urban fiscal crises and the rising conservative backlash in national politics dealt another blow to many Puerto Rican communities in the United States. The Puerto Rican population as a whole was now widely considered part of a national “underclass,” and much of the political energy of Puerto Rican leaders focused on addressing the paucity of both basic material stability and social equality in their communities. Since the 1980s, however, Puerto Ricans have achieved some economic gains, and a growing college-educated middle class has managed to gain more control over the cultural representations of their communities. More recently, the political salience of Puerto Ricans as a group has begun to shift. For the better part of the 20th century, Puerto Ricans in the United States were considered numerically insignificant or politically impotent (or both); but in the last two presidential elections (2008 and 2012), their growing populations in the South, especially in Florida, have drawn attention to their demographic significance and their political sensibilities.
The Puritans were a group of people loosely defined through their shared adherence to the reformed theological tradition, largely following the work of John Calvin. Beginning in the 16th century, the Puritan movement took root in specific regional locales throughout Germany, Scotland, the Low Countries, and England. Following Queen Elizabeth’s settlement of 1559, which mandated conformity with the Church of England, the church’s authority splintered further as Protestants clashed with the episcopal polity, or church hierarchy. Religious conflict intensified from the 1580s through the end of James I’s reign, as reformers repeatedly appealed to antiquity and patristics (the writings of the early Church Fathers) in pleas for further reform. Religious tension and persecution under the repressive regime of Archbishop Laud caused Puritans to leave England in search of new lands and communities.
When the Pilgrims and Puritans migrated to North America in 1620 and 1630, respectively, they did so with the intention of contesting the power of the crown to mandate religious uniformity. They believed in a Calvinist-based religion that espoused a separation of church and state, but that also privileged the spiritual authority of the individual to such a degree as to leave no clear signposts about how the disparate individuals practicing these faiths should form communities. Puritan congregations in New England allowed laymen as well as women new forms of spiritual self-discovery as they orally translated the evidence of grace recorded upon their souls into communal knowledge and a corporate identity that fashioned itself as a spiritual beacon to the world. Missionary encounters soon redefined Puritan faith, theology, and pious practices. Puritan identity in 17th-century North America reconstituted itself through a particular confluence of interaction with foreign landscapes, native tribes, Africans, and new models of community and social interaction.
“Twenty and odd” Africans arrived in Virginia aboard a Dutch vessel in 1619 shortly after permanent colonization of the English Americas began. There has been significant academic debate about whether the enslavement of peoples of African descent in England’s early 17th-century colonies was an inevitable or “unthinking decision” and about the nature and degree of anti-black racism during the 17th century. The legal and social status of African peoples was more flexible at first in the English colonies than it later became. Some Africans managed to escape permanent enslavement and a few Africans, such as Anthony Johnson, even owned servants of their own. There was no legal basis for enslavement in the British Americas for the first several decades of settlement and slave and servant codes emerged only gradually. Labor systems operated by custom rather than through any legal mechanisms of coercion. Most workers in the Americas experienced degrees of coercion. In the earliest years of plantation production, peoples from Africa, Europe, and the Americas often toiled alongside each other in the fields. Large numbers of Native Americans were captured and forced to work on plantations in the English Americas and many whites worked in agricultural fields as indentured and convict laborers. There were a wide variety of different kinds of coerced labor beyond enslavement in the 17th century and ideas about racial difference had yet to become as determinative as they would later be. As the staple crop plantation system matured and became entrenched on the North American mainland in the late 17th and early 18th centuries and planters required a large and regular supply of slaves, African laborers became synonymous with large-scale plantation production. 
The permeable boundaries between slavery and freedom disappeared, dehumanizing racism became more entrenched, and U.S.-based planters developed slave codes premised on racial distinctions and legal mechanisms of coercion modeled on Caribbean precedents.
Courtney Q. Shah
A concerted movement to promote sex education in America emerged in the early 20th century as part of a larger public health movement that also responded to the previous century’s concerns about venereal disease, prostitution, “seduction,” and “white slavery.” Sex education, therefore, offered a way to protect people (especially privileged women) from sexual activity of all kinds—consensual and coerced. Widespread introduction of sex education into public schools did not occur until after World War I. Sex education programs in schools tended to focus on training for heterosexual marriage at a time when high school attendance spiked in urban and suburban areas. Teachers often segregated male and female students.
Beyond teaching boys about male anatomy and girls about female anatomy, reformers and educators often conveyed different messages and used different materials, depending on the race of their students. Erratic desegregation efforts during the Civil Rights movement renewed a crisis in sex education programs. Parents and administrators considered sexuality education even more dangerous in the context of a racially integrated classroom. The backlash against sex education in the schools kept pace with the backlash against integration, with each often used to bolster the other. Opponents of integration and sex education, for example, often used racial language to scare parents about what kids were learning, and with whom.
In the 1980s and 1990s, the political power of the evangelical movement in the United States attracted support for “abstinence-only” curricula that relied on scare tactics and traditional assumptions about gender and sexuality. The ever-expanding acceptance (both legal and social) of lesbian, gay, bisexual, or transgender identity directly challenged the conservative turn of abstinence-until-marriage sex education programs. The politics of gender, race, class, and sexual orientation have consistently shaped and limited sex education.
Susanah Shaw Romney
On the mid-Atlantic coast between 1624 and 1664, the Dutch developed a successful and expansive colony, one that depended on particular interactions among women and men from American, European, and African backgrounds. Unlike some other colonial efforts, such as Jamestown, New Netherland had white women colonists from its inception. In contrast to Plymouth and other English settler colonies, a population of African men and women did the crucial work of establishing the colony’s initial infrastructure in its first years. What is more, a thriving cross-cultural trade between Netherlanders and Munsee, Mahican, and Mohawk residents of the region nurtured the development of the infant colony. Looking at the colony’s establishment and growth reveals that complex interactions among ethnically distinct families gave New Netherland its particular form and character. As European and African populations took root, many households engaged in the frontier trading economy, creating a web of connections reaching into multiple indigenous villages. Women and men cooperated to sustain this trade over long distances by relying on marriage and the economic unit of the household to organize production and exchange. In addition, the colonial government used these households to stake claims to the ground and to define Dutch jurisdiction, just as they recognized that residence by Indian or English households determined where Dutch power ended. Thus ethnic and gender relations shaped not only the colony’s internal hierarchies, but also its economy and its very boundaries.
Despite its cultivated reputation as the nation’s “white spot” in the early 20th century, Southern California was in fact home to diverse and numerous communities of color, some composed of relatively new immigrants and some long predating the era of Anglo settlement and conquest. In the years following World War II, the region engaged in suburban home construction on a mass scale and became a global symbol of what Dolores Hayden called the economically democratic but racially exclusive “sitcom suburb,” from the tax-lowering mechanism of its “Lakewood plan” to the car-friendly “Googie” architecture of the San Fernando Valley. Existing suburban communities of color, such as the colonias of agricultural laborers, were engulfed by new settlements, while upwardly mobile African Americans, Latinas/Latinos, and Asian Americans sought access to the expanding suburban dream of homeownership, with varying degrees of success. The political responses to suburban diversity in metropolitan Los Angeles ranged from Anglo resistance and flight to multiracial political coalitions and the incorporation of people of color at multiple levels of local government. The ascent by a number of suburbanites of color to positions of local and regional political power from the 1960s through the 1980s sometimes exposed intra-ethnic discord and sometimes the fragility of cross-race coalition as multiple actors sought to protect property values and to pursue economic security within the competitive constraints of shrinking municipal resources, aging infrastructure, and a receding suburban fringe. As a result, political conflicts over crime, immigration, education, and inequality emerged in many Los Angeles County suburbs by the 1970s and later in the more distant corporate suburbs of Orange, Ventura, Riverside, and San Bernardino Counties. 
The suburbanization of poverty, the role of suburbs as immigrant gateways, and the emergence of “majority-minority” suburbs—all national trends by the late 1990s and the first decade of the 21st century—were evident far earlier in the Los Angeles metropolitan region, where diverse suburbanites negotiated social and economic crises and innovated political responses.
Radicalism in the United States since 1945 has been varied, complex, and often fragmented, making it difficult to analyze as a coherent movement. Communist and pro-Soviet organizations remained active after World War II, but a proliferation of noncommunist groups in the 1940s and 1950s, formed by those disillusioned by Marxist theory or the Soviet Union, began to chart a new course for the American Left. Eschewing much of the previous focus on labor, the proletariat, and Marxist doctrine, American postwar radical organizations realigned around humanist values, moral action, democracy, and even religion, with tenuous connections to Marxism, if any. The parameters of postwar radical moral theory were not always clearly defined, and questions of strategy and vision caused frequent divisions among activists. Nonetheless, claims of individual dignity and freedom continued to frame left radicalism into the late 20th century, emphasizing identity politics, community-building initiatives, and cultural expression in the streets of U.S. cities and the halls of academia. The presidential campaign of Bernie Sanders in 2016 helped revitalize leftist rhetoric on the national stage with its calls for racial and economic equality on moral terms.
Since the early 1800s railroads have served as a critical element of the transportation infrastructure in the United States and have generated profound changes in technology, finance, business-government relations, and labor policy. By the 1850s railroads, at least in the northern states, had evolved into the nation’s first big businesses, replete with managerial hierarchies that in many respects resembled the structure of the US Army. After the Civil War ended, the railroad network grew rapidly, with lines extending into the Midwest and ultimately, with the completion of the first transcontinental railroad in 1869, to the Pacific Coast. The last third of the 19th century was characterized by increased militancy among railroad workers, as well as by the growing danger that railroading posed to employees and passengers. Intense competition among railroad companies led to rate wars and discriminatory pricing. The presence of rebates and long-haul/short-haul price differentials led to the federal regulation of the railroads in 1887. The Progressive Era generated additional regulation that reduced profitability and discouraged additional investment in the railroads. As a result, the carriers were often unprepared for the traffic demands associated with World War I, leading to government operation of the railroads between 1917 and 1920. Highway competition during the 1920s and the economic crises of the 1930s provided further challenges for the railroads. The nation’s railroads performed well during World War II but declined steadily in the years that followed. High labor costs, excessive regulatory oversight, and the loss of freight and passenger traffic to cars, trucks, and airplanes ensured that by the 1960s many once-profitable companies were on the verge of bankruptcy. A wave of mergers failed to halt the downward slide. The bankruptcy of Penn Central in 1970 increased public awareness of the dire circumstances and led to calls for regulatory reform. 
The 1980 Staggers Act abolished most of the restrictions on operations and pricing, thus revitalizing the railroads.
Rap, the musical practice of hip hop culture in which vocalists, or MCs, recite lyrics over an instrumental beat, emerged out of the political and economic transformations of New York City after the 1960s. Black and Latinx youth, many of them Caribbean immigrants, created this new cultural form in response to racism, poverty, urban renewal, deindustrialization, and inner-city violence. These new cultural forms eventually spread beyond New York to all regions of the United States as artists from Los Angeles, New Orleans, Miami, and Chicago began releasing rap music with their own distinct sounds. Despite efforts to demonize and censor rap music and hip hop culture, rap music has served as a pathway for social mobility for many black and Latinx youth. Many artists have enjoyed crossover success in acting, advertising, and business. Rap music has also sparked new conversations about various issues such as electoral politics, gender and sexuality, crime, policing, and mass incarceration, as well as technology.
From the founding of the American republic through the 19th century, the nation’s environmental policy mostly centered on promoting American settlers’ conquest of the frontier. Early federal interventions, whether railroad and canal subsidies or land grant acts, led to rapid transformations of the natural environment that inspired a conservation movement by the end of the 19th century. Led by activists and policymakers, this movement sought to protect America’s resources now jeopardized by expansive industrial infrastructure. During the Gilded Age, the federal government established the world’s first national parks, and in the Progressive Era, politicians such as President Theodore Roosevelt called for the federal government to play a central role in ensuring the efficient utilization of the nation’s ecological bounty. By the early 1900s, conservationists established new government agencies, such as the U.S. Forest Service and the Bureau of Reclamation, to regulate the consumption of trees, water, and other valuable natural assets. Wise-use was the watchword of the day, with environmental managers in DC’s bureaucracy focused mainly on protecting the economic value latent in America’s ecosystems. However, other groups, such as the Wilderness Society, proved successful at redirecting policy prescriptions toward preserving beautiful and wild spaces, not just conserving resources central to capitalist enterprise. In the 1960s and 1970s, suburban and urban environmental activists attracted federal regulators’ attention to contaminated soil and water under their feet. The era of ecology had arrived, and the federal government now had broad powers through the Environmental Protection Agency (EPA) to manage ecosystems that stretched across the continent. But from the 1980s to the 2010s, the federal government’s authority to regulate the environment waxed and waned as economic crises, often exacerbated by oil shortages, brought environmental agencies under fire. 
The Rooseveltian logic of the Progressive Era, which said that America’s economic growth depended on federal oversight of the environment, came under assault from neoliberal disciples of Ronald Reagan, who argued that environmental regulations were in fact the root cause of economic stagnation in America, not a powerful prescription against it. What the country needed, according to the reformers of the New Right, was unregulated expansion into new frontiers. By the 2010s, the contours of these new frontiers were clear: deep-water oil drilling, Bakken shale exploration, and tar-sand excavation in Alberta, Canada. In many ways, the frontier conquest doctrine of colonial Americans found new life in deregulatory U.S. environmental policy pitched by conservatives in the wake of the Reagan Revolution. Never wholly dominant, this ethos carried on into the era of Donald Trump’s presidency.
America’s tremendous diversities of faith, region, and ethnicity complicate efforts to generalize relationships between religious groups and the labor movement. Americans’ historic and widely shared commitment to Christianity masks deep divisions: between white Christians and black Christians, between Catholics and Protestants, between northern Protestants and southern Protestants, and between “modernist” Protestants (who view the Bible in metaphorical terms as a source of ethical guidance and emphasize social justice) and “fundamentalist” Protestants (who view the Bible literally and eschew social activism in favor of individual evangelizing). Work, class, and the role of the labor movement add extra dimensions to these complexities, which are multiplied when considering non-Christian traditions such as Judaism or the other world religious communities that have grown in the United States since the immigration reforms of 1965.
Nevertheless, scholars accept a general narrative that delineates key periods, themes, and players over the course of the twentieth century. From the turn of the twentieth century until the 1930s, the relationship between religion and labor was shaped by the centrality of the American Federation of Labor (AFL) in the labor movement, the development of a “social gospel” among northern mainline Protestants, and the massive immigration from southern and eastern Europe that brought millions of Catholic and Jewish workers into the United States before it largely ended in the 1920s. These developments were sometimes in tension. The AFL favored craft unionism and placed a premium on organizing skilled male workers; it therefore left out many of the unskilled new arrivals (as well as African Americans and most women). Consequently, the shape of “religion and labor” formed primarily around the dynamic between the AFL and Protestant social reformers, without much regard to the large masses of unorganized Catholic, Jewish, and African American workers.
These dynamics shifted in the Great Depression. The Congress of Industrial Organizations (CIO), begun as a committee within the AFL in 1934, sought to organize entire industries, skilled and unskilled alike, and ethnic Catholics and Jews became unionized in large numbers. Even traditional racial barriers in the labor movement began crumbling in some industries. And the labor movement expanded its geographical ambition, pushing aggressively into the South. In turn, the religious voices associated with the labor movement broadened and deepened. Labor’s new alliances with Catholics, Jews, African Americans, and southern evangelicals helped to push the ranks of organized workers to historic highs in the 1950s.
This coalition has faced divisive, even disastrous headwinds since the 1960s. The strength of anticommunism, especially within religious groups, caused some religious workers to retreat from the reformist ambitions of the labor movement and sparked a conservative religious movement deeply opposed to labor and liberalism. Race became an ever-hotter flashpoint. Although religiously affiliated civil rights reformers often forged alliances with unions, the backlash and resistance to civil rights among portions of the white working class undermined the efficacy of labor unions as sources of social cohesion. Perhaps most profoundly, the economy as a whole transformed from an urban-industrial to a post-urban service model. Organized labor has floundered in the wake of these changes, and the concomitant resurgence of a traditionalist, individualistic, and therapeutic religious culture has offered the remains of the labor movement little to partner with.
Emily Suzanne Clark
Religion and race provide rich categories of analysis for American history. Neither category is stable. They change, shift, and develop in light of historical and cultural contexts. Religion has played a vital role in the construction, deconstruction, and transgression of racial identities and boundaries.
Race is a social concept and a means of classifying people. The “natural” and “inherent” differences between races are human constructs, social taxonomies created by cultures. In American history, the construction of racial identities and racial differences begins with the initial encounters between Europeans, Native Americans, and Africans. Access to and use of religious and political power has shaped how race has been conceived in American history. Racial categories and religious affiliations influenced how groups regarded each other throughout American history, with developments in the colonial period offering prime examples. Enslavement of Africans and their descendants, as well as conquered Native Americans, displayed the power of white Protestants. Even 19th-century American anti-Catholicism and anti-Mormonism intersected with racial identifications. At the same time, just as religion has supported racial domination in American history, it also has inspired calls for self-determination among racial minorities, most notably in the 20th century.
With the long shadow of slavery, the power of white supremacy, the emphasis on Native sovereignty, and the civil rights movement, much of the story of religion and race in American history focuses on Americans white, black, and red. However, this is not the whole story. Mexican-Americans and Latinx immigrants have brought Catholic and transnational connections, but their presence has prompted xenophobia. Additionally, white Americans sought to restrict the arrival of Asian immigrants both legally and culturally. With the passing of the Immigration and Nationality Act of 1965, the religious, racial, and ethnic diversity of the United States increased further. This religious and racial pluralism in many ways reflects the diversity of America, as does the conflict that comes with it.
The Great Depression of 1929–1941 not only brought economic and social crisis but also forced families, churches, and religious organizations to reckon with individual and social suffering in ways that they had not done in the United States since the Civil War. This reckoning introduced a period of both theological and institutional transformation. Theologians wrestled not only with the domestic depression, but also with international instability as they faced questions about pacifism, economic and racial justice, and religious persecution. Ordinary people prayed for rain and revival. Many turned to their religious communities to wrestle together with the troubles they faced, or turned from those communities in disappointment and despair.
During the decades before the Great Depression, religious institutions across the United States had expanded their charitable efforts and their social reform campaigns, but the Depression wiped out the support for that work just as Americans needed it most. The New Deal brought a new set of questions about the relative roles of church and state in welfare and reform and introduced a period of religious ferment and church–state realignment. At the same time, the discontent and dislocation that the Great Depression wrought on local communities meant that individuals, families, and communities wrestled with deep theological questions together, often in ways that fractured old religious alliances and forged new ones. For American Jews and some Catholics, events in Europe proved even more troubling than those at home, and local communities reorganized around international activism and engagement.
Dynamic and creative exchanges among different religions, including indigenous traditions, Protestant and Catholic Christianity, and Islam, all with developing theologies and institutions, fostered substantial collective religious and cultural identities within African American communities in the United States. The New World enslavement of diverse African peoples and the cultural encounter with Europeans and Native Americans produced distinctive religious perspectives that aided individuals and communities in persevering under the dehumanization of slavery and oppression. As African Americans embraced Christianity beginning in the 18th century, especially after 1770, they gathered in independent church communities and created larger denominational structures such as the African Methodist Episcopal Church, the African Methodist Episcopal Zion Church, and the National Baptist Convention. These churches and denominations became significant arenas for spiritual support, educational opportunity, economic development, and political activism. Black religious institutions served as contexts in which African Americans made meaning of the experience of enslavement, interpreted their relationship to Africa, and charted a vision for a collective future. The early 20th century saw the emergence of new religious opportunities as increasing numbers of African Americans turned to Holiness and Pentecostal churches, drawn by the focus on baptism in the Holy Spirit and enthusiastic worship that sometimes involved speaking in tongues. The Great Migration of southern blacks to southern and northern cities fostered the development of a variety of religious options outside of Christianity. Groups such as the Moorish Science Temple and the Nation of Islam, whose leaders taught that Islam was the true religion of people of African descent, and congregations of Ethiopian Hebrews promoting Judaism as the heritage of black people, were founded in this period. 
Early-20th-century African American religion was also marked by significant cultural developments as ministers, musicians, actors, and other performers turned to new media, such as radio, records, and film, to contribute to religious life. In the post–World War II era, religious contexts supported the emergence of the modern Civil Rights movement. Black religious leaders emerged as prominent spokespeople for the cause and others as vocal critics of the goal of racial integration, as in the case of the Nation of Islam and religious advocates of Black Power. The second half of the 20th century and the early 21st century saw new religious diversity as a result of immigration and cultural transformations within African American Christianity with the rise of megachurches and televangelism.
Jimmy Carter’s “Crisis of Confidence” speech of July 1979 was a critical juncture in post-1945 U.S. politics, but it also marks an exemplary pivot in post-1945 religion. Five dimensions of faith shaped the president’s sermon. The first concerned the shattered consensus of American religion. When Carter encouraged Americans to recapture a spirit of unity, he spoke in a heartfelt but spent language more suitable to Dwight Eisenhower’s presidency than his own. By 1979, the Protestant-Catholic-Jewish consensus of Eisenhower’s time was fractured into a dynamic pluralism, remaking American religion in profound ways. Carter’s speech revealed a second revolution of post-1945 religion when it decried religion’s polarization and politicization. Carter sought to heal ruptures that were dividing the nation between what observers, two decades hence, would label “red” (conservative Republican) and “blue” (liberal Democratic) constituencies. Yet his endeavors failed, as would be evidenced in the religious politics of Ronald Reagan’s era, which followed. Carter championed community values as the answer to his society’s problems, aware of yet a third dawning reality: globalization. The virtues of localism that Carter espoused were in fact implicated in (and complicated by) transnational forces of change that saw immigration, missionary enterprises, and state and non-state actors internationalizing the American religious experience. A fourth illuminating dimension of Carter’s speech was its critique of America’s gospel of wealth. Although this “born-again” southerner was a product of the evangelical South’s revitalized free-market capitalism, he lamented how laissez-faire Christianity had become America’s lingua franca. Finally, Carter wrestled with secularization, revealing a fifth feature of post-1945 America. Even though faith commitments were increasingly cordoned off from formal state functions during this time, the nation’s political discourse acquired a pronounced religiosity.
Carter contributed by framing mundane issues (such as energy) in moral contexts that drew no hard-and-fast boundaries between matters of the soul and governance. Drawn from the political and economic crises of his moment, Carter’s speech thus also reveals the all-enveloping tide of religion in America’s post-1945 age.
Kyle B. Roberts
From Cahokia to Newport, from Santa Fe to Chicago, cities have long exerted an important influence over the development of American religion; in turn, religion has shaped the life of America’s cities. Early visions of a New Jerusalem quickly gave way to a crowded spiritual marketplace full of faiths competing for the attention of a heterogeneous mass of urban consumers, although the dream of an idealized spiritual city never completely disappeared. Pluralism fostered toleration and freedom of religious choice, but also catalyzed competition and antagonism, sometimes resulting in violence. Struggles over political authority between established and dissenting churches gave way after the American Revolution to a contest over the right to exert moral authority through reform. Secularization, the companion of modernization and urbanization, did not toll the death knell for urban religion, but instead, provided the materials with which the religious engaged the city. Negative discursive constructions of the city proffered by a handful of religious reformers have long cast a shadow over the actual urban experience of most men and women. Historians continue to uncover the rich and innovative ways in which urban religion enabled individuals to understand, navigate, and contribute to the city around them.
Christopher D. Cantwell
Home to more than half the U.S. population by 1920, cities played an important role in the development of American religion throughout the 20th century. At the same time, the beliefs and practices of religious communities also shaped the contours of America’s urban landscape. Much as in the preceding three centuries, the economic development of America’s cities and the social diversity of urban populations animated this interplay. But the explosive, unregulated expansion that defined urban growth after the Civil War was met with an equally dramatic disinvestment from urban spaces throughout the second half of the 20th century. The domestic and European migrations that previously fueled urban growth also changed throughout the century, shifting from Europe and the rural Midwest to the Deep South, Africa, Asia, and Latin America after World War II. These newcomers not only brought new faiths to America’s cities but also contributed to the innovation of several new, distinctly urban religious movements. Urban development and diversity on one level promoted toleration and cooperation as religious leaders forged numerous ecumenical and, eventually, interfaith bonds to combat urban problems. But it also led to tension and conflict as religious communities busied themselves with carving out spaces of their own through tight-knit urban enclaves or new suburban locales. Contemporary American cities are some of the most religiously diverse communities in the world. Historians continue to uncover how religious communities not only have lived in but also have shaped the modern city.
Cara L. Burnidge
Since 2001, there has been a noticeable increase in the number of scholarly monographs dedicated to religion and foreign relations. More scholars and policymakers agree that religion is an important feature of foreign affairs, regardless of whether one thinks it ought to be. While policymakers and scholars often discuss “religion” as a single “lens” for understanding the world, religious traditions do not exist in isolation from the political, economic, or social and cultural aspects of life. Tracing religious influences on U.S. foreign policy, then, can lead scholars in a variety of directions. Scholars researching religious influences in foreign policy could consider theologies and creeds of religious organizations and figures, the rhetoric and rituals of national norms and civic values, the intersection of “sacred” and “secular” ideas and institutions, the service of individual policymakers and diplomats, international legal or military defenses for or against specific religious groups, or public discourse about religion, to name but a few options.
Advances in the study of religion and foreign policy will require collaboration and dialogue across traditional boundaries between disciplines, fields, and subfields. For many scholars, this means broadening research approaches and methods. Instead of prioritizing “first-order” and “second-order” causes, for instance, historians and social scientists could move beyond cause-effect relationships alone, complicating U.S. foreign relations by considering intersectional experiences and interstitial explanations. Rather than looking for “the” univocal religious influence, scholars might pay greater attention to the multiplicity of “religious” influences on a given topic. This will likely occur by reading and researching beyond one specific area of expertise. It will also require attention to differentiating between institutional and “popular” or “lived” religion; recognizing the disparities between the official dogma of a religious affiliation and ethnographic and empirical data on religious practice; and giving attention to the underlying assumptions at work when international organizations, national governments, and scholars choose to pay attention to certain forms of “religious” thought, behavior, and organization and not others.
Jane H. Hong
Laws barring Asians from legal immigration and naturalization in the United States began with the Chinese Exclusion Act of 1882 and expanded to include all other Asian groups by 1924. Beginning in World War II, U.S. lawmakers began to dismantle the Asian exclusion regime in response to growing international pressure and scrutiny of America’s racial policies and practices. The Japanese government sought to use the U.S. Asian exclusion laws to disrupt the Sino-American alliance of World War II, causing Washington officials to recognize these laws as a growing impediment to international diplomacy and the war effort. Later, the Soviet Union and other communist powers cited U.S. exclusion policies as evidence of American racial hypocrisy during the Cold War.
A diverse group of actors championed the repeal of Asian exclusion laws over the 1940s and early 1950s. They included former American missionaries to Asia, U.S. and Asian state officials, and Asian and Asian American activists. The movement argued for repeal legislation as an inexpensive way for the United States to demonstrate goodwill, counter foreign criticism, and rehabilitate America’s international image as a liberal democracy. Drawing upon the timely language and logic of geopolitics, advocates lobbied Congressional lawmakers to pass legislation ending the racial exclusion of Asians from immigration and naturalization eligibility, in support of U.S. diplomatic and security interests abroad.