
Article

Religion in African American History  

Judith Weisenfeld

Dynamic and creative exchanges among different religions, including indigenous traditions, Protestant and Catholic Christianity, and Islam, all with developing theologies and institutions, fostered substantial collective religious and cultural identities within African American communities in the United States. The New World enslavement of diverse African peoples and the cultural encounter with Europeans and Native Americans produced distinctive religious perspectives that aided individuals and communities in persevering under the dehumanization of slavery and oppression. As African Americans embraced Christianity beginning in the 18th century, especially after 1770, they gathered in independent church communities and created larger denominational structures such as the African Methodist Episcopal Church, the African Methodist Episcopal Zion Church, and the National Baptist Convention. These churches and denominations became significant arenas for spiritual support, educational opportunity, economic development, and political activism. Black religious institutions served as contexts in which African Americans made meaning of the experience of enslavement, interpreted their relationship to Africa, and charted a vision for a collective future. The early 20th century saw the emergence of new religious opportunities as increasing numbers of African Americans turned to Holiness and Pentecostal churches, drawn by the focus on baptism in the Holy Spirit and enthusiastic worship that sometimes involved speaking in tongues. The Great Migration of southern blacks to southern and northern cities fostered the development of a variety of religious options outside of Christianity. Groups such as the Moorish Science Temple and the Nation of Islam, whose leaders taught that Islam was the true religion of people of African descent, and congregations of Ethiopian Hebrews promoting Judaism as the heritage of black people, were founded in this period. Early-20th-century African American religion was also marked by significant cultural developments as ministers, musicians, actors, and other performers turned to new media, such as radio, records, and film, to contribute to religious life. In the post–World War II era, religious contexts supported the emergence of the modern Civil Rights movement. Some black religious leaders emerged as prominent spokespeople for the cause, while others, such as the Nation of Islam and religious advocates of Black Power, became vocal critics of the goal of racial integration. The second half of the 20th century and the early 21st century saw new religious diversity as a result of immigration and of cultural transformations within African American Christianity, including the rise of megachurches and televangelism.

Article

Spatial Segregation and Neighborhoods  

Carl Nightingale

During the 1890s, the word segregation became the preferred term for the practice of coercing different groups of people, especially those designated by race, to live in separate and unequal urban residential neighborhoods. In the southern states of the United States, segregationists imported the word—originally used in the British colonies of Asia—to describe Jim Crow laws, and, in 1910, whites in Baltimore passed a “segregation ordinance” mandating separate black and white urban neighborhoods. Copy-cat legislation sprang up in cities across the South and the Midwest. But in 1917, a multiracial team of lawyers from the fledgling National Association for the Advancement of Colored People (NAACP) mounted a successful legal challenge to these ordinances in the U.S. Supreme Court—even as urban segregation laws were adopted in other places in the world, most notably in South Africa. The collapse of the movement for legislated racial segregation in the United States occurred just as African Americans began migrating in large numbers into cities in all regions of the United States, resulting in waves of anti-black mob violence. Segregationists were forced to rely on nonstatutory or formally nonracial techniques. In Chicago, an alliance of urban reformers and real estate professionals invented alternatives to explicitly racist segregation laws. The practices they promoted nationwide created one of the most successful forms of urban racial segregation in world history, rivaling and finally outliving South African apartheid. Understanding how this system came into being and how it persists today requires understanding both how the Chicago segregationists were connected to counterparts elsewhere in the world and how they adapted practices of city-splitting to suit the peculiarities of racial politics in the United States.

Article

Nightlife in the City  

Peter C. Baldwin

Today the term nightlife typically refers to social activities in urban commercial spaces—particularly drinking, dancing, dining, and listening to live musical performances. This was not always so. Cities in the 18th and early 19th centuries knew relatively limited nightlife, most of it occurring in drinking places for men. Theater attracted mixed-gender audiences but was sometimes seen as disreputable in both its content and the character of the audience. Theater owners worked to shed this negative reputation starting in the mid-19th century, while nightlife continued to be tainted by the profusion of saloons, brothels, and gambling halls. Gradual improvements in street lighting and police protection encouraged people to go out at night, as did growing incomes and decreasing hours of labor. Nightlife attracted more women in the decades around 1900 as it expanded and diversified. Dance halls, vaudeville houses, movie theaters, restaurants, and cabarets thrived in the electrified “bright lights” districts of central cities. Commercial entertainment contracted again in the 1950s and 1960s as Americans spent more of their evening leisure hours watching television and began to regard urban public spaces with suspicion. Still, nightlife is viewed as an important component of urban economic life and is actively promoted by many municipal governments.

Article

The Vietnam War and American Military Strategy, 1965–1973  

Gregory A. Daddis

For nearly a decade, American combat soldiers fought in South Vietnam to help sustain an independent, noncommunist nation in Southeast Asia. After U.S. troops departed in 1973, the collapse of South Vietnam in 1975 prompted a lasting search to explain the United States’ first lost war. Historians of the conflict and participants alike have since critiqued the ways in which civilian policymakers and uniformed leaders applied—some argued misapplied—military power that led to such an undesirable political outcome. While some claimed U.S. politicians failed to commit their nation’s full military might to a limited war, others contended that most officers fundamentally misunderstood the nature of the war they were fighting. Still others argued that “winning” was essentially impossible given the true nature of a struggle over Vietnamese national identity in the postcolonial era. On their own, none of these arguments fully satisfies. Contemporary policymakers clearly understood the difficulties of waging a war in Southeast Asia against an enemy committed to national liberation. Yet the faith of these Americans in their power to resolve deep-seated local and regional sociopolitical problems eclipsed the possibility that there might be limits to that power. By asking those crafting military strategy to simultaneously fight a war and build a nation, senior U.S. policymakers demanded too much in pursuit of overly ambitious political objectives. In the end, the Vietnam War exposed the limits of what American military power could achieve in the Cold War era.

Article

Epidemics in Indian Country  

David S. Jones

Few developments in human history match the demographic consequences of the arrival of Europeans in the Americas. Between 1500 and 1900 the human populations of the Americas were transformed. Countless American Indians died as Europeans established themselves, and imported Africans as slaves, in the Americas. Much of the mortality came from epidemics that swept through Indian country. The historical record is full of dramatic stories of smallpox, measles, influenza, and acute contagious diseases striking American Indian communities, causing untold suffering and facilitating European conquest. Some scholars have gone so far as to invoke the irresistible power of natural selection to explain what happened. They argue that the long isolation of Native Americans from other human populations left them uniquely susceptible to the Eurasian pathogens that accompanied European explorers and settlers; nothing could have been done to prevent the inevitable decimation of American Indians. The reality, however, is more complex. Scientists have not found convincing evidence that American Indians had a genetic susceptibility to infectious diseases. Meanwhile, it is clear that the conditions of life before and after colonization could have left Indians vulnerable to a host of diseases. Many American populations had been struggling to subsist, with declining populations, before Europeans arrived; the chaos, warfare, and demoralization that accompanied colonization made things worse. Seen from this perspective, the devastating mortality was not the result of the forces of evolution and natural selection but rather stemmed from social, economic, and political forces at work during encounter and colonization. Getting the story correct is essential. American Indians in the United States, and indigenous populations worldwide, still suffer dire health inequalities. Although smallpox is gone and many of the old infections are well controlled, new diseases have risen to prominence, especially heart disease, diabetes, cancer, substance abuse, and mental illness. The stories we tell about the history of epidemics in Indian country influence the policies we pursue to alleviate them today.

Article

Federal Indian Law  

N. Bruce Duthu

United States law recognizes American Indian tribes as distinct political bodies with powers of self-government. Their status as sovereign entities predates the formation of the United States, and they are enumerated in the U.S. Constitution as among the subjects (along with foreign nations and the several states) with whom Congress may engage in formal relations. And yet, despite this long-standing recognition, federal Indian law remains curiously ambivalent, even conflicted, about the legal and political status of Indian tribes within the U.S. constitutional structure. On the one hand, tribes are recognized as sovereign bodies with powers of self-government within their lands. On the other, long-standing precedents of the Supreme Court maintain that Congress possesses plenary power over Indian tribes, with authority to modify or even eliminate their powers of self-government. These two propositions are in tension with one another and are at the root of the challenges faced by political leaders and academics alike in trying to understand and accommodate tribal rights to self-government. The body of laws that make up the field of federal Indian law includes select provisions of the U.S. Constitution (notably the so-called Indian Commerce Clause), treaties between the United States and various Indian tribes, congressional statutes, executive orders, regulations, and a complex and rich body of court decisions dating back to the nation’s formative years. The noted legal scholar Felix Cohen brought much-needed coherence and order to this legal landscape in the 1940s when he led a team of scholars within the Office of the Solicitor in the Department of the Interior to produce a handbook on federal Indian law. The revised edition of Cohen’s Handbook of Federal Indian Law is still regarded as the seminal treatise in the field. Critically, however, this rich body of law only hints at the real story in federal Indian law. The laws themselves serve as historical and moral markers in the ongoing clash between indigenous and nonindigenous societies and cultures still seeking to establish systems of peaceful coexistence in shared territories. It is a story about the limits of legal pluralism and the willingness of a dominant society and nation to acknowledge and honor its promises to the first inhabitants and first sovereigns.

Article

Indian Slavery  

Christina Snyder

The history of American slavery began long before the first Africans arrived at Jamestown in 1619. Evidence from archaeology and oral tradition indicates that for hundreds, perhaps thousands, of years prior, Native Americans had developed their own forms of bondage. This fact should not be surprising, for most societies throughout history have practiced slavery. In her cross-cultural and historical research on comparative captivity, Catherine Cameron found that bondspeople composed 10 percent to 70 percent of the population of most societies, lending credence to Seymour Drescher’s assertion that “freedom, not slavery, was the peculiar institution.” If slavery is ubiquitous, however, it is also highly variable. Indigenous American slavery, rooted in warfare and diplomacy, was flexible, often offering its victims escape through adoption or intermarriage, and it was divorced from racial ideology, deeming all foreigners—men, women, and children, of whatever color or nation—potential slaves. Thus, Europeans did not introduce slavery to North America. Rather, colonialism brought distinct and evolving notions of bondage into contact with one another. At times, these slaveries clashed, but they also reinforced and influenced one another. Colonists, who had a voracious demand for labor and export commodities, exploited indigenous networks of captive exchange, producing a massive global commerce in Indian slaves. This began with the second voyage of Christopher Columbus in 1495 and extended in some parts of the Americas through the twentieth century. During this period, between 2 and 4 million Indians were enslaved. Elsewhere in the Americas, Indigenous people adapted Euro-American forms of bondage. In the Southeast, an elite class of Indians began to hold African Americans in transgenerational slavery and, by 1800, developed plantations that rivaled those of their white neighbors. The story of Native Americans and slavery is complicated: millions were victims, some were masters, and the nature of slavery changed over time and varied from one place to another. A significant and long overlooked aspect of American history, Indian slavery shaped colonialism, exacerbated Native population losses, figured prominently in warfare and politics, and influenced Native and colonial ideas about race and identity.

Article

The Separation of Church and State in the United States  

Steven K. Green

Separation of church and state has long been viewed as a cornerstone of American democracy. At the same time, the concept has remained highly controversial in popular culture and law. Much of the debate over the application and meaning of the phrase focuses on its historical antecedents. This article briefly examines the historical origins of the concept and its subsequent evolution in the nineteenth century.

Article

Food and Agriculture in the 20th and 21st Centuries  

Gabriella M. Petrick

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles. The application of science and technology to food production from the field to the kitchen cabinet, or even more crucially the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods. As at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values as each side deems some foods and diets virtuous, and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.

Article

Racial Diversity and Suburban Politics in 20th-Century Los Angeles  

Hillary Jenks

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. Despite its cultivated reputation as the nation’s “white spot” in the early 20th century, Southern California was in fact home to diverse and numerous communities of color, some composed of relatively new immigrants and some long predating the era of Anglo settlement and conquest. In the years following World War II, the region engaged in suburban home construction on a mass scale and became a global symbol of what Dolores Hayden called the economically democratic but racially exclusive “sitcom suburb,” from the tax-lowering mechanism of its “Lakewood plan” to the car-friendly “Googie” architecture of the San Fernando Valley. Existing suburban communities of color, such as the colonias of agricultural laborers, were engulfed by new settlements, while upwardly mobile African Americans, Latinas/Latinos, and Asian Americans sought access to the expanding suburban dream of homeownership, with varying degrees of success. The political responses to suburban diversity in metropolitan Los Angeles ranged from Anglo resistance and flight to multiracial political coalitions and the incorporation of people of color at multiple levels of local government. The ascent of a number of suburbanites of color to positions of local and regional political power from the 1960s through the 1980s sometimes exposed intra-ethnic discord and sometimes the fragility of cross-race coalition as multiple actors sought to protect property values and to pursue economic security within the competitive constraints of shrinking municipal resources, aging infrastructure, and a receding suburban fringe. As a result, political conflicts over crime, immigration, education, and inequality emerged in many Los Angeles County suburbs by the 1970s and later in the more distant corporate suburbs of Orange, Ventura, Riverside, and San Bernardino Counties. The suburbanization of poverty, the role of suburbs as immigrant gateways, and the emergence of “majority-minority” suburbs—all national trends by the late 1990s and the first decade of the 21st century—were evident far earlier in the Los Angeles metropolitan region, where diverse suburbanites negotiated social and economic crises and innovated political responses.