1-15 of 15 Results

  • Keywords: Native American

Article

The region that today constitutes the United States–Mexico borderland has evolved through various systems of occupation over thousands of years. Beginning in time immemorial, the land was used and inhabited by ancient peoples whose cultures we can only understand through the archeological record and the beliefs of their living descendants. Spain, then Mexico and the United States after it, attempted to control the borderlands but failed when confronted with indigenous power, at least until the late 19th century when American capital and police established firm dominance. Since then, borderland residents have often fiercely contested this supremacy at the local level, but the borderland has also, due to the primacy of business, expressed deep harmonies and cooperation between the U.S. and Mexican federal governments. It is a majority-minority zone in the United States, populated largely by Mexican Americans. The border is both a porous membrane across which tremendous wealth passes and a territory of interdiction in which noncitizens and smugglers are subject to unusually concentrated police attention. All of this exists within a particularly harsh ecosystem characterized by extreme heat and scarce water.

Article

Death is universal yet is experienced in culturally specific ways. Because of this, when individuals in colonial North America encountered others from different cultural backgrounds, they were curious about how unfamiliar mortuary practices resembled and differed from their own. This curiosity spawned communication across cultural boundaries. The resulting knowledge sometimes facilitated peaceful relations between groups, while at other times it helped one group dominate another. Colonial North Americans endured disastrously high mortality rates caused by disease, warfare, and labor exploitation. At the same time, death was central to the religions of all residents: Indians, Africans, and Europeans. Deathways thus offer an unmatched way to understand the colonial encounter from the participants’ perspectives.

Article

The American War for Independence lasted eight years. It was one of the longest and bloodiest wars in America’s history, and yet it was not such a protracted conflict merely because the might of the British armed forces was brought to bear on the hapless colonials. The many divisions among Americans themselves over whether to fight, what to fight for, and who would do the fighting often had tragic and violent consequences. The Revolutionary War was by any measure the first American civil war. Yet national narratives of the Revolution and even much of the scholarship on the era focus more on simple stories of a contest between the Patriots and the British. Loyalists and other opponents of the Patriots are routinely left out of these narratives, or given short shrift. So, too, are the tens of thousands of ordinary colonists—perhaps a majority of the population—who were disaffected or alienated from either side or who tried to tack between the two main antagonists to make the best of a bad situation. Historians now estimate that as many as three-fifths of the colonial population were neither active Loyalists nor Patriots. When we take the war seriously and begin to think about narratives that capture the experience of the many, rather than the few, an illuminating picture emerges. The remarkably wide scope of the activities of the disaffected during the war—ranging from nonpayment of taxes to draft dodging and even to armed resistance to protect their neutrality—has to be integrated with older stories of militant Patriots and timid Loyalists. Only then can we understand the profound consequences of disaffection—particularly in creating divisions within the states, increasing levels of violence, prolonging the war, and changing the nature of the political settlements in each state. Indeed, the very divisions among diverse Americans that made the War for Independence so long, bitter, and bloody also explain much of the Revolutionary energy of the period. Though it is not as seamless as traditional narratives of the Revolution would suggest, a more complicated story also helps better explain the many problems the new states and eventually the new nation would face. In making this argument, we may finally suggest ways we can overcome what John Shy long ago noted as the tendency of scholars to separate the ‘destructive’ War for Independence from the ‘constructive’ political Revolution.

Article

In the years following the US Civil War, the federal government implemented a campaign to assimilate Native peoples into an expanding American nation and a modernizing American society. As policymakers and social reformers understood it, assimilation required a transformation in Native gender roles, and as a result, Native American women were the targets of several assimilationist initiatives. Native women navigated federal interventions strategically, embracing what was useful, accommodating what was necessary, and discarding what was not. As mothers, grandmothers, and healers, women provided stability for families and communities enduring disruption and coerced change. In the 20th century, Native women embraced new economic and political roles even as they adapted long-standing customs. Many began working for wages; although often confined to menial labor such as domestic service in other women’s homes, growing numbers of Native women also pursued white-collar occupations in the Bureau of Indian Affairs and later in tribal governments. As tribal governance evolved over the course of the century, some women obtained positions on tribal councils and tribal courts. Native women have also made intellectual contributions—as tribal members and ultimately as American citizens—to modern understandings of democracy, citizenship, sovereignty, and feminism. Since the late 20th century, Native women have been at the forefront of movements to revitalize Indigenous languages and cultures.

Article

James Taylor Carson

The European invasion of the continent to which we now refer as North America unfolded in several different ways, each with its own particular implications. Yet no matter their differences, each colonial effort drew upon the same moral, intellectual, and material premises necessary to justify and enact the dispossession of the land’s first peoples. From religious arguments about Christianity extirpating “savage devils” from New England or Jamestowners’ obsession with finding gold and precious minerals to the introduction of new species of plants and animals across the continent and imperial assertions of sovereignty, the European invasion of America touched every facet of the lives of the first peoples and colonizers it had brought together. Examining how first peoples represented their land and how European invaders and their later American successors countered such mapping practices with their own cartographical projections affords an important way to understand a centuries-long process of place-making and place-taking too often glossed as colonization.

Article

“Twenty and odd” Africans arrived in Virginia aboard a Dutch vessel in 1619, shortly after permanent colonization of the English Americas began. There has been significant academic debate about whether the enslavement of peoples of African descent in England’s early 17th-century colonies was an inevitable or “unthinking decision” and about the nature and degree of anti-black racism during the 17th century. The legal and social status of African peoples was more flexible at first in the English colonies than it later became. Some Africans managed to escape permanent enslavement, and a few Africans, such as Anthony Johnson, even owned servants of their own. There was no legal basis for enslavement in the British Americas for the first several decades of settlement, and slave and servant codes emerged only gradually. Labor systems operated by custom rather than through any legal mechanisms of coercion. Most workers in the Americas experienced degrees of coercion. In the earliest years of plantation production, peoples from Africa, Europe, and the Americas often toiled alongside each other in the fields. Large numbers of Native Americans were captured and forced to work on plantations in the English Americas, and many whites worked in agricultural fields as indentured and convict laborers. There were many kinds of coerced labor beyond enslavement in the 17th century, and ideas about racial difference had yet to become as determinative as they would later be. As the staple crop plantation system matured and became entrenched on the North American mainland in the late 17th and early 18th centuries and planters required a large and regular supply of slaves, African laborers became synonymous with large-scale plantation production. The permeable boundaries between slavery and freedom disappeared, dehumanizing racism became more entrenched, and U.S.-based planters developed slave codes premised on racial distinctions and legal mechanisms of coercion that were modeled on Caribbean precedents.

Article

Emily Suzanne Clark

Religion and race provide rich categories of analysis for American history. Neither category is stable. They change, shift, and develop in light of historical and cultural contexts. Religion has played a vital role in the construction, deconstruction, and transgression of racial identities and boundaries. Race is a social concept and a means of classifying people. The “natural” and “inherent” differences between races are human constructs, social taxonomies created by cultures. In American history, the construction of racial identities and racial differences begins with the initial encounters between Europeans, Native Americans, and Africans. Access to and use of religious and political power has shaped how race has been conceived in American history. Racial categories and religious affiliations influenced how groups regarded each other throughout American history, with developments in the colonial period offering prime examples. Enslavement of Africans and their descendants, as well as conquered Native Americans, displayed the power of white Protestants. Even 19th-century American anti-Catholicism and anti-Mormonism intersected with racial identifications. At the same time, just as religion has supported racial domination in American history, it also has inspired calls for self-determination among racial minorities, most notably in the 20th century. With the long shadow of slavery, the power of white supremacy, the emphasis on Native sovereignty, and the civil rights movement, much of the story of religion and race in American history focuses on Americans white, black, and red. However, this is not the whole story. Mexican-Americans and Latinx immigrants bring Catholic and transnational connections, but their presence has prompted xenophobia. Additionally, white Americans sought to restrict the arrival of Asian immigrants both legally and culturally. With the passage of the Immigration and Nationality Act of 1965, the religious, racial, and ethnic diversity of the United States increased further. This religious and racial pluralism in many ways reflects the diversity of America, as does the conflict that comes with it.

Article

Euro-Americans existed firmly on the periphery of an Indigenous North America in 1763, hubristic claims of continental sovereignty notwithstanding. Nowhere is this reality more clear than in the Ohio Valley and Illinois Country. Try as it might, the post-1763 British Empire could not assume jurisdictional control over this space. Even to begin to try was a task requiring significant investment—both in terms of more systematic Indigenous diplomacy and in terms of reforming colonial political structures unfit to accommodate imperial western policy. North American officials understood the problems quite well and were willing to spearhead reform. Between 1763 and 1775 they supported increased investment to defray North American expenses. They called for programs that would end colonial corruption, something they feared undermined Indigenous diplomacy and made a mockery of the rule of law. Ultimately, they concluded that centralizing Indian affairs offered the best means by which to stabilize North America. Colonials (generally) and speculators and their surveyor corps (specifically) powerfully disagreed, however, seeing Indian country as an untapped resource and imperial restraints as threats to local autonomy. They rejected the idea of centralizing power over Indigenous affairs and used the rhetoric of British constitutional liberty to reframe corrupt behavior into something it emphatically was not.

Article

Since the social sciences began to emerge as scholarly disciplines in the last quarter of the 19th century, they have frequently offered authoritative intellectual frameworks that have justified, and even shaped, a variety of U.S. foreign policy efforts. They played an important role in U.S. imperial expansion in the late 19th and early 20th centuries. Scholars devised racialized theories of social evolution that legitimated the confinement and assimilation of Native Americans and endorsed civilizing schemes in the Philippines, Cuba, and elsewhere. As attention shifted to Europe during and after World War I, social scientists working at the behest of Woodrow Wilson attempted to engineer a “scientific peace” at Versailles. The desire to render global politics the domain of objective, neutral experts intensified during World War II and the Cold War. After 1945, the social sciences became increasingly central players in foreign affairs, offering intellectual frameworks—like modernization theory—and bureaucratic tools—like systems analysis—that shaped U.S. interventions in developing nations, guided nuclear strategy, and justified the increasing use of the U.S. military around the world. Throughout these eras, social scientists often reinforced American exceptionalism—the notion that the United States stands at the pinnacle of social and political development, and as such has a duty to spread liberty and democracy around the globe. The scholarly embrace of conventional political values was not the result of state coercion or financial co-optation; by and large social scientists and policymakers shared common American values. But other social scientists used their knowledge and intellectual authority to critique American foreign policy. The history of the relationship between social science and foreign relations offers important insights into the changing politics and ethics of expertise in American public policy.

Article

David S. Jones

Few developments in human history match the demographic consequences of the arrival of Europeans in the Americas. Between 1500 and 1900 the human populations of the Americas were transformed. Countless American Indians died as Europeans established themselves, and imported Africans as slaves, in the Americas. Much of the mortality came from epidemics that swept through Indian country. The historical record is full of dramatic stories of smallpox, measles, influenza, and acute contagious diseases striking American Indian communities, causing untold suffering and facilitating European conquest. Some scholars have gone so far as to invoke the irresistible power of natural selection to explain what happened. They argue that the long isolation of Native Americans from other human populations left them uniquely susceptible to the Eurasian pathogens that accompanied European explorers and settlers; nothing could have been done to prevent the inevitable decimation of American Indians. The reality, however, is more complex. Scientists have not found convincing evidence that American Indians had a genetic susceptibility to infectious diseases. Meanwhile, it is clear that the conditions of life before and after colonization could have left Indians vulnerable to a host of diseases. Many American populations had been struggling to subsist, their numbers declining, before Europeans arrived; the chaos, warfare, and demoralization that accompanied colonization made things worse. Seen from this perspective, the devastating mortality was not the result of the forces of evolution and natural selection but rather stemmed from social, economic, and political forces at work during encounter and colonization. Getting the story correct is essential. American Indians in the United States, and indigenous populations worldwide, still suffer dire health inequalities. Although smallpox is gone and many of the old infections are well controlled, new diseases have risen to prominence, especially heart disease, diabetes, cancer, substance abuse, and mental illness. The stories we tell about the history of epidemics in Indian country influence the policies we pursue to alleviate them today.

Article

The issue of genocide and American Indian history has been contentious. Many writers see the massive depopulation of the indigenous population of the Americas after 1492 as a clear-cut case of genocide. Other writers, however, contend that European and U.S. actions toward Indians were deplorable but were rarely if ever genocidal. To a significant extent, disagreements about the pervasiveness of genocide in the history of the post-Columbian Western Hemisphere, in general, and U.S. history, in particular, pivot on definitions of genocide. Conservative definitions emphasize intentional actions and policies of governments that result in very large population losses, usually from direct killing. More liberal definitions call for less stringent criteria for intent, focusing more on outcomes. They do not necessarily require direct sanction by state authorities; rather, they identify societal forces and actors. They also allow for several intersecting forces of destruction, including dispossession and disease. Because debates about genocide easily devolve into quarrels about definitions, an open-ended approach to the question of genocide that explores several phases and events provides the possibility of moving beyond the present stalemate. However one resolves the question of genocide in American Indian history, it is important to recognize that European and U.S. settler colonial projects unleashed massively destructive forces on Native peoples and communities. These include violence resulting directly from settler expansion, intertribal violence (frequently aggravated by colonial intrusions), enslavement, disease, alcohol, loss of land and resources, forced removals, and assaults on tribal religion, culture, and language. The configuration and impact of these forces varied considerably in different times and places according to the goals of particular colonial projects and the capacities of colonial societies and institutions to pursue them. The capacity of Native people and communities to directly resist, blunt, or evade colonial invasions proved equally important.

Article

Nicolas G. Rosenthal

An important relationship has existed between Native Americans and cities from pre-Columbian times to the early 21st century. Long before Europeans arrived in the Americas, indigenous peoples developed societies characterized by dense populations, large-scale agriculture, monumental architecture, and complex social hierarchies. Following European and American conquest and colonization, Native Americans played a crucial role in the development of towns and cities throughout North America, often on the site of former indigenous settlements. Beginning in the early 20th century, Native Americans began migrating from reservations to U.S. cities in large numbers and formed new intertribal communities. By 1970, the majority of the Native American population lived in cities and the numbers of urban American Indians have been growing ever since. Indian Country in the early 21st century continues to be influenced by the complex and evolving ties between Native Americans and cities.

Article

The United States has engaged with Indigenous nations on a government-to-government basis via federal treaties representing substantial international commitments since the origins of the republic. The first treaties sent to the Senate for ratification under the Constitution of 1789 were treaties with Indigenous nations. Treaties with Indigenous nations provided the means by which approximately one billion acres of land entered the national domain of the United States prior to 1900, at an average price of seventy-five cents per acre; the United States confiscated or claimed another billion acres of Indigenous land without compensation. Despite subsequent efforts of American federal authorities to alter these arrangements, the weight of evidence indicates that the relationship remains primarily a nation-to-nation association. Integrating the history of federal relations with Indigenous nations into American foreign relations history sheds important new light on the fundamental linkages between these seemingly distinct state practices from the beginnings of the American republic.

Article

Laurie Arnold

Indian gaming, also called Native American casino gaming or tribal gaming, is tribal government gaming. It is government gaming built on sovereignty and consequently is a corollary to state gambling such as lotteries rather than to corporate gaming. While the types of games offered in casinos might differ in format from ancestral indigenous games, gaming itself is a cultural tradition in many tribes, including those who operate casino gambling. Native American casino gaming is a $33.7 billion industry operated by nearly 250 distinct tribes in twenty-nine states in the United States. The Indian Gaming Regulatory Act (IGRA) of 1988 provides the framework for tribal gaming, and the most important case law in Indian gaming remains Seminole Tribe of Florida v. Butterworth, decided by the US Fifth Circuit Court of Appeals, and the US Supreme Court decision in California v. Cabazon Band of Mission Indians.

Article

Malinda Maynor Lowery

The Lumbee tribe of North Carolina, with approximately 55,000 enrolled members, is the largest Indian community east of the Mississippi River. Lumbee history serves as a window into the roles that Native people have played in the struggle to implement the founding principles of the United States, not just as “the First Americans,” but as members of their own nations, operating in their own communities’ interests. When we see US history through the perspectives of Native nations, we see that the United States is not only on a quest to expand rights for individuals. Surviving Native nations like the Lumbees, who have their own unique claims on this land and its ruling government, are forcing Americans to confront the ways in which their stories, their defining moments, and their founding principles are flawed and inadequate. We know the forced removals, the massacres, the protests that Native people have lodged against injustice, yet such knowledge is not sufficient to understand American history. Lumbee history provides a way to honor, and complicate, American history by focusing not just on the dispossession and injustice visited upon Native peoples, but on how and why Native survival matters. Native nations are doing the same work as the American nation—reconstituting communities, thriving, and finding a shared identity with which to achieve justice and self-determination. Since the late 19th century, Lumbee Indians have used segregation, war, and civil rights to maintain a distinct identity in the biracial South. The Lumbees’ survival as a people, a race, and a tribal nation shows that their struggle has revolved around autonomy, or the ability to govern their own affairs. They have sought local, state, and federal recognition to support that autonomy, but doing so has entangled the processes of survival with outsiders’ ideas about what constitutes a legitimate Lumbee identity. Lumbees continue to adapt to the constraints imposed on them by outsiders, strengthening their community ties through the process of adaptation itself. Lumbee people find their cohesion in the relentless fight for self-determination. Always, that struggle has mattered more than winning or losing a single battle.