1–10 of 35 Results for: Environmental History

Article

Mining and the Environment  

George Vrtis and Kirke Elsass

Human societies have always depended upon minerals. Even a cursory glance at any world history textbook reveals the centrality of minerals—and thus mining—in forging human cultures in antiquity. The Stone Age, the Bronze Age, the Iron Age—each of these complex periods is defined, in part, by deepening human engagement with the minerals that characterize it. In the history of the United States, mineral extraction has likewise been fundamental to the nature and evolution of the nation, as well as to its colonial and Indigenous pasts. Mining was present in Native American societies prior to European contact, accelerated with European colonization, became foundational with the advent of the industrial revolution, was renegotiated with the rise of the environmental movement, and has influenced the United States’ relationship with many parts of the world. As these developments have unfolded, mining and mineral use have radically altered the natural world and profoundly shaped people’s relationships with the environment and with one another.

Article

El Paso  

Alberto Wilson

El Paso, Texas, sits on the northern bank of the Rio Grande along the international boundary between Mexico and the United States, near the meeting point of the states of Texas, New Mexico, and Chihuahua. Its location makes El Paso a major urban center in the US Southwest and a key border city, and together with Ciudad Juárez, Chihuahua, the cities form the largest border metroplex in the western hemisphere. On lands formerly occupied by the Manso and Suma peoples, the collision between Spanish imperial design and Native stewardship began in the mid-17th century, as civil and religious authorities from New Mexico established a southern settlement along the river to provide a place of rest and security for the trade and travel making its way from the mineral-rich regions of New Spain to the far-flung colony. Because of seasonal flooding, initial settlement occurred on the southern bank of the river, in what is now Ciudad Juárez; the river also provided a natural barrier against Apache raids. El Paso remained a crossroads into the national period of the 19th century as the settlements began to experience the expansion of state power and market relations in North America. The competing national designs of Mexico and the United States collided in war from 1846 to 1848, resulting in the redrawing of national borders that turned El Paso and Ciudad Juárez into border cities. In the 20th century, industrial capitalism, migration, and state power linked these peripheral cities to national and international markets, and El Paso–Ciudad Juárez became the largest binational, bicultural community along the US–Mexico border. In 2020, the decennial censuses of Mexico and the United States counted a combined 2.5 million residents in the region, with over eight hundred thousand of those residing in El Paso.

Article

Indigenous Peoples and the Environment to 1890  

Marsha Weisiger

Indigenous peoples have had profound spiritual and ethical relationships with their environments, but they necessarily altered ecosystems as they fed, clothed, and sheltered themselves and traded goods, long before European colonists arrived. Their impacts became broader in scope and scale under settler colonialism, which corrupted and constrained their environmental relationships. The history of Indigenous peoples and their environments, to be sure, is not a single narrative but a constellation of stories that converge and diverge. Nonetheless, an analysis of the environmental histories of only a fraction of the more than 575 Indigenous groups, including Alaska Natives and Native Hawaiians, reveals major trends and commonalities. The environmental historiography of the First Peoples from their beginnings in what is now the United States roughly through the 19th century provides an opportunity to address such topics as the myth of the “Ecological Indian,” ancient urban societies, the introduction of European livestock and disease, and subsistence through agriculture, hunting, and fishing. The history of dispossession in the late 19th century and the environmental history of Indigenous peoples in the recent era may be found in “Indigenous Peoples and the Environment since 1890.”

Article

Indigenous Peoples and the Environment since 1890  

Marsha Weisiger

By the late 19th century, the Indigenous peoples of what became the United States, in an effort to avoid utter genocide, had ceded or otherwise lost their land and control of their natural resources, often through treaties with the United States. Ironically, those treaties, while frequently abrogated by federal fiat, made possible a resurgence of Native nationhood beginning in the 1960s, along with the restoration of Indigenous reserved treaty rights to hunt and fish in their homelands and manage their natural resources. The history of Indigenous peoples and their environments, however, is not a single narrative but a constellation of stories that converge and diverge. This article is thus not comprehensive but focuses on major trends and commonalities from the mid- to late 19th century through the early 21st century, with examples drawn from the environmental histories of a fraction of the more than 575 Indigenous groups, including Alaska Natives and Native Hawaiians. Topics include dispossession and displacement; the promise of the Indian New Deal; the trauma of the Termination Era; the reemergence of Indigenous sovereignty based on treaty rights; the management of forests, minerals, and water; and the rise of Indigenous leadership in the environmental justice movement. For the period before the establishment of reservations for Indigenous people, see “Indigenous Peoples and the Environment to 1890.”

Article

Climate and Climate Change in Early America  

Matthew Mulcahy

European colonization of eastern North America and the Caribbean occurred against the backdrop of the Little Ice Age (LIA), a period between roughly 1300 and 1850 ce that witnessed generally colder conditions than in earlier and later centuries. Alone or in combination, shorter growing seasons associated with colder temperatures and periods of intense drought influenced Indigenous societies prior to the arrival of Europeans, interactions and conflicts between Europeans and Native Americans, and the development of colonial societies across the 16th, 17th, and 18th centuries. Farther south in the Caribbean region, climatic threats such as hurricanes and droughts created distinct challenges to colonists as they sought to establish large-scale plantations worked by enslaved Africans. Such threats forced Europeans to alter their expectations and perceptions of the climate in North America and the Caribbean. Following the principle that locations at the same latitude would have the same climate, Europeans had anticipated that places like Virginia would have a climate similar to Spain’s, but that was not the case. As they adjusted to new American climate realities, colonists remained confident they could change the climate for the better. Far from a threat, human-induced climate change seemed to many colonists a desirable goal, one that marked the degree to which they might improve and civilize the “wilderness” of the New World. However, colonists also became aware of some negative consequences associated with their activities.

Article

Environmental History of New England  

Richard Judd

New England’s first human inhabitants arrived around 12,000 years ago and adopted a nomadic life in response to a rapidly changing postglacial environment. They were followed by Archaic and Woodland cultures, the latter innovating a form of corn-beans-squash cultivation called “three sisters.” European colonists appeared first in small fishing and fur-trading posts and then in larger numbers at Plymouth and Massachusetts Bay. The nascent fur-trading, farming, fishing, and logging economies disrupted regional ecosystems. Colonization weakened Native society through epidemics, ecological disruptions, enslavement, and wars, and yet Indigenous people persevered in family bands and small communities and sustained their identity through extended kinship ties. English husbandry shifted gradually to market production after the American Revolution, which brought further ecological disruptions. The early 19th century saw the rise of equally intrusive fishing and logging practices, which intensified at century’s end with the introduction of pulp and paper production, marine engines, and new trawling equipment. New England’s Industrial Revolution began in the 1790s in the Blackstone Valley and spread from there into central New England, where more powerful rivers gave rise to gigantic textile mills. The cultural disorientation brought on by industrialization triggered the Romantic movement, epitomized by Transcendentalist discourse on the truths intuited through the contemplation of nature. The Romantic recasting of nature provided intellectual impetus for pioneering fisheries- and forest-conservation efforts. In cities, conservation brought, among other things, landscaped parks such as Boston’s Emerald Necklace. Mirroring its approach to conservation, New England pioneered several forms of environmental activism, including private land trusts, cultural landscape preservation, heritage parks, and environmental justice movements. New England has also “re-wilded” several of its rivers by removing dams to renew migratory fish runs.

Article

Agriculture and Rural Life in the South, 1900–1945  

William Thomas Okie

The period from 1900 to 1945 was characterized by both surprising continuity and dramatic change in southern agriculture. Unlike the rest of the nation, which urbanized and industrialized at a rapid pace in the late 19th century, the South remained overwhelmingly rural and poor from the 1880s through the 1930s. But by 1945, the region was beginning to urbanize and industrialize into a recognizably modern South, with a population concentrated in urban centers, industries taking hold, and agriculture following the larger-scale, mechanized trend common in other farming regions of the country. Three overlapping factors explain this long lag followed by rapid transformation. First, the cumulative effects of two centuries of land-extensive, staple-crop agriculture and white supremacy had sapped the region of much of its fertility and limited its options for prosperity. Second, in response to this “problem South,” generations of reformers sought to modernize the region, along with other rural areas around the world. These piecemeal efforts became the foundation for the South’s dramatic transformation by the federal policy known as the New Deal. Third, poor rural southerners, both Black and white, left the countryside in increasing numbers. Coupled with the labor demands created by two major military conflicts, World War I and World War II, this movement aided and abetted the mechanization of agriculture and the depopulation of the rural South.

Article

The Problem of Fire in the American City, 1750–Present  

Anna Rose Alexander

Fires have plagued American cities for centuries. During the 18th century, the Great Fire of Boston (1760), the First Great Fire of New York City (1776), the First Great New Orleans Fire (1788), and the Great Fire of Savannah (1796) each destroyed hundreds of buildings and challenged municipal authorities to improve safety in an increasingly risky environment. Beginning in the 19th century, with increasing commerce, rapid urbanization, and the rise of industrial capitalism, fires became more frequent and destructive. Several initiatives sought to reduce the risk of fire: volunteer fire companies emerged in all major cities, fire insurance developed to help economic recovery, and municipal infrastructure like fire hydrants became ubiquitous to combat blazes. Despite significant efforts to curb this growing urban problem, fire dangers increased in the late 19th century as cities became epicenters of industry and their populations boomed. The “great” fires of the late 19th and early 20th centuries, like those that took place in Chicago (1871), Boston (1872), Seattle (1889), Baltimore (1904), and San Francisco (1906), fundamentally altered cities. The fires not only destroyed buildings and took lives, but they also unearthed deep-rooted social tensions. Rebuilding in the aftermath of fire further exacerbated inequalities and divided cities. While fire loss tapered off after 1920, other issues surrounding urban fires heated up. The funneling of resources to suburbs in the postwar white-flight period left inner cities ill-equipped to handle serious conflagrations. In the last few decades, suburban sprawl has created exurban fire regimes, where wildfires collide with cities. Extreme weather events, dependence on fossil fuels, deregulation of risky industries, and a lack of safe and affordable housing have put American metropolitan areas on a path toward another period of “great” fires like those of the late 19th and early 20th centuries.

Article

Appalachian War on Poverty and the Working Class  

Jessica Wilkerson

In 1964, President Lyndon B. Johnson announced an unconditional “war on poverty.” On one of his first publicity tours promoting his antipoverty legislation, he traveled to cities and towns in Appalachia, which would become crucial areas for promoting and implementing the legislation. Johnson soon signed the Economic Opportunity Act, a piece of legislation that provided a structure for communities to institute antipoverty programs, from vocational services to early childhood education programs, and encouraged the creation of new initiatives. In 1965, Johnson signed the Appalachian Regional Development Act, making Appalachia the only region targeted by federal antipoverty legislation, through the creation of the Appalachian Regional Commission. The Appalachian War on Poverty was not only a set of policies created by government agencies but also a series of community movements and campaigns, led by working-class people, that responded to those policies. When the War on Poverty began, the language of policymakers suggested that people living below the poverty line would be served by the programs. But as the antipoverty programs expanded and more local people became involved, they spoke openly and in political terms about poverty as a working-class issue. They drew attention to the politics of class in the region, where elites and absentee landowners became wealthy on the backs of working people. They demanded meaningful participation in shaping the War on Poverty in their communities, and, increasingly, when they used the term “poor people,” they did so as a collective class identity—working people who were poor due to a rigged economy. While many public officials focused on economic development policies, men and women living in the region began organizing around issues ranging from surface mining to labor rights and responding to poor living and working conditions.
Taking advantage of federal antipoverty resources and the spirit of change that animated the 1960s, working-class Appalachians would help to shape the antipoverty programs at the local and regional level, creating a movement in the process. They did so as they organized around issues—including the environment, occupational safety, health, and welfare rights—and as they used antipoverty programs as a platform to address the systemic inequalities that plagued many of their communities.

Article

Chemical and Biological Weapons Policy  

Thomas I. Faith

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons differ in their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation-states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions on the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies continue to significantly influence U.S. policies in the Middle East and the fight against terrorism.