The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist and Yosemite enthusiast and himself a proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.
Christopher J. Castañeda
The modern oil industry began in 1859 with Edwin Drake’s discovery of oil at Titusville, Pennsylvania. Since then, this dynamic industry has experienced dramatic episodes of growth, aggressive competition for market share, various forms of corporate organization and cartel-like agreements, and governmental efforts at regulation and control, as well as monopoly, mergers, and consolidation. The history of the oil industry reflects its capital-intensive nature. Immense sums of money are spent on oil discovery, production, and refining projects. Marketing, transportation, and distribution systems likewise require enormous amounts of financing and logistical planning. Although oil is often produced in conjunction with, or in wells pressurized by, natural gas, the oil industry is distinct from the related natural gas industry. Since its origins in the mid-19th century, the oil industry has developed an industrial structure that emphasizes scale and scope to maximize profits. Profits can be huge, which attracts entrepreneurial efforts on individual, corporate, and national scales. By the late 20th through early 21st century, the oil industry had begun confronting questions about long-term viability, combined with an increasingly influential environmental movement that seeks to reduce fossil fuel consumption and prevent its toxic waste and by-products from polluting human, animal, and natural habitats.
The creation and evolution of urban parks is in some ways a familiar story, especially given the attention that Frederick Law Olmsted’s work has commanded since the early 1970s. Following the success of Central Park, cities across the United States began building parks to meet the recreational needs of residents, and during the second half of the 19th century, Olmsted and his partners designed major parks or park systems in thirty cities. Yet, even that story is incomplete. To be sure, Olmsted believed that every city should have a large rural park as an alternative to the density of building and crowding of the modern metropolis, a place to provide for an “unbending of the faculties,” a process of recuperation from the stresses and strains of urban life. But, even in the mid-1860s he sought to create alternative spaces for other types of recreation. Olmsted and his partner Calvert Vaux successfully persuaded the Prospect Park commission, in Brooklyn, New York, to acquire land for a parade ground south of the park as a place for military musters and athletics; moreover, in 1868 they prepared a plan for a park system in Buffalo, New York, that consisted of three parks, linked by parkways, that served different functions and provided for different forms of recreation. As the decades progressed, Olmsted became a champion of parks designed for active recreation; gymnasiums for women as well as men, especially in working-class areas of cities; and playgrounds for small children. He did so in part to relieve pressure on the large landscape parks to accommodate uses he believed would be inappropriate, but also because he recognized the legitimate demands for new forms of recreation. In later years, other park designers and administrators would similarly add facilities for active recreation, though sometimes in ways that compromised what Olmsted considered the primary purpose of a public park. Urban parks are, in important ways, a microcosm of the nation’s cities. 
Battles over location, financing, political patronage, and use have been a constant. Through it all, parks have evolved to meet the changing recreational needs of residents. And, as dominant a figure as Olmsted has been, this is a story that antedates his professional career and that includes the many voices that have shaped public parks in U.S. cities in the 20th century.
Anna Rose Alexander
Fires have plagued American cities for centuries. During the 18th century, the Great Fire of Boston (1760), the First Great Fire of New York City (1776), the First Great New Orleans Fire (1788), and the Great Fire of Savannah (1796) each destroyed hundreds of buildings and challenged municipal authorities to improve safety in an increasingly risky environment. Beginning in the 19th century, with increasing commerce, rapid urbanization, and the rise of industrial capitalism, fires became more frequent and destructive. Several initiatives sought to reduce the risk of fire: volunteer fire companies emerged in all major cities, fire insurance developed to help economic recovery, and municipal infrastructure like fire hydrants became ubiquitous to combat blazes. Despite significant efforts to curb this growing urban problem, fire dangers increased in the late 19th century as cities became epicenters of industry and their populations boomed. The “great” fires of the late 19th and early 20th centuries, like those that took place in Chicago (1871), Boston (1872), Seattle (1889), Baltimore (1904), and San Francisco (1906), fundamentally altered cities. The fires not only destroyed buildings and took lives, but they also unearthed deep-rooted social tensions. Rebuilding in the aftermath of fire further exacerbated inequalities and divided cities. While fire loss tapered off after 1920, other issues surrounding urban fires heated up. The funneling of resources to suburbs in the post-war white-flight period left inner cities ill-equipped to handle serious conflagrations. In the last few decades, suburban sprawl has created exurban fire regimes, where wildfires collide with cities. Extreme weather events, dependence on fossil fuels, deregulation of risky industries, and a lack of safe and affordable housing have put American metropolitan areas on a path to experience another period of “great” fires like those of the late 19th and early 20th centuries.
Adam M. Sowards
For more than a century after the republic’s founding in the 1780s, American law reflected the ideal that the commons—the public domain—should be turned into private property. As Americans became concerned about resource scarcity, waste, and monopolies at the end of the 19th century, reform-minded bureaucrats and scientists convinced Congress to maintain in perpetuity some of the nation’s land as public. This shift offered a measure of protection and an alternative to private property regimes. The federal agencies that primarily manage these lands today—U.S. Forest Service (USFS), National Park Service (NPS), U.S. Fish and Wildlife Service (USFWS), and Bureau of Land Management (BLM)—have worked since their origins in the early decades of the 20th century to fulfill their diverse, competing, evolving missions. Meanwhile, the public and Congress have continually demanded new and different goals as the land itself has functioned and responded in interdependent ways. In the mid-20th century, the agencies intensified their management, hoping they could satisfy the rising—and often conflicting—demands American citizens placed on the public lands. This intensification often degraded the ecology of the public lands and increased political conflict, resulting in a series of new laws in the 1960s and 1970s. Those laws strengthened the role of science and the public in influencing agency practices while providing more opportunities for litigation. Predictably, since the late 1970s, these developments have polarized public lands’ politics. The economies, but also the identities, of many Americans remain entwined with the public lands, making political standoffs—over endangered species, oil production, privatizing land, and more—common and increasingly intractable.
Because the public lands are national in scope but used by local people for all manner of economic and recreational activities, they have been and remain microcosms of the federal democratic system and all its conflicted nature.
From the founding of the American republic through the 19th century, the nation’s environmental policy mostly centered on promoting American settlers’ conquest of the frontier. Early federal interventions, whether railroad and canal subsidies or land grant acts, led to rapid transformations of the natural environment that inspired a conservation movement by the end of the 19th century. Led by activists and policymakers, this movement sought to protect America’s resources now jeopardized by expansive industrial infrastructure. During the Gilded Age, the federal government established the world’s first national parks, and in the Progressive Era, politicians such as President Theodore Roosevelt called for the federal government to play a central role in ensuring the efficient utilization of the nation’s ecological bounty. By the early 1900s, conservationists established new government agencies, such as the U.S. Forest Service and the Bureau of Reclamation, to regulate the consumption of trees, water, and other valuable natural assets. Wise-use was the watchword of the day, with environmental managers in DC’s bureaucracy focused mainly on protecting the economic value latent in America’s ecosystems. However, other groups, such as the Wilderness Society, proved successful at redirecting policy prescriptions toward preserving beautiful and wild spaces, not just conserving resources central to capitalist enterprise. In the 1960s and 1970s, suburban and urban environmental activists attracted federal regulators’ attention to contaminated soil and water under their feet. The era of ecology had arrived, and the federal government now had broad powers through the Environmental Protection Agency (EPA) to manage ecosystems that stretched across the continent. But from the 1980s to the 2010s, the federal government’s authority to regulate the environment waxed and waned as economic crises, often exacerbated by oil shortages, brought environmental agencies under fire. 
The Rooseveltian logic of the Progressive Era, which said that America’s economic growth depended on federal oversight of the environment, came under assault from neoliberal disciples of Ronald Reagan, who argued that environmental regulations were in fact the root cause of economic stagnation in America, not a powerful prescription against it. What the country needed, according to the reformers of the New Right, was unregulated expansion into new frontiers. By the 2010s, the contours of these new frontiers were clear: deep-water oil drilling, Bakken shale exploration, and tar-sand excavation in Alberta, Canada. In many ways, the frontier conquest doctrine of colonial Americans found new life in deregulatory U.S. environmental policy pitched by conservatives in the wake of the Reagan Revolution. Never wholly dominant, this ethos carried on into the era of Donald Trump’s presidency.
Timothy James LeCain
Technology and environmental history are both relatively young disciplines among Americanists, and during their early years they developed as distinctly different and even antithetical fields, at least in topical terms. Historians of technology initially focused on human-made and presumably “unnatural” technologies, whereas environmental historians focused on nonhuman and presumably “natural” environments. However, in more recent decades, both disciplines have moved beyond this oppositional framing. Historians of technology increasingly came to view anthropogenic artifacts such as cities, domesticated animals, and machines as extensions of the natural world rather than its antithesis. Even the British and American Industrial Revolutions constituted not a distancing of humans from nature, as some scholars have suggested, but rather a deepening entanglement with the material environment. At the same time, many environmental historians were moving beyond the field’s initial emphasis on the ideal of an American and often Western “wilderness” to embrace a concept of the environment as including humans and productive work. Nonetheless, many environmental historians continued to emphasize the independent agency of the nonhuman environment of organisms and things. This insistence that not everything could be reduced to human culture remained the field’s most distinctive feature. Since the turn of the millennium, the two fields have increasingly come together in a variety of synthetic approaches, including Actor Network Theory, envirotechnical analysis, and neomaterialist theory. As the influence of the cultural turn has waned, the environmental historians’ emphasis on the independent agency of the nonhuman has come to the fore, gaining wider influence as it is applied to the dynamic “nature” or “wildness” that some scholars argue exists within both the technological and natural environment.
The foundational distinctions between the history of technology and environmental history may now be giving way to more materially rooted attempts to understand how a dynamic hybrid environment helps to create human history in all of its dimensions—cultural, social, and biological.
Brian J. McCammack
Urban areas have been the main source of pollution for centuries. The United States is no exception to this more general rule. Pollution of air, water, and soil only multiplied as cities grew in size and complexity; people generated ever more domestic waste and industry continually generated new unwanted byproducts. Periods of pollution intensification—most notably those spurts that came with late 19th-century urban industrialization and the rapid technological innovation and consumer culture of the post-World War II era—spurred social movements and scientific research on the problem, mostly as it pertained to adverse impacts on human health. Technological innovations aimed to eliminate unwanted wastes, and more stringent regulations followed. Those technological and political solutions largely failed to keep pace with the increasing volume and diversity of pollutants industrial capitalism introduced into the environment, however, and rarely stopped pollution at its root cause. Instead, they often merely moved pollutants from one “sink”—a repository of pollution—to another (from water to land, for instance) and/or from one place to another (to a city downstream, for instance, or from one urban neighborhood to another). This “end of pipe” approach remained overwhelmingly predominant even as most pollution mitigation policies became nationalized in the 1970s. Prior to that, municipalities and states were primarily responsible for addressing air, water, and land pollution. During this post-World War II period, policy—driven by ecological science—began to exhibit an understanding of urban pollution’s detrimental effects beyond human health. More broadly, evolving scientific understanding of human health and ecosystemic impacts of pollution, new technology, and changing social relations within growing metropolitan areas shifted the public perception of pollution’s harmful impacts.
Scientific understanding of how urban and suburban residents risked ill health when exposed to polluted water, air, and soil grew, as did the social understanding of who was most vulnerable to these hazards. From the nation’s founding, the cumulative impact of both urban exposure to pollutants and attempts to curb that exposure has been unequal along lines of race and ethnicity, class, and gender. Despite those consistent inequalities, the 21st-century American city looks little like the 18th-century American city, whether in terms of population size, geographical footprint, demographics, economic activity, or the policies that governed them: all of these factors influenced the very definitions of ideas such as pollution and the urban.
Megan Kate Nelson
During the American Civil War, Union and Confederate commanders made the capture and destruction of enemy cities a central feature of their military campaigns. They did so for two reasons. First, most mid-19th-century cities had factories, foundries, and warehouses within their borders, churning out and storing war materiel; military officials believed that if they interrupted or incapacitated the enemy’s ability to arm or clothe themselves, the war would end. Second, it was believed that the widespread destruction of property—especially in major or capital cities—would also damage civilians’ morale, undermining their political convictions and decreasing their support for the war effort. Both Union and Confederate armies bombarded and burned cities with these goals in mind. Sometimes they fought battles on city streets, but more often Union troops initiated long-term sieges in order to capture Confederate cities and demoralize their inhabitants. Soldiers on both sides were motivated by vengeance when they set fire to city businesses and homes; these acts were controversial, as was defensive burning—the deliberate destruction of one’s own urban center in order to keep its war materiel out of the hands of the enemy. Urban destruction, particularly long-term sieges, took a psychological toll on (mostly southern) city residents. Many were wounded, lost property, or were forced to become refugees. Because of this, the destruction of cities during the American Civil War provoked widespread discussions about the nature of “civilized warfare” and the role that civilians played in military strategy. Both soldiers and civilians tried to make sense of the destruction of cities in writing, and also in illustrations and photographs; images in particular shaped both northern and southern memories of the war and its costs.
“Working-Class Environmentalism in America” traces working Americans’ efforts to protect the environment from antebellum times to the present. Antebellum topics include African American slaves’ environmental ethos; aesthetic nature appreciation by Lowell, Massachusetts, “mill girls” working in New England’s first textile factories; and Boston’s 1840s fight for safe drinking water. Late-19th-century topics include working-class support for creating urban parks, workers’ early efforts to confront urban pollution and the “smoke nuisance,” and the exploration of conservationist ideas and policies by New England small farmers and fishermen. In the early 20th century, working-class youth, including immigrants and African Americans, participated in the youth camping movement and the Boy Scouts and Girl Scouts of America, while working-class adults and their families, enjoying new automobility and two-day weekends, discovered picnicking, car-camping, and sport hunting and fishing in newly created wilderness preserves. Workers also learned of toxic dangers to workplace safety and health from shocking stories of 1920s New Jersey “radium girls” and tetraethyl lead factory workers, and from 1930s Midwestern miners who went on strike over deadly silicosis. The 1930s United States rediscovered natural resource conservation when the Civilian Conservation Corps (CCC) employed millions of working-class youth. Lumber workers advocated federal regulation of timber harvesting. Postwar America saw the United Auto Workers (UAW), United Steelworkers (USWA), Oil Chemical and Atomic Workers (OCAW), American Federation of Labor and Congress of Industrial Organizations (AFL-CIO), and other labor unions lobbying for wilderness and wildlife preservation, defending workplace and community health, and fighting air and water pollution, while the United Farmworkers (UFW) fought reckless pesticide use, and dissidents within the United Mine Workers (UMW) sought to ban surface coal mining.
Radical organizations explored minority community environmentalism and interracial cooperation on environmental reform. Following post-1970s nationwide conservative retrenchment, working-class activists and communities of color fought toxic wastes, explored environmental justice and environmental racism at places like Love Canal, New York, and Warren County, North Carolina, and formed the Blue-Green Alliance with environmentalists.