American cities developed under relatively quiescent climatic conditions. A gradual rise in average global temperatures during the 19th and 20th centuries had a negligible impact on how urban Americans experienced the weather. Much more significant were the dramatic changes in urban form and social organization that mediated the relationship between routine weather fluctuations and the lives of city dwellers. Overcoming weather-related impediments to profit, comfort, and good health contributed to many aspects of urbanization, including population migration to Sunbelt locations, increased reliance on fossil fuels, and comprehensive re-engineering of urban hydrological systems. Other structural shifts such as sprawling development, intensification of the built environment, socioeconomic segregation, and the tight coupling of infrastructural networks were less directly responsive to weather conditions but nonetheless profoundly affected the magnitude and social distribution of weather-related risks. Although fatalities resulting from extreme meteorological events declined in the 20th century, the scale of urban disruption and property damage increased. In addition, social impacts became more concentrated among poorer Americans, including many people of color, as Hurricane Katrina tragically demonstrated in 2005. Through the 20th century, cities responded to weather hazards through improved forecasting and systematic planning for relief and recovery rather than alterations in metropolitan design. In recent decades, however, growing awareness and concern about climate change impacts have made volatile weather more central to urban planning.
Robert R. Gioielli
By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, and streams ran fetid with raw sewage. Disease outbreaks were common, while parks and green space were rare. In response to these hazards, particular groups of urban residents organized a series of activist movements to reform public and private policies and practices, from the 1890s until the end of the 20th century. Those environmental burdens were never felt equally, with the working class, poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II, for many the solution to the challenges of a dense modern city was a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution-free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites, but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated problems with destructive urban renewal and highway construction projects.
These remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.
Joel A. Tarr
Urban water supply and sewage disposal facilities are critical parts of the urban infrastructure. They have enabled cities and their metropolitan areas to function as centers of commerce, industry, entertainment, and human habitation. The evolution of water supply and sewage disposal systems in American cities from 1800 to 2015 is examined, with a focus on major turning points especially in regard to technological decisions, public policy, and environmental and public health issues.
The creation and evolution of urban parks is in some ways a familiar story, especially given the attention that Frederick Law Olmsted’s work has commanded since the early 1970s. Following the success of Central Park, cities across the United States began building parks to meet the recreational needs of residents, and during the second half of the 19th century, Olmsted and his partners designed major parks or park systems in thirty cities. Yet, even that story is incomplete. To be sure, Olmsted believed that every city should have a large rural park as an alternative to the density of building and crowding of the modern metropolis, a place to provide for an “unbending of the faculties,” a process of recuperation from the stresses and strains of urban life. But, even in the mid-1860s he sought to create alternative spaces for other types of recreation. Olmsted and his partner Calvert Vaux successfully persuaded the Prospect Park commission, in Brooklyn, New York, to acquire land for a parade ground south of the park as a place for military musters and athletics; moreover, in 1868 they prepared a plan for a park system in Buffalo, New York, that consisted of three parks, linked by parkways, that served different functions and provided for different forms of recreation. As the decades progressed, Olmsted became a champion of parks designed for active recreation; gymnasiums for women as well as men, especially in working-class areas of cities; and playgrounds for small children. He did so in part to relieve pressure on the large landscape parks to accommodate uses he believed would be inappropriate, but also because he recognized the legitimate demands for new forms of recreation. In later years, other park designers and administrators would similarly add facilities for active recreation, though sometimes in ways that compromised what Olmsted considered the primary purpose of a public park. Urban parks are, in important ways, a microcosm of the nation’s cities. 
Battles over location, financing, political patronage, and use have been a constant. Through it all, parks have evolved to meet the changing recreational needs of residents. And, as dominant a figure as Olmsted has been, this is a story that antedates his professional career and that includes the many voices that have shaped public parks in U.S. cities in the 20th century.
Anna Rose Alexander
Fires have plagued American cities for centuries. During the 18th century, the Great Fire of Boston (1760), the First Great Fire of New York City (1776), the First Great New Orleans Fire (1788), and the Great Fire of Savannah (1796) each destroyed hundreds of buildings and challenged municipal authorities to improve safety in an increasingly risky environment. Beginning in the 19th century, with increasing commerce, rapid urbanization, and the rise of industrial capitalism, fires became more frequent and destructive. Several initiatives sought to reduce the risk of fire: volunteer fire companies emerged in all major cities, fire insurance developed to help economic recovery, and municipal infrastructure like fire hydrants became ubiquitous to combat blazes. Despite significant efforts to curb this growing urban problem, fire dangers increased in the late 19th century as cities became epicenters of industry and their populations boomed. The “great” fires of the late 19th century, like those that took place in Chicago (1871), Boston (1872), Seattle (1889), Baltimore (1904), and San Francisco (1906), fundamentally altered cities. The fires not only destroyed buildings and took lives, but they also unearthed deep-rooted social tensions. Rebuilding in the aftermath of fire further exacerbated inequalities and divided cities. While fire loss tapered off after 1920, other issues surrounding urban fires heated up. The funneling of resources to suburbs in the post-war white-flight period left inner cities ill-equipped to handle serious conflagrations. In the last few decades, suburban sprawl has created exurban fire regimes, where wildfires collide with cities. Extreme weather events, dependence on fossil fuels, deregulation of risky industries, and a lack of safe and affordable housing have put American metropolitan areas on a path to experience another period of “great” fires like those of the late 19th and 20th centuries.
Brian J. McCammack
Urban areas have been the main source of pollution for centuries. The United States is no exception to this more general rule. Pollution of air, water, and soil only multiplied as cities grew in size and complexity; people generated ever more domestic waste and industry continually generated new unwanted byproducts. Periods of pollution intensification—most notably those spurts that came with late 19th-century urban industrialization and the rapid technological innovation and consumer culture of the post-World War II era—spurred social movements and scientific research on the problem, mostly as it pertained to adverse impacts on human health. Technological innovations aimed to eliminate unwanted wastes, and more stringent regulations followed. Those technological and political solutions largely failed to keep pace with the increasing volume and diversity of pollutants industrial capitalism introduced into the environment, however, and rarely stopped pollution at its root cause. Instead, they often merely moved pollutants from one “sink”—a repository of pollution—to another (from water to land, for instance) and/or from one place to another (to a city downstream, for instance, or from one urban neighborhood to another). This “end-of-pipe” approach remained overwhelmingly predominant even as most pollution mitigation policies became nationalized in the 1970s. Prior to that, municipalities and states were primarily responsible for addressing air, water, and land pollution. During this post-World War II period, policy—driven by ecological science—began to exhibit an understanding of urban pollution’s detrimental effects beyond human health. More broadly, evolving scientific understanding of human health and ecosystemic impacts of pollution, new technology, and changing social relations within growing metropolitan areas shifted the public perception of pollution’s harmful impacts.
Scientific understanding of how urban and suburban residents risked ill health when exposed to polluted water, air, and soil grew, as did the social understanding of who was most vulnerable to these hazards. From the nation’s founding, the cumulative impact of both urban exposure to pollutants and attempts to curb that exposure has been unequal along lines of race and ethnicity, class, and gender. Despite those consistent inequalities, the 21st-century American city looks little like the 18th-century American city, whether in terms of population size, geographical footprint, demographics, economic activity, or the policies that governed them: all of these factors influenced the very definitions of ideas such as pollution and the urban.
Megan Kate Nelson
During the American Civil War, Union and Confederate commanders made the capture and destruction of enemy cities a central feature of their military campaigns. They did so for two reasons. First, most mid-19th-century cities had factories, foundries, and warehouses within their borders, churning out and storing war materiel; military officials believed that if they interrupted or incapacitated the enemy’s ability to arm or clothe themselves, the war would end. Second, it was believed that the widespread destruction of property—especially in major or capital cities—would also damage civilians’ morale, undermining their political convictions and decreasing their support for the war effort. Both Union and Confederate armies bombarded and burned cities with these goals in mind. Sometimes they fought battles on city streets, but more often Union troops initiated long-term sieges in order to capture Confederate cities and demoralize their inhabitants. Soldiers on both sides were motivated by vengeance when they set fire to city businesses and homes; these acts were controversial, as was defensive burning—the deliberate destruction of one’s own urban center in order to keep its war materiel out of the hands of the enemy. Urban destruction, particularly long-term sieges, took a psychological toll on (mostly southern) city residents. Many were wounded, lost property, or were forced to become refugees. Because of this, the destruction of cities during the American Civil War provoked widespread discussions about the nature of “civilized warfare” and the role that civilians played in military strategy. Both soldiers and civilians tried to make sense of the destruction of cities in writing, and also in illustrations and photographs; images in particular shaped both northern and southern memories of the war and its costs.