
Article

Agriculture and Rural Life in the South, 1900–1945  

William Thomas Okie

The period from 1900 to 1945 was characterized by both surprising continuity and dramatic change in southern agriculture. Unlike the rest of the nation, which urbanized and industrialized at a rapid pace in the late nineteenth century, the South remained overwhelmingly rural and poor from the 1880s through the 1930s. But by 1945, the region was beginning to urbanize and industrialize into a recognizably modern South, with a population concentrated in urban centers, industries taking hold, and agriculture following the larger-scale, mechanized trend common in other farming regions of the country. Three overlapping factors explain this long lag followed by rapid transformation. First, the cumulative effects of two centuries of land-extensive, staple crop agriculture and white supremacy had sapped the region of much of its fertility and limited its options for prosperity. Second, in response to this “problem South,” generations of reformers sought to modernize the region, along with other rural areas around the world. These piecemeal efforts became the foundation for the South’s dramatic transformation under the federal program known as the New Deal. Third, poor rural southerners, both black and white, left the countryside in increasing numbers. Coupled with the labor demands created by two major military conflicts, World War I and World War II, this movement aided and abetted the mechanization of agriculture and the depopulation of the rural South.

Article

Chemical and Biological Weapons Policy  

Thomas I. Faith

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions in the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Environmental and Conservation Movements in Metropolitan America  

Robert R. Gioielli

By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, and streams ran fetid with raw sewage. Disease outbreaks were common, and parks and green space were rare. In response to these hazards, groups of urban residents organized a series of activist movements, from the 1890s until the end of the 20th century, to reform public and private policies and practices. Those environmental burdens were never felt equally, with the working class, poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II, for many the solution to the challenges of a dense modern city was a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution-free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites, but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated problems with destructive urban renewal and highway construction projects. These remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.

Article

Food and Agriculture in the 20th and 21st Centuries  

Gabriella M. Petrick

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. Please check back later for the full article. American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles. The application of science and technology to food production, from the field to the kitchen cabinet, or even more crucially the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods. Reflecting what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical weight, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.

Article

Indigenous Peoples and the Environment since 1890  

Marsha Weisiger

By the late 19th century, the Indigenous peoples of what became the United States, in an effort to avoid utter genocide, had ceded or otherwise lost their land and control of their natural resources, often through treaties with the United States. Ironically, those treaties, while frequently abrogated by federal fiat, made possible a resurgence of Native nationhood beginning in the 1960s, along with the restoration of Indigenous reserved treaty rights to hunt and fish in their homelands and manage their natural resources. The history of Indigenous peoples and their environments, however, is not a single narrative but a constellation of stories that converge and diverge. This article is, thus, not comprehensive but focuses on major trends and commonalities from the mid- to late 19th century through the early 21st century, with examples drawn from the environmental histories of a fraction of the more than 575 Indigenous groups, including Alaska Natives and Native Hawaiians. Topics include dispossession and displacement; the promise of the Indian New Deal; the trauma of the Termination Era; the reemergence of Indigenous sovereignty based on treaty rights; the management of forests, minerals, and water; and the rise of Indigenous leadership in the environmental justice movement. For the period before the establishment of reservations for Indigenous people, see “Indigenous Peoples and the Environment to 1890.”

Article

The National Parks  

Donald Worster

The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist and Yosemite enthusiast and himself a proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.