1–20 of 35 Results for: Environmental History

Article

Agriculture and Rural Life in the South, 1900–1945  

William Thomas Okie

The period from 1900 to 1945 was characterized by both surprising continuity and dramatic change in southern agriculture. Unlike the rest of the nation, which urbanized and industrialized at a rapid pace in the late nineteenth century, the South remained overwhelmingly rural and poor from the 1880s through the 1930s. But by 1945, the region was beginning to urbanize and industrialize into a recognizably modern South, with a population concentrated in urban centers, industries taking hold, and agriculture following the larger-scale, mechanized trend common in other farming regions of the country. Three overlapping factors explain this long lag followed by rapid transformation. First, the cumulative effects of two centuries of land-extensive, staple-crop agriculture and white supremacy had sapped the region of much of its fertility and limited its options for prosperity. Second, in response to this “problem South,” generations of reformers sought to modernize the South, along with other rural areas around the world. These piecemeal efforts became the foundation for the South’s dramatic transformation under the federal policies known as the New Deal. Third, poor rural southerners, both black and white, left the countryside in increasing numbers. Coupled with the labor demands created by two major military conflicts, World War I and World War II, this movement aided and abetted the mechanization of agriculture and the depopulation of the rural South.

Article

Agriculture, Food, and the Environment  

Kathleen A. Brosnan and Jacob Blackwell

Throughout history, food needs have bonded humans to nature. The transition to agriculture constituted a slow but revolutionary ecological transformation. After 1500 ce, agricultural goods, as well as the pests that undermined them, dominated the exchange of species among four continents. In the United States, increasingly commercial efforts simplified ecosystems. Improved technologies and market mechanisms facilitated surpluses in the 19th century that fueled industrialization and urbanization. In the 20th century, industrial agriculture involved expensive machinery and chemical pesticides and fertilizers in pursuit of higher outputs and profits, while consumers’ relations with their food sources and nature became attenuated.

Article

The American Antinuclear Movement  

Paul Rubinson

Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.

Article

American Environmental Diplomacy  

Kurk Dorsey

From its inception as a nation in 1789, the United States has engaged in an environmental diplomacy that has included attempts to gain control of resources, as well as formal diplomatic efforts to regulate the use of resources shared with other nations and peoples. American environmental diplomacy has sought to gain control of natural resources, to conserve those resources for the future, and to protect environmental amenities from destruction. As an acquirer of natural resources, the United States has focused on arable land as well as on ocean fisheries, although around 1900, the focus on ocean fisheries turned into a desire to conserve marine resources from unregulated harvesting. The main 20th-century U.S. goal was to extend beyond its borders its Progressive-era desire to utilize resources efficiently, meaning the greatest good for the greatest number for the longest time. For most of the 20th century, the United States was the leader in promoting global environmental protection through the best science, especially emphasizing wildlife. Near the end of the century, U.S. government science policy was increasingly out of step with global environmental thinking, and the United States often found itself on the outside. Most notably, the attempts to address climate change moved ahead with almost every country in the world except the United States. While a few monographs focus squarely on environmental diplomacy, it is safe to say that historians have not come close to tapping the potential of the intersection of the environmental and diplomatic history of the United States.

Article

American Environmental Policy Since 1964  

Richard N. L. Andrews

Between 1964 and 2017, the United States adopted the concept of environmental policy as a new focus for a broad range of previously disparate policy issues affecting human interactions with the natural environment. These policies ranged from environmental health, pollution, and toxic exposure to management of ecosystems, resources, and use of the public lands, environmental aspects of urbanization, agricultural practices, and energy use, and negotiation of international agreements to address global environmental problems. In doing so, it nationalized many responsibilities that had previously been considered primarily state or local matters. It changed the United States’ approach to federalism by authorizing new powers for the federal government to set national minimum environmental standards and regulatory frameworks with the states mandated to participate in their implementation and compliance. Finally, it explicitly formalized administrative procedures for federal environmental decision-making with stricter requirements for scientific and economic justification rather than merely administrative discretion. In addition, it greatly increased public access to information and opportunities for input, as well as for judicial review, thus allowing citizen advocates for environmental protection and appreciative uses equal legitimacy with commodity producers to voice their preferences for use of public environmental resources. These policies initially reflected widespread public demand and broad bipartisan support. Over several decades, however, they became flashpoints, first, between business interests and environmental advocacy groups and, subsequently, between increasingly ideological and partisan agendas concerning the role of the federal government. Beginning in the 1980s, the long-standing Progressive ideal of the “public interest” was increasingly supplanted by a narrative of “government overreach,” and the 1990s witnessed campaigns to delegitimize the underlying evidence justifying environmental policies by labeling it “junk science” or a “hoax.” From the 1980s forward, the stated priorities of environmental policy vacillated repeatedly between presidential administrations and Congresses supporting continuation and expansion of environmental protection and preservation policies versus those seeking to weaken or even reverse protections in favor of private-property rights and more damaging uses of resources. Yet despite these apparent shifts, the basic environmental laws and policies enacted during the 1970s remained largely in place: political gridlock, in effect, maintained the status quo, with the addition of a very few innovations such as “cap and trade” policies. One reason was that environmental policies retained considerable latent public support: in electoral campaigns, they were often overshadowed by economic and other issues, but they still aroused widespread support in their defense when threatened. Another reason was that decisions by the courts also continued to reaffirm many existing policies and to reject attempts to dismantle them. With the election of Donald Trump in 2016, along with conservative majorities in both houses of Congress, US environmental policy came under the most hostile and wide-ranging attack since its origins. More than almost any other issue, the incoming president targeted environmental policy for rhetorical attacks and budget cuts, and sought to eradicate the executive policies of his predecessor, weaken or rescind protective regulations, and undermine the regulatory and even the scientific capacity of the federal environmental agencies. In the early 21st century, it is as yet unclear how much of his agenda will actually be accomplished, or whether, as in past attempts, much of it will ultimately be blocked by Congress, the courts, public backlash, and business and state government interests seeking stable policy expectations rather than disruptive deregulation.

Article

Appalachian War on Poverty and the Working Class  

Jessica Wilkerson

In 1964, President Lyndon B. Johnson announced an unconditional “war on poverty.” On one of his first publicity tours promoting his antipoverty legislation, he traveled to cities and towns in Appalachia, which would become crucial areas for promoting and implementing the legislation. Johnson soon signed the Economic Opportunity Act, a piece of legislation that provided a structure for communities to institute antipoverty programs, from vocational services to early childhood education programs, and encouraged the creation of new initiatives. In 1965, Johnson signed the Appalachian Regional Development Act, making Appalachia the only region targeted by federal antipoverty legislation, through the creation of the Appalachian Regional Commission. The Appalachian War on Poverty can be described as a set of policies created by governmental agencies, but also crucial to it was a series of community movements and campaigns, led by working-class people, that responded to antipoverty policies. When the War on Poverty began, the language of policymakers suggested that people living below the poverty line would be served by the programs. But as the antipoverty programs expanded and more local people became involved, they spoke openly and in political terms about poverty as a working-class issue. They drew attention to the politics of class in the region, where elites and absentee landowners became wealthy on the backs of working people. They demanded meaningful participation in shaping the War on Poverty in their communities, and, increasingly, when they used the term “poor people,” they did so as a collective class identity—working people who were poor due to a rigged economy. While many public officials focused on economic development policies, men and women living in the region began organizing around issues ranging from surface mining to labor rights and responding to poor living and working conditions. Taking advantage of federal antipoverty resources and the spirit of change that animated the 1960s, working-class Appalachians would help to shape the antipoverty programs at the local and regional level, creating a movement in the process. They did so as they organized around issues—including the environment, occupational safety, health, and welfare rights—and as they used antipoverty programs as a platform to address the systemic inequalities that plagued many of their communities.

Article

Cesar Chavez and the United Farm Workers Movement  

Matt Garcia

In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.

Article

Chemical and Biological Weapons Policy  

Thomas I. Faith

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons are different in terms of their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions on the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Civilian Nuclear Power  

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Promising in the 1950s electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear has stalled in the last forty years. Highly capital intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make its case on cost persuasively. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

Climate and Climate Change in Early America  

Matthew Mulcahy

European colonization of eastern North America and the Caribbean occurred against the backdrop of the Little Ice Age (LIA), a period between roughly 1300 and 1850 ce that witnessed generally colder conditions than in earlier and later centuries. Alone or in combination, shorter growing seasons associated with colder temperatures and periods of intense drought influenced Indigenous societies prior to the arrival of Europeans, interactions and conflicts between Europeans and Native Americans, and the development of colonial societies across the 16th, 17th, and 18th centuries. Farther south in the Caribbean region, climatic threats such as hurricanes and droughts created distinct challenges to colonists as they sought to establish large-scale plantations worked by enslaved Africans. Such threats forced Europeans to alter their expectations and perceptions of the climate in North America and the Caribbean. Following the principle that locations at the same latitude would have the same climate, Europeans had anticipated that places like Virginia would have a climate similar to Spain’s, but that was not the case. As they adjusted to new American climate realities, colonists remained confident they could change the climate for the better. Far from a threat, human-induced climate change seemed to many colonists a desirable goal, one that marked the degree to which they might improve and civilize the “wilderness” of the New World. However, colonists also became aware of some negative consequences associated with their activities.

Article

Climate Change and the American City  

Andrew Hurley

American cities developed under relatively quiescent climatic conditions. A gradual rise in average global temperatures during the 19th and 20th centuries had a negligible impact on how urban Americans experienced the weather. Much more significant were the dramatic changes in urban form and social organization that mediated the relationship between routine weather fluctuations and the lives of city dwellers. Overcoming weather-related impediments to profit, comfort, and good health contributed to many aspects of urbanization, including population migration to Sunbelt locations, increased reliance on fossil fuels, and comprehensive re-engineering of urban hydrological systems. Other structural shifts such as sprawling development, intensification of the built environment, socioeconomic segregation, and the tight coupling of infrastructural networks were less directly responsive to weather conditions but nonetheless profoundly affected the magnitude and social distribution of weather-related risks. Although fatalities resulting from extreme meteorological events declined in the 20th century, the scale of urban disruption and property damage increased. In addition, social impacts became more concentrated among poorer Americans, including many people of color, as Hurricane Katrina tragically demonstrated in 2005. Through the 20th century, cities responded to weather hazards with improved forecasting and systematic planning for relief and recovery rather than alterations in metropolitan design. In recent decades, however, growing awareness and concern about climate change impacts have made volatile weather more central to urban planning.

Article

DDT and Pesticides  

Frederick Rowe Davis

The history of DDT and pesticides in America is overshadowed by four broad myths. The first myth suggests that DDT was the first insecticide deployed widely by American farmers. The second indicates that DDT was the most toxic pesticide to wildlife and humans alike. The third myth assumes that Rachel Carson’s Silent Spring (1962) was an exposé of the problems of DDT rather than a broad indictment of American dependency on chemical insecticides. The fourth and final myth reassures Americans that the ban on DDT late in 1972 resolved the pesticide paradox in America. Over the course of the 20th century, agricultural chemists developed insecticides from plants with phytotoxic properties (“botanical” insecticides) and from a range of chemicals, including heavy metals such as lead and arsenic, chlorinated hydrocarbons like DDT, and organophosphates like parathion. All of the synthetic insecticides carried profound unintended consequences for landscapes and wildlife alike. More recently, chemists have returned to nature and developed chemical analogs of the botanical insecticides, first with the synthetic pyrethroids and now with the neonicotinoids. Despite their recent introduction, neonicotinoids have become widely used in agriculture, and there are suspicions that these chemicals contribute to declines in bees and grassland birds.

Article

El Paso  

Alberto Wilson

El Paso, Texas, sits on the northern bank of the Rio Grande along the international boundary between Mexico and the United States, at the meeting point of the states of Texas, New Mexico, and Chihuahua. Its location makes El Paso a major urban center in the US Southwest and a key border city; together with Ciudad Juárez, Chihuahua, it forms the largest border metroplex in the Western Hemisphere. On lands formerly held by the Mansos and Sumas, the collision between Spanish imperial design and Native stewardship began in the mid-17th century, as civil and religious authorities from New Mexico established a southern settlement along the river to provide a place of rest and security for the trade and travel making its way from the mineral-rich regions of New Spain to the far-flung colony. Initial settlement occurred on the southern bank of the river, in what is now Ciudad Juárez, due to seasonal flooding, which also provided a natural barrier against Apache raids. El Paso remained a crossroads into the national period of the 19th century as the settlements began to experience the expansion of state power and market relations in North America. The competing national designs of Mexico and the United States collided in war from 1846 to 1848, resulting in the redrawing of national borders that turned El Paso and Ciudad Juárez into border cities. In the 20th century, industrial capitalism, migration, and state power linked these peripheral cities to national and international markets, and El Paso–Ciudad Juárez became the largest binational, bicultural community along the US–Mexico border. In 2020, the decennial censuses of Mexico and the United States counted a combined 2.5 million residents in the region, with over eight hundred thousand of those residing in El Paso.

Article

Energy in American History  

Aaron Sachs

Energy systems have played a significant role in U.S. history; some scholars claim that they have determined a number of other developments. From the colonial period to the present, Americans have shifted from depending largely on wood and their own bodies, as well as the labor of draft animals; to harnessing water power; to building steam engines; to extracting fossil fuels—first coal and then oil; to distributing electrical power through a grid. Each shift has been accompanied by a number of other striking changes, especially in the modern period associated with fossil fuels. By the late 19th century, in part thanks to new energy systems, Americans were embracing industrialization, urbanization, consumerism, and, in a common contemporary phrase, “the annihilation of space and time.” Today, in the era of climate change, the focus tends to be on the production or supply side of energy systems, but a historical perspective reminds us to consider the consumption or demand side as well. Just as important as the striking of oil in Beaumont, Texas, in 1901, was the development of new assumptions about how much energy people needed to sustain their lives and how much work they could be expected to do. Clearly, Americans are still grappling with the question of whether their society’s heavy investment in coal- and petroleum-based energy systems has been worthwhile.

Article

Environmental and Conservation Movements in Metropolitan America  

Robert R. Gioielli

By the late 19th century, American cities like Chicago and New York were marvels of the industrializing world. The shock urbanization of the previous quarter century, however, brought on a host of environmental problems. Skies were acrid with coal smoke, streams ran fetid with raw sewage, disease outbreaks were common, and parks and green space were rare. From the 1890s until the end of the 20th century, particular groups of urban residents responded to these hazards with a series of activist movements to reform public and private policies and practices. Those environmental burdens were never felt equally, with the working class, poor, immigrants, and minorities bearing an overwhelming share of the city’s toxic load. By the 1930s, many of the Progressive-era reform efforts were finally bearing fruit. Air pollution was regulated, access to clean water improved, and even America’s smallest cities built robust networks of urban parks. But despite this invigoration of the public sphere, after World War II the solution to the challenges of a dense modern city was, for many, a private choice: suburbanization. Rather than continue to work to reform and reimagine the city, they chose to leave it, retreating to the verdant (and pollution-free) greenfields at the city’s edge. These moves, encouraged and subsidized by local and federal policies, provided healthier environments for the mostly white, middle-class suburbanites, but created a new set of environmental problems for the poor, working-class, and minority residents they left behind. Drained of resources and capital, cities struggled to maintain aging infrastructure and regulate remaining industry, and then exacerbated problems with destructive urban renewal and highway construction projects. These remaining urban residents responded with a dynamic series of activist movements that emerged out of the social and community activism of the 1960s and presaged the contemporary environmental justice movement.

Article

Environmental History of New England  

Richard Judd

New England’s first human inhabitants arrived around 12,000 years ago and adopted a nomadic life in response to a rapidly changing postglacial environment. They were followed by Archaic and Woodland cultures, the latter innovating a form of corn-beans-squash cultivation called “three sisters.” European colonists appeared first in small fishing and fur-trading posts and then in larger numbers at Plymouth and Massachusetts Bay. The nascent fur-trading, farming, fishing, and logging economies disrupted regional ecosystems. Colonization weakened Native society through epidemics, ecological disruptions, enslavement, and wars, and yet Indigenous people persevered in family bands and small communities and sustained their identity through extended kinship ties. English husbandry shifted gradually to market production after the American Revolution, which brought further ecological disruptions. The early 19th century saw the rise of equally intrusive fishing and logging practices, which were intensified at century’s end by the introduction of pulp and paper production, marine engines, and new trawling equipment. New England’s Industrial Revolution began in the 1790s in the Blackstone Valley and spread from there into central New England, where more forceful rivers gave rise to gigantic textile mills. The cultural disorientation brought on by industrialization triggered the Romantic movement, epitomized by Transcendentalist discourse on the truths intuited through the contemplation of nature. The Romantic recasting of nature provided intellectual impetus for pioneering fisheries- and forest-conservation efforts. In cities, conservation brought, among other things, landscaped parks such as Boston’s Emerald Necklace. Mirroring its approach to conservation, New England pioneered several forms of environmental activism, including private land trusts, cultural landscape preservation, heritage parks, and environmental justice movements. New England “re-wilded” several of its rivers by removing dams to renew migratory fish runs.

Article

The Environment in the Atomic Age  

Rachel Rothschild

The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.

Article

Epidemics in Indian Country  

David S. Jones

Few developments in human history match the demographic consequences of the arrival of Europeans in the Americas. Between 1500 and 1900 the human populations of the Americas were transformed. Countless American Indians died as Europeans established themselves in the Americas and imported Africans as slaves. Much of the mortality came from epidemics that swept through Indian country. The historical record is full of dramatic stories of smallpox, measles, influenza, and acute contagious diseases striking American Indian communities, causing untold suffering and facilitating European conquest. Some scholars have gone so far as to invoke the irresistible power of natural selection to explain what happened. They argue that the long isolation of Native Americans from other human populations left them uniquely susceptible to the Eurasian pathogens that accompanied European explorers and settlers; nothing could have been done to prevent the inevitable decimation of American Indians. The reality, however, is more complex. Scientists have not found convincing evidence that American Indians had a genetic susceptibility to infectious diseases. Meanwhile, it is clear that the conditions of life before and after colonization could have left Indians vulnerable to a host of diseases. Many American populations had been struggling to subsist, their numbers declining, before Europeans arrived; the chaos, warfare, and demoralization that accompanied colonization made things worse. Seen from this perspective, the devastating mortality was not the result of the forces of evolution and natural selection but rather stemmed from social, economic, and political forces at work during encounter and colonization. Getting the story correct is essential. American Indians in the United States, and indigenous populations worldwide, still suffer dire health inequalities. Although smallpox is gone and many of the old infections are well controlled, new diseases have risen to prominence, especially heart disease, diabetes, cancer, substance abuse, and mental illness. The stories we tell about the history of epidemics in Indian country influence the policies we pursue to alleviate these inequalities today.

Article

Food and Agriculture in the 20th and 21st Centuries  

Gabriella M. Petrick

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles. The application of science and technology to food production, from the field to the kitchen cabinet, or, even more crucially, the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods. Echoing the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.

Article

Forests and Logging in the United States  

Erik Loomis

Humans have put American forests to a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. Post–Civil War forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for growing cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act was the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.