1-7 of 7 Results for:

  • History of Science and Technology
  • Environmental History

Article

Chemical and Biological Weapons Policy  

Thomas I. Faith

Chemical and biological weapons represent two distinct types of munitions that share some common policy implications. While chemical weapons and biological weapons differ in their development, manufacture, use, and the methods necessary to defend against them, they are commonly united in matters of policy as “weapons of mass destruction,” along with nuclear and radiological weapons. Both chemical and biological weapons have the potential to cause mass casualties, require some technical expertise to produce, and can be employed effectively by both nation-states and non-state actors. U.S. policies in the early 20th century were informed by preexisting taboos against poison weapons and the American Expeditionary Forces’ experiences during World War I. The United States promoted restrictions on the use of chemical and biological weapons through World War II, but increased research and development work at the outset of the Cold War. In response to domestic and international pressures during the Vietnam War, the United States drastically curtailed its chemical and biological weapons programs and began supporting international arms control efforts such as the Biological and Toxin Weapons Convention and the Chemical Weapons Convention. U.S. chemical and biological weapons policies continue to significantly influence U.S. policies in the Middle East and the fight against terrorism.

Article

Civilian Nuclear Power  

Daniel Pope

Nuclear power in the United States has had an uneven history and faces an uncertain future. Despite 1950s promises of electricity “too cheap to meter,” nuclear power has failed to come close to that goal, although it has carved out approximately a 20 percent share of American electrical output. Two decades after World War II, General Electric and Westinghouse offered electric utilities completed “turnkey” plants at a fixed cost, hoping these “loss leaders” would create a demand for further projects. During the 1970s the industry boomed, but it also brought forth a large-scale protest movement. Since then, partly because of that movement and because of the drama of the 1979 Three Mile Island accident, nuclear power has plateaued, with only one reactor completed since 1995. Several factors account for the failed promise of nuclear energy. Civilian power has never fully shaken its military ancestry or its connotations of weaponry and warfare. American reactor designs borrowed from nuclear submarines. Concerns about weapons proliferation stymied industry hopes for breeder reactors that would produce plutonium as a byproduct. Federal regulatory agencies dealing with civilian nuclear energy also have military roles. Those connections have provided some advantages to the industry, but they have also generated fears. Not surprisingly, the “anti-nukes” movement of the 1970s and 1980s was closely bound to movements for peace and disarmament. The industry’s disappointments must also be understood in a wider energy context. Nuclear power grew rapidly in the late 1960s and 1970s as domestic petroleum output shrank and environmental objections to coal came to the fore. At the same time, however, slowing economic growth and an emphasis on energy efficiency reduced demand for new power output. In the 21st century, new reactor designs and the perils of fossil-fuel-caused global warming have once again raised hopes for nuclear power, but natural gas and renewables now compete favorably against new nuclear projects. Economic factors have been the main reason that nuclear power has stalled over the last forty years. Highly capital-intensive, nuclear projects have all too often taken too long to build and cost far more than initially forecast. The lack of standard plant designs, the need for expensive safety and security measures, and the inherent complexity of nuclear technology have all contributed to nuclear power’s inability to make a persuasive case on cost. Nevertheless, nuclear power may survive and even thrive if the nation commits to curtailing fossil fuel use or if, as the Trump administration proposes, it opts for subsidies to keep reactors operating.

Article

The Environment in the Atomic Age  

Rachel Rothschild

The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.

Article

Technology and the Environment  

Timothy James LeCain

Technology and environmental history are both relatively young disciplines among Americanists, and during their early years they developed as distinctly different and even antithetical fields, at least in topical terms. Historians of technology initially focused on human-made and presumably “unnatural” technologies, whereas environmental historians focused on nonhuman and presumably “natural” environments. However, in more recent decades, both disciplines have moved beyond this oppositional framing. Historians of technology increasingly came to view anthropogenic artifacts such as cities, domesticated animals, and machines as extensions of the natural world rather than its antithesis. Even the British and American Industrial Revolutions constituted not a distancing of humans from nature, as some scholars have suggested, but rather a deepening entanglement with the material environment. At the same time, many environmental historians were moving beyond the field’s initial emphasis on the ideal of an American and often Western “wilderness” to embrace a concept of the environment as including humans and productive work. Nonetheless, many environmental historians continued to emphasize the independent agency of the nonhuman environment of organisms and things. This insistence that not everything could be reduced to human culture remained the field’s most distinctive feature. Since the turn of the millennium, the two fields have increasingly come together in a variety of synthetic approaches, including Actor Network Theory, envirotechnical analysis, and neomaterialist theory. As the influence of the cultural turn has waned, the environmental historians’ emphasis on the independent agency of the nonhuman has come to the fore, gaining wider influence as it is applied to the dynamic “nature” or “wildness” that some scholars argue exists within both the technological and natural environment. The foundational distinctions between the history of technology and environmental history may now be giving way to more materially rooted attempts to understand how a dynamic hybrid environment helps to create human history in all of its dimensions: cultural, social, and biological.

Article

The American Antinuclear Movement  

Paul Rubinson

Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.

Article

Infrastructure: Water and Sewers  

Joel A. Tarr

Urban water supply and sewage disposal facilities are critical parts of the urban infrastructure. They have enabled cities and their metropolitan areas to function as centers of commerce, industry, entertainment, and human habitation. The evolution of water supply and sewage disposal systems in American cities from 1800 to 2015 is examined, with a focus on major turning points, especially in regard to technological decisions, public policy, and environmental and public-health issues.

Article

Energy in American History  

Aaron Sachs

Energy systems have played a significant role in U.S. history; some scholars claim that they have determined a number of other developments. From the colonial period to the present, Americans have shifted from depending largely on wood and their own bodies, as well as the labor of draft animals; to harnessing water power; to building steam engines; to extracting fossil fuels, first coal and then oil; to distributing electrical power through a grid. Each shift has been accompanied by a number of other striking changes, especially in the modern period associated with fossil fuels. By the late 19th century, in part thanks to new energy systems, Americans were embracing industrialization, urbanization, consumerism, and, in a common contemporary phrase, “the annihilation of space and time.” Today, in the era of climate change, the focus tends to be on the production or supply side of energy systems, but a historical perspective reminds us to consider the consumption or demand side as well. Just as important as the striking of oil in Beaumont, Texas, in 1901 was the development of new assumptions about how much energy people needed to sustain their lives and how much work they could be expected to do. Clearly, Americans are still grappling with the question of whether their society’s heavy investment in coal- and petroleum-based energy systems has been worthwhile.