Radiation and the Environment
- E. Jerry Jessee, University of Wisconsin
Summary
The “Atomic Age” has long been recognized as a signal moment in modern history. In popular memory, images of mushroom clouds from atmospheric nuclear weapons tests recall a period when militaries and highly secretive atomic energy agencies poisoned the global environment and threatened human health. Historical scholarship has painted a more complicated picture of this era by showing how nuclear technologies and radioactive releases transformed the environmental sciences and helped set the stage for the scientific construction of the very idea of the “global environment.”
Radioactivity presented scientists with a double-edged sword almost from the moment, at the turn of the 20th century, when they explained how certain unstable chemical elements emit energetic particles and rays in the process of radioactive decay. Throughout the 1920s and 1930s, scientists hailed radioactivity as a discovery that promised to transform atomic theory and biomedicine, chiefly through radioisotopes, radioactive versions of stable chemical elements that could be used to tag and trace physiological processes in living systems. At the same time, the perils of overexposure to radioactivity were becoming more apparent as researchers and industrial workers in the new radium-laced luminescent paint industries began suffering from radiation-induced illnesses.
The advent of a second “Atomic Age” in the wake of the bombing of Japan was characterized by increased access to radiotracer technologies for science and by widespread anxiety about the health effects of radioactive fallout in the environment. Powerful new atomic agencies and military institutions created research opportunities for scientists to study the atmospheric, oceanic, and ecological pathways through which bomb test radiation could make its way to human bodies. Although these studies were driven by concerns about health effects, the presence of energy-emitting radioactivity in the environment also meant that researchers could use it as a tracer to visualize basic environmental processes. As a result, throughout the 1950s and early 1960s, ecologists pioneered the use of radiotracers to investigate energy flows and the metabolism of ecosystem units. Oceanographers similarly used bomb blast radiation to trace physical processes in the oceans and the uptake of radioactivity in aquatic food chains. Meteorologists, meanwhile, tracked bomb debris as high as the stratosphere to predict fallout patterns and trace large-scale atmospheric phenomena. By the early 1960s, these studies had documented how radioactive fallout produced by distant nuclear tests spread across the globe and infiltrated the entire planet’s air, water, biosphere, and human bodies.
In 1963, the major nuclear powers agreed to end above-ground nuclear testing with the Limited Test Ban Treaty, the first international treaty to recognize an environmental hazard of planetary proportions. Throughout the 1960s and into the 1980s, research on the global effects of nuclear weapons continued to shape global environmental thinking and concern, as debates about nuclear winter directed professional and public attention toward humanity’s ability to alter the climate.
Subjects
- Environment and Human Health
- Environmental History