Article
Ecological Water Management in Cities
Timothy Beatley
Managing water in cities presents a series of intersecting challenges. Rapid urbanization, wasteful consumption, minimal efforts at urban and ecological planning, and especially climate change have made the management of urban water more difficult. Urban water management is multifaceted and interconnected: cities must at once address problems of too much water (e.g., more frequent and extreme weather events, increased riverine and coastal flooding, and rising sea levels) and of too little water (e.g., drought and water scarcity), as well as the need to protect the quality of water and water bodies.
This article presents a comprehensive and holistic picture of the water planning challenges facing cities, the historical approaches and newer methods cities have embraced, the special effects of climate change on these multiple aspects of water, and the role of ecological planning and design in responding to them. Ecological planning represents the most effective approach to urban water management, and it holds the most promise for achieving the best overall outcomes in cities when multiple benefits (e.g., minimizing natural hazards, securing a sustainable water supply) and the need to protect and restore the natural environment are taken into account. There are many opportunities to build on the history of ecological planning, and ecological planning for water is growing in importance and momentum. It offers the chance to profoundly rethink and readjust humanity’s relationship to water, and to reimagine and reshape the cities of the 21st century.
Article
The Emergence of Environment as a Security Imperative
Felix Dodds
The emergence of environment as a security imperative is something that could have been avoided. Early indications showed that if governments did not pay attention to critical environmental issues, these would move up the security agenda. As far back as the Club of Rome’s 1972 report, Limits to Growth, the variables highlighted for policy makers included world population, industrialization, pollution, food production, and resource depletion, all of which impact how we live on this planet.
The term environmental security did not come into general use until the 2000s. It had its first substantive framing in 1977, in Lester Brown’s Worldwatch Paper 14, “Redefining Security.” Brown argued that the traditional view of national security was based on the “assumption that the principal threat to security comes from other nations.” He went on to argue that future security “may now arise less from the relationship of nation to nation and more from the relationship of man to nature.”
Of the major documents to come out of the Earth Summit in 1992, the Rio Declaration on Environment and Development was probably the first in which governments tried to frame environmental security. Principle 2 states: “States have, in accordance with the Charter of the United Nations and the principles of international law, the sovereign right to exploit their own resources pursuant to their own environmental and developmental policies, and the responsibility to ensure that activities within their jurisdiction or control do not cause damage to the environment of other States or of areas beyond the limits of national jurisdiction.”
In 1994, the UN Development Programme divided human security into distinct categories, including:
• Economic security (assured and adequate basic incomes).
• Food security (physical and affordable access to food).
• Health security.
• Environmental security (access to safe water, clean air, and non-degraded land).
By the time of the World Summit on Sustainable Development in 2002, water had begun to be identified as a security issue, first at the Rio+5 conference in 1997 and, as a food security issue, at the 1996 FAO Summit. In 2003, UN Secretary-General Kofi Annan set up a High-Level Panel on “Threats, Challenges, and Change” to help the UN prevent and remove threats to peace. It began to lay down new concepts of collective security, identifying six clusters for member states to consider. These included economic and social threats such as poverty, infectious disease, and environmental degradation.
By 2007, health was being recognized as part of the environmental security discourse, with World Health Day celebrating “International Health Security (IHS).” In particular, it looked at emerging diseases, economic stability, international crises, humanitarian emergencies, and chemical, radioactive, and biological terror threats. Environmental and climate changes have a growing impact on health. The 2007 Fourth Assessment Report (AR4) of the UN Intergovernmental Panel on Climate Change (IPCC) identified climate security as a key challenge for the 21st century. This was followed in 2009 by the UCL-Lancet Commission on Managing the Health Effects of Climate Change, which linked health and climate change.
In the run-up to Rio+20 and the launch of the Sustainable Development Goals, the climate-food-water-energy nexus, or rather the inter-linkages between these issues, was highlighted. The dialogue on environmental security has moved from a fringe discussion to being central to our political discourse, largely because of the lack of implementation of previous international agreements.
Article
Radiation and the Environment
E. Jerry Jessee
The “Atomic Age” has long been recognized as a signal moment in modern history. In popular memory, images of mushroom clouds from atmospheric nuclear weapons tests recall a period when militaries and highly secretive atomic energy agencies poisoned the global environment and threatened human health. Historical scholarship has painted a more complicated picture of this era by showing how nuclear technologies and radioactive releases transformed the environmental sciences and helped set the stage for the scientific construction of the very idea of the “global environment.”
Radioactivity presented scientists with a double-edged sword almost as soon as they explained, at the turn of the 20th century, how certain unstable chemical elements emit energetic particles and rays in the process of radioactive decay. Throughout the 1920s and 1930s, scientists hailed radioactivity as a discovery that promised to transform atomic theory and biomedicine through radioisotopes, radioactive versions of stable chemical elements that could be used to tag and trace physiological processes in living systems. At the same time, the perils of overexposure to radioactivity were becoming more apparent as researchers and workers in the new radium-laced luminescent paint industry began suffering from radiation-induced illnesses.
The advent of a second “Atomic Age” in the wake of the bombing of Japan was characterized by increased access to radiotracer technologies for science and widespread anxiety about the health effects of radioactive fallout in the environment. Powerful new atomic agencies and military institutions created research opportunities for scientists to study the atmospheric, oceanic, and ecological pathways through which bomb test radiation could make its way into human bodies. Although these studies were driven by concerns about health effects, the presence of energy-emitting radioactivity in the environment also meant that researchers could use it as a tracer to visualize basic environmental processes. As a result, throughout the 1950s and early 1960s, ecologists pioneered the use of radiotracers to investigate energy flows and the metabolism of ecosystem units. Oceanographers similarly used bomb blast radiation to trace physical processes in the oceans and the uptake of radioactivity in aquatic food chains. Meteorologists, meanwhile, tracked bomb debris as high as the stratosphere to predict fallout patterns and trace large-scale atmospheric phenomena. By the early 1960s, these studies had documented how radioactive fallout produced by distant nuclear tests spread across the globe and infiltrated the entire planet’s air, water, biosphere, and human bodies.
In 1963, the major nuclear powers agreed to end above-ground nuclear testing with the Limited Test Ban Treaty, the first international treaty to recognize an environmental hazard of planetary proportions. Throughout the 1960s and into the 1980s, research on the global effects of nuclear weapons continued to shape global environmental thinking and concern, as debates about nuclear winter directed professional and public attention toward humanity’s ability to alter the climate.