Article
The Emergence of Environment as a Security Imperative
Felix Dodds
The emergence of the environment as a security imperative is something that could have been avoided. Early indications showed that if governments did not pay attention to critical environmental issues, those issues would move up the security agenda. As far back as the Club of Rome's 1972 report, The Limits to Growth, variables highlighted for policy makers included world population, industrialization, pollution, food production, and resource depletion, all of which affect how we live on this planet.
The term environmental security did not come into general use until the 2000s, but it had its first substantive framing in 1977, in Lester Brown's Worldwatch Paper 14, "Redefining Security." Brown argued that the traditional view of national security rested on the "assumption that the principal threat to security comes from other nations." He went on to argue that future threats to security "may now arise less from the relationship of nation to nation and more from the relationship of man to nature."
Of the major documents to come out of the Earth Summit in 1992, the Rio Declaration on Environment and Development was probably the first in which governments tried to frame environmental security. Principle 2 states: "States have, in accordance with the Charter of the United Nations and the principles of international law, the sovereign right to exploit their own resources pursuant to their own environmental and developmental policies, and the responsibility to ensure that activities within their jurisdiction or control do not cause damage to the environment of other States or of areas beyond the limits of national jurisdiction."
In 1994, the UN Development Program defined human security in terms of distinct categories, including:
• Economic security (assured and adequate basic incomes).
• Food security (physical and affordable access to food).
• Health security.
• Environmental security (access to safe water, clean air and non-degraded land).
By the time of the World Summit on Sustainable Development in 2002, water had begun to be identified as a security issue, first at the Rio+5 conference in 1997, having already been framed as a food security issue at the 1996 FAO World Food Summit. In 2003, UN Secretary-General Kofi Annan set up a High-Level Panel on "Threats, Challenges, and Change" to help the UN prevent and remove threats to peace. The panel began to lay down new concepts of collective security, identifying six clusters of threats for member states to consider. These included economic and social threats, such as poverty, infectious disease, and environmental degradation.
By 2007, health was being recognized as part of the environmental security discourse, with World Health Day that year devoted to "international health security (IHS)." In particular, it looked at emerging diseases, economic stability, international crises, humanitarian emergencies, and chemical, radioactive, and biological terror threats. Environmental and climate change have a growing impact on health. The 2007 Fourth Assessment Report (AR4) of the UN Intergovernmental Panel on Climate Change (IPCC) identified climate security as a key challenge for the 21st century. This was followed in 2009 by the UCL-Lancet Commission on Managing the Health Effects of Climate Change, which linked health and climate change.
In the run-up to Rio+20 and the launch of the Sustainable Development Goals, the climate-food-water-energy nexus, or rather the inter-linkages between these issues, was highlighted. The dialogue on environmental security has moved from a fringe discussion to the center of our political discourse, in large part because previous international agreements have not been implemented.
Article
The Environmental History of the Antarctic
Sebastian Grevsmühl
The environmental history of the polar regions, and in particular of Antarctica, is a rather recent area of inquiry that is in many ways still in its infancy. As a truly multidisciplinary research field, environmental history draws much inspiration from a large diversity of fields of historical and social research, including economic history, diplomatic history, cultural history, the history of exploration, and science and technology studies. Although overarching book-length studies on the environmental history of Antarctica are still rare, historians have already conducted many in-depth case studies related mostly to three major interrelated research topics: Antarctic governance, natural resource exploitation, and tourism. These recent historical efforts, carried out mostly by a new generation of historians, have thus far allowed the proposal of several powerful counternarratives, challenging the frequent yet erroneous assertion that environmental protection and conservation were completely absent from Antarctic affairs before the 1970s. In so doing, environmental historians have started to offer a much more complex and nuanced account of what is frequently referred to as the "greening" of Antarctica, going well beyond the "declensionist" narratives and conservation success stories that commonly pervade not only environmental histories but also public discourse. Indeed, all recent historical studies agree that there is nothing inevitable about the "greening" of Antarctica, nor are conservation and environmental protection its natural destiny. Science, politics, imperialism, capitalism, and imaginaries have all played their part in this important history, a history that still largely remains to be written.
Article
Extinction
Mark V. Barrow
The prospect of extinction, the complete loss of a species or other group of organisms, has long provoked strong responses. Until the turn of the 18th century, deeply held and widely shared beliefs about the order of nature led to a firm rejection of the possibility that species could entirely vanish. During the 19th century, however, resistance to the idea of extinction gave way to widespread acceptance following the discovery of the fossil remains of numerous previously unknown forms and direct experience with contemporary human-driven decline and the destruction of several species. In an effort to stem continued loss, at the turn of the 19th century, naturalists, conservationists, and sportsmen developed arguments for preventing extinction, created wildlife conservation organizations, lobbied for early protective laws and treaties, pushed for the first government-sponsored parks and refuges, and experimented with captive breeding. In the first half of the 20th century, scientists began systematically gathering more data about the problem through global inventories of endangered species and the first life-history and ecological studies of those species.
The second half of the 20th and the beginning of the 21st centuries have been characterized both by accelerating threats to the world’s biota and greater attention to the problem of extinction. Powerful new laws, like the U.S. Endangered Species Act of 1973, have been enacted and numerous international agreements negotiated in an attempt to address the issue. Despite considerable effort, scientists remain fearful that the current rate of species loss is similar to that experienced during the five great mass extinction events identified in the fossil record, leading to declarations that the world is facing a biodiversity crisis. Responding to this crisis, often referred to as the sixth extinction, scientists have launched a new interdisciplinary, mission-oriented discipline, conservation biology, that seeks not just to understand but also to reverse biota loss. Scientists and conservationists have also developed controversial new approaches to the growing problem of extinction: rewilding, which involves establishing expansive core reserves that are connected with migratory corridors and that include populations of apex predators, and de-extinction, which uses genetic engineering techniques in a bid to resurrect lost species. Even with the development of new knowledge and new tools that seek to reverse large-scale species decline, a new and particularly imposing danger, climate change, looms on the horizon, threatening to undermine those efforts.
Article
From Plows, Horses, and Harnesses to Precision Technologies in the North American Great Plains
David E. Clay, Sharon A. Clay, Thomas DeSutter, and Cheryl Reese
Since the discovery that food security could be improved by pushing seeds into the soil and later harvesting a desirable crop, agriculture and agronomy have gone through cycles of discovery, implementation, and innovation. Discoveries have produced predicted and unpredicted impacts on the production and consumption of locally produced foods. Changes in technology, such as the development of the self-cleaning steel plow in the 19th century, provided a critical tool needed to cultivate and seed annual crops in the Great Plains of North America. However, plowing the Great Plains would not have been possible without the domestication of plants and animals and the invention of the yoke and harness. Associated with plowing the prairies were extensive soil nutrient mining, a rapid loss of soil carbon, and increased wind and water erosion. More recently, the development of genetically modified organisms (GMOs) and no-tillage planters has contributed to increased adoption of conservation tillage, which is less damaging to the soil. The ultimate impact of climate change on agronomic practices in the North American Great Plains is unknown. However, projected increasing temperatures and decreased rainfall in the southern Great Plains (SGP) will likely reduce agricultural productivity. Different results are likely in the northern Great Plains (NGP), where higher temperatures can lead to increased agricultural intensification, the conversion of grassland to cropland, increased fragmentation of wildlife habitat, and increased soil erosion. Precision farming, conservation, cover crops, and plants better adapted to their local environments can help mitigate these effects. However, changing practices requires that farmers and their advisers understand the limitations of their soils, plants, environment, and production systems. Failure to implement appropriate management practices can result in a rapid decline in soil productivity, diminished water quality, and reduced wildlife habitat.
Article
Quaternary Science
Kenneth Addison
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Environmental Science.
The Quaternary period of Earth history, which commenced ca. 2.6 Ma (million years ago), is noted for a series of dramatic shifts in global climate between long, cool ("icehouse") and short, temperate ("greenhouse") stages. It also coincides with the extinction of the later australopithecine hominins and the evolution of modern Homo sapiens.
Wide recognition of a fourth, Quaternary, order of geologic time emerged in Europe between ca. 1760 and 1830 and became closely identified with the concept of an ice age. This most recent episode in Earth history is also the best preserved in stratigraphic and landscape records. Indeed, much of its character and many of its processes continue in the present, which prompted early geologists' recognition of the concept of uniformitarianism: the present is the key to the past.
Quaternary time was quickly divided into a dominant Pleistocene ("most recent") epoch, characterized by cyclical growth and decay of major continental ice sheets and peripheral permafrost. The disappearance of most of these ice sheets, except those of Antarctica and Greenland today, ushered in the Holocene ("wholly modern") epoch, once thought to terminate the Ice Age but now seen as the current interglacial or temperate stage, commencing ca. 11.7 ka (thousand years ago). Covering 30–50% of Earth's land surface at their maxima, ice sheets and permafrost squeezed the remaining biomes into a narrower circum-equatorial zone, where research indicated the former occurrence of pluvial and desiccation events. Early efforts to correlate these events with mid- to high-latitude glacials and interglacials revealed the complex and often asynchronous Pleistocene record.
Until the 1970s, the 19th-century recognition of just four glaciations persisted, reflecting a reliance on geomorphology and on short terrestrial stratigraphic records concentrated in northern hemisphere mid- and high latitudes. Correlation of oxygen isotope (δ18O) signals from seafloor sediments (from ocean drilling programs after the 1960s) with polar ice core signals from the 1980s onward has revolutionized our understanding of the Quaternary, facilitating a sophisticated, time-constrained record of events and environmental reconstructions from regional to global scales. Records from oceans and ice sheets, some spanning 10⁵–10⁶ years, are augmented by similarly long records from loess, lake sediments, and speleothems (cave deposits). Their collective value is enhanced by innovative analytical and dating tools.
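For orientation (this definition is standard in paleoclimatology rather than drawn from the article itself), the δ18O signal expresses the ratio of heavy to light oxygen isotopes in a sample relative to a reference standard, in parts per thousand:

\[
\delta^{18}\mathrm{O} \;=\; \left( \frac{\left({}^{18}\mathrm{O}/{}^{16}\mathrm{O}\right)_{\mathrm{sample}}}{\left({}^{18}\mathrm{O}/{}^{16}\mathrm{O}\right)_{\mathrm{standard}}} \;-\; 1 \right) \times 1000\ \text{‰}
\]

Because growing ice sheets preferentially store the lighter ¹⁶O, higher δ18O values in seafloor sediment records indicate glacial conditions, which is what makes the signal a proxy for global ice volume.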
Over 100 Marine Isotope Stages (MIS) are now recognized in the Quaternary, recording dramatic climate shifts at decadal and centennial timescales; the magnitude of the 22 MIS of the past 900,000 years is considered to reflect significant ice sheet accumulation and decay. Each cycle between temperate and cool conditions (odd- and even-numbered MIS, respectively) is time-asymmetric, with progressive cooling over 80,000 to 100,000 years, followed by an abrupt termination and then a rapid return to temperate conditions for a few thousand years.
The search for causes of Quaternary climate and environmental change embraces all strands of Earth System Science. The strong correlation between orbital forcing and major climate changes (summarized as the Milankovitch mechanism) is displacing earlier emphasis on radiative (direct solar) forcing, but uncertainty remains over how the orbital signal is amplified or modulated. Tectonic forcing (ocean-continent distributions, tectonic uplift, and volcanic outgassing), atmospheric and biogeochemical greenhouse gas exchange, ocean and land surface albedo, and deep- and surface-ocean circulation are all contenders, and all are important agents in their own right.
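As a point of reference (these canonical periods are standard in the literature rather than taken from this article), the Milankovitch mechanism combines three quasi-periodic orbital parameters:

\[
\underbrace{e}_{\substack{\text{eccentricity}\\ \sim 100\ \text{and}\ 405\ \text{kyr}}} \, , \qquad
\underbrace{\varepsilon}_{\substack{\text{obliquity (axial tilt)}\\ \sim 41\ \text{kyr}}} \, , \qquad
\underbrace{e \sin \varpi}_{\substack{\text{precession index}\\ \sim 19\text{–}23\ \text{kyr}}}
\]

The amplification puzzle noted above is that eccentricity, the weakest of these forcings, appears to pace the dominant ~100,000-year glacial cycles of the late Pleistocene, implying modulation through feedbacks such as ice-albedo and greenhouse gas exchange.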
Modern understanding of Quaternary environments and processes feeds an exponential growth of multidisciplinary research, numerical modeling, and applications. Climate modeling exploits the mutual benefits to science and society of "hindcasting," using paleoclimate data to aid understanding of the past and to increase confidence in model forecasts. The pursuit of a more detailed and sophisticated understanding of ocean-atmosphere-cryosphere-biosphere interactions proceeds apace.
The Quaternary is also the stage on which human evolution has played out, and the essential distinction between natural climate variability and human forcing is now recognized as marking, in the present, a potential new Anthropocene epoch. The Quaternary past and present are major keys to its future.