
Article

Prehistoric and Traditional Agriculture in Lowland Mesoamerica  

Clarissa Cagnato

Mesoamerica is one of the world’s primary centers of domestication where agriculture arose independently. Paleoethnobotany (or archaeobotany), along with archaeology, epigraphy, and ethnohistorical and ethnobotanical data, provides increasingly important insights into the ancient agriculture of Lowland Mesoamerica (below 1000 m above sea level). Moreover, new advances in the analysis of microbotanical remains in the form of pollen, phytoliths, and starch grains, together with chemical analysis of organic residues, have further contributed to our understanding of ancient plant use in this region. Prehistoric and traditional agriculture in the lowlands of Mesoamerica—notably the Maya lowlands, the Gulf Coast, and the Pacific Coast of southern Chiapas (Mexico) and Guatemala—from the Archaic (ca. 8000/7000–2000 BC) through the Preclassic/Formative (2000 BC–AD 250) and into the Classic (AD 250–900) period is covered. During the late Archaic, these lowland regions were inhabited by people who took full advantage of the rich natural biodiversity but also grew domesticates before becoming fully sedentary. Through time, they developed diverse management strategies to produce food, from the forest management system (which includes swidden agriculture) to larger-scale land modifications such as terraces, and continued to rely on semidomesticated and wild plant resources. Although lowland populations eventually came to rely on maize as a staple, other resources such as root crops and fruit trees were also cultivated, encouraged, and consumed. Additional research that includes the systematic collection of paleoethnobotanical data, along with other lines of evidence, will be key to continuing to refine our understanding of ancient subsistence systems and how they changed through time and across lowland Mesoamerica.

Article

Pros and Cons of GMO Crop Farming  

Rene Van Acker, M. Motior Rahman, and S. Zahra H. Cici

The global area sown to genetically modified (GM) varieties of leading commercial crops (soybean, maize, canola, and cotton) has expanded more than 100-fold over two decades. Thirty countries produce GM crops, and just five (United States, Brazil, Argentina, Canada, and India) account for almost 90% of GM production. Only four crops account for 99% of worldwide GM crop area. Almost 100% of GM crops on the market are genetically engineered with herbicide tolerance (HT) and insect resistance (IR) traits. Approximately 70% of cultivated GM crops are HT, and GM HT crops have been credited with facilitating no-tillage and conservation tillage practices that conserve soil moisture and control soil erosion, and that also support carbon sequestration and reduced greenhouse gas emissions. Crop production and productivity increased significantly during the era of GM crop adoption; some of this increase can be attributed to GM technology and the yield protection traits it has made possible, even if the GM traits implemented to date are not yield traits per se. GM crops have also been credited with helping to improve farm incomes and reduce pesticide use. Practical concerns around GM crops include the rise of insect pests and weeds that are resistant to pesticides. Other concerns include broad seed variety access for farmers, rising seed costs, and increased dependency on multinational seed companies. Citizens in many countries, especially in Europe, are opposed to GM crops and have voiced concerns about possible impacts on human and environmental health. Nonetheless, proponents of GM crops argue that they are needed to enhance worldwide food production. The novelty of the technology and its potential to bring almost any trait into crops mean that regulators must remain diligent to ensure that no GM crops are deregulated that may in fact pose risks to human health or the environment. The same will be true for the next wave of new breeding technologies, which include gene editing technologies.

Article

Quaternary Science  

Kenneth Addison

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Environmental Science. The Quaternary period of Earth history, which commenced ca. 2.6 Ma, is noted for a series of dramatic shifts in global climate between long, cool (“icehouse”) and short, temperate (“greenhouse”) stages. It also coincides with the extinction of later Australopithecine hominins and the evolution of modern Homo sapiens. Wide recognition of a fourth, Quaternary, order of geologic time emerged in Europe between ca. 1760 and 1830 and became closely identified with the concept of an ice age. This most recent episode in Earth history is also the best preserved in stratigraphic and landscape records. Indeed, much of its character and processes continue in present time, which prompted early geologists’ recognition of the concept of uniformitarianism—the present is the key to the past. Quaternary time was quickly divided into a dominant Pleistocene (“most recent”) epoch, characterized by cyclical growth and decay of major continental ice sheets and peripheral permafrost. Disappearance of most of these ice sheets, except in Antarctica and Greenland today, ushered in the Holocene (“wholly modern”) epoch, once thought to terminate the Ice Age but now seen as the current interglacial or temperate stage, commencing ca. 11.7 ka ago. Covering 30–50% of Earth’s land surface at their maxima, ice sheets and permafrost squeezed the remaining biomes into a narrower circum-equatorial zone, where research indicated the former occurrence of pluvial and desiccation events. Early efforts to correlate them with mid-high-latitude glacials and interglacials revealed the complex and often asynchronous Pleistocene record. Nineteenth-century recognition of just four glaciations reflected a reliance on geomorphology and short terrestrial stratigraphic records, concentrated in northern hemisphere mid- and high latitudes, until the 1970s. Correlation of δ18O isotope signals from seafloor sediments (from ocean drilling programs after the 1960s) with polar ice core signals from the 1980s onward has revolutionized our understanding of the Quaternary, facilitating a sophisticated, time-constrained record of events and environmental reconstructions from regional to global scales. Records from oceans and ice sheets, some spanning 10^5–10^6 years, are augmented by similar long records from loess, lake sediments, and speleothems (cave sediments). Their collective value is enhanced by innovative analytical and dating tools. Over 100 Marine Isotope Stages (MIS) are now recognized in the Quaternary, recording dramatic climate shifts at decadal and centennial timescales; the magnitude of the 22 MIS of the past 900,000 years is considered to reflect significant ice sheet accumulation and decay. Each cycle between temperate and cool conditions (odd- and even-numbered MIS, respectively) is time-asymmetric, with progressive cooling over 80,000 to 100,000 years, followed by an abrupt termination and then a rapid return to temperate conditions for a few thousand years. The search for causes of Quaternary climate and environmental change embraces all strands of Earth System Science. Strong correlation between orbital forcing and major climate changes (summarized as the Milankovitch mechanism) is displacing earlier emphasis on radiative (direct solar) forcing, but uncertainty remains over how the orbital signal is amplified or modulated. Tectonic forcing (ocean-continent distributions, tectonic uplift, and volcanic outgassing), atmosphere-biogeochemical and greenhouse gas exchange, ocean-land surface albedo, and deep- and surface-ocean circulation are all contenders and important agents in their own right. Modern understanding of Quaternary environments and processes feeds an exponential growth of multidisciplinary research, numerical modeling, and applications. Climate modeling exploits the mutual benefits to science and society of “hindcasting,” using paleoclimate data to aid understanding of the past and increase confidence in model forecasts. Pursuit of more detailed and sophisticated understanding of ocean-atmosphere-cryosphere-biosphere interaction proceeds apace. The Quaternary is also the stage on which human evolution plays. And the essential distinction between natural climate variability and human forcing is now recognized as designating, in present time, a potential new Anthropocene epoch. Quaternary past and present are major keys to its future.
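For reference, the marine oxygen-isotope signal mentioned above is conventionally expressed in delta notation (a textbook definition, not specific to this article):

\delta^{18}O = \left( \frac{({}^{18}O/{}^{16}O)_{sample}}{({}^{18}O/{}^{16}O)_{standard}} - 1 \right) \times 1000 \quad \text{(in per mil, ‰)}

Higher benthic δ18O records greater global ice volume and cooler conditions, i.e., the even-numbered (cool) MIS; lower values mark the odd-numbered temperate stages.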

Article

Radiation and the Environment  

E. Jerry Jessee

The “Atomic Age” has long been recognized as a signal moment in modern history. In popular memory, images of mushroom clouds from atmospheric nuclear weapons tests recall a period when militaries and highly secretive atomic energy agencies poisoned the global environment and threatened human health. Historical scholarship has painted a more complicated picture of this era by showing how nuclear technologies and radioactive releases transformed the environmental sciences and helped set the stage for the scientific construction of the very idea of the “global environment.” Radioactivity presented scientists with a double-edged sword almost as soon as they explained, at the turn of the 20th century, how certain unstable chemical elements emit energetic particles and rays in the process of radioactive decay. Throughout the 1920s and 1930s, scientists hailed radioactivity as a discovery that promised to transform atomic theory and biomedicine through radioisotopes—radioactive versions of stable chemical elements—which were used to tag and trace physiological processes in living systems. At the same time, the perils of overexposure to radioactivity were becoming more apparent as researchers and industrial workers laboring in new radium-laced luminescent paint industries began suffering from radiation-induced illnesses. The advent of a second “Atomic Age” in the wake of the bombing of Japan was characterized by increased access to radiotracer technologies for science and widespread anxiety about the health effects of radioactive fallout in the environment. Powerful new atomic agencies and military institutions created new research opportunities for scientists to study the atmospheric, oceanic, and ecological pathways through which bomb test radiation could make its way to human bodies. Although these studies were driven by concerns about health effects, the presence of energy-emitting radioactivity in the environment also meant that researchers could utilize it as a tracer to visualize basic environmental processes. Throughout the 1950s and early 1960s, as a result, ecologists pioneered the use of radiotracers to investigate energy flows and the metabolism of ecosystem units. Oceanographers similarly used bomb blast radiation to trace physical processes in oceans and the uptake of radioactivity in aquatic food chains. Meteorologists meanwhile tracked bomb debris as high as the stratosphere to predict fallout patterns and trace large-scale atmospheric phenomena. By the early 1960s, these studies documented how radioactive fallout produced by distant nuclear tests spread across the globe and infiltrated the entire planet’s air, water, biosphere, and human bodies. In 1963, the major nuclear powers agreed to end above-ground nuclear testing with the Limited Test Ban Treaty, the first international treaty to recognize a global environmental hazard of planetary proportions. Throughout the 1960s and into the 1980s, research on the global effects of nuclear weapons continued to shape global environmental thinking and concern as debates about nuclear winter directed professional and public attention toward humanity’s ability to alter the climate.

Article

Recreation Use Values for Water-Based Recreation  

John Loomis and Lucas Bair

Outdoor recreation is an important and growing activity worldwide. Water-based outdoor recreation is a subset that includes various activities such as fishing, boating, and swimming. While a large portion of water-based recreation is either free or provided at administratively set minimal entrance fees, these activities still involve significant economic value in aggregate. Because many water-based recreation activities do not have market prices, economists have developed nonmarket valuation methods to estimate the full scope of economic values to participants associated with these activities. Estimates of the economic value of water-based recreation are important in water resource management. While water resource infrastructure investment decisions typically include the economic value of recreation, periodic evaluation of infrastructure operations after construction may not. Re-evaluation of operations is particularly important if rapid changes in future conditions such as drought or changes in recreational demand occur. Because developing original site-based estimates of economic value requires significant effort, it is important to understand the general economic value of specific water-based recreational activities and methods used to transfer benefit estimates from existing studies to other sites.
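As a concrete illustration of the benefit transfer methods mentioned above, the sketch below (hypothetical numbers and an assumed income elasticity of willingness to pay, not values from this article) adjusts a per-user-day recreation value estimated at a studied site for use at an unstudied policy site:

def transfer_value(study_value, study_income, policy_income, income_elasticity=0.5):
    """Unit-value benefit transfer: adjust a $/user-day estimate for income differences."""
    return study_value * (policy_income / study_income) ** income_elasticity

# Hypothetical inputs: $45/user-day at the studied reservoir, median household
# incomes of $60,000 (study site) and $48,000 (policy site), elasticity 0.5 (assumed).
value = transfer_value(45.0, study_income=60_000, policy_income=48_000)
annual_user_days = 120_000  # hypothetical visitation at the policy site
print(f"transferred value: ${value:.2f}/user-day")
print(f"aggregate annual value: ${value * annual_user_days:,.0f}")

In practice, meta-analytic transfer functions with many site characteristics are preferred, but this simple income adjustment captures the core idea of transferring an existing estimate rather than conducting a new original study.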

Article

Renewable Energy for Human Sustainability  

Peter J. Schubert

Renewable energy was used exclusively by the first humans and is likely to be the predominant source for future humans. Between these times the use of extracted resources such as coal, oil, and natural gas has created an explosion of population and affluence, but also of pollution and dependency. This article explores the advent of energy sources in a broad social context including economics, finance, and policy. The means of producing renewable energy are described in an accessible way, highlighting the broad range of considerations in their development, deployment, and ability to scale to address the entirety of human enterprises.

Article

Resilience  

Carl Folke

Resilience thinking in relation to the environment has emerged as a lens of inquiry that serves as a platform for interdisciplinary dialogue and collaboration. Resilience is about cultivating the capacity to sustain development in the face of expected and surprising change, with diverse pathways of development and potential thresholds between them. The evolution of resilience thinking is coupled to social-ecological systems and a truly intertwined human-environment planet. Resilience as persistence, adaptability, and transformability of complex adaptive social-ecological systems is the focus, clarifying the dynamic and forward-looking nature of the concept. Resilience thinking emphasizes that social-ecological systems, from the individual, to community, to society as a whole, are embedded in the biosphere. The biosphere connection is an essential observation if sustainability is to be taken seriously. In the continuous advancement of resilience thinking there are efforts aimed at capturing the resilience of social-ecological systems and finding ways for people and institutions to govern social-ecological dynamics for improved human well-being, from the local, across levels and scales, to the global. Consequently, in resilience thinking, development issues for human well-being, for people and planet, are framed in a context of understanding and governing complex social-ecological dynamics for sustainability as part of a dynamic biosphere.

Article

Rethinking Conflict over Water  

Scott M. Moore

It has long been accepted that non-renewable natural resources like oil and gas are often the subject of conflict, both between nation-states and between social groups. But since the end of the Cold War, the idea that renewable resources like water and timber might also be a cause of conflict has steadily gained credence. This is particularly true in the case of water: in the early 1990s, a senior World Bank official famously predicted that “the wars of the next century will be fought over water,” while more recently the Indian strategist Brahma Chellaney made a splash in North America by claiming that water would be “Asia’s New Battleground.” But it has not quite turned out that way. The world has, so far, avoided inter-state conflict over water in the 21st century, but it has witnessed many localized conflicts, some involving considerable violence. As population growth, economic development, and climate change place growing strains on the world’s fresh water supplies, the relationship between resource scarcity, institutions, and conflict has become a topic of vocal debate among social and environmental scientists. The idea that water scarcity leads to conflict is rooted in three common assertions. The first of these arguments is that, around the world, once-plentiful renewable resources like fresh water, timber, and even soils are under increasing pressure, and are therefore likely to stoke conflict among increasing numbers of people who seek to utilize dwindling supplies. A second, and often corollary, argument holds that water’s unique value to human life and well-being—namely that there are no substitutes for water, as there are for most other critical natural resources—makes it uniquely conducive to conflict. Finally, a third presumption behind the water wars hypothesis stems from the fact that many water bodies, and nearly all large river basins, are shared between multiple countries. When an upstream country can harm its downstream neighbor by diverting or controlling flows of water, the argument goes, conflict is likely to ensue. But each of these assertions depends on making assumptions about how people react to water scarcity, the means they have at their disposal to adapt to it, and the circumstances under which they are apt to cooperate rather than to engage in conflict. Untangling these complex relationships promises a more refined understanding of whether and how water scarcity might lead to conflict in the 21st century—and how cooperation can be encouraged instead.

Article

Rethinking Hydropower: The Economics and Politics of Privately Owned Hydropower in the United States  

Lynne Y. Lewis

2019 marked the 20th anniversary of the removal of the Edwards Dam in Augusta, Maine (USA). Edwards Dam was the first federally licensed hydropower dam to be denied relicensing, and the dam was removed for the purpose of restoring the 10 anadromous fish species that use the Kennebec River. Since that time, numerous other small dams have been removed in the United States. The relicensing process considers benefit-cost analysis, yet remains fundamentally flawed in the consideration of the benefits of dam removals and fish passage. Successful dam removals rely (mostly) on local efforts and outside analysis.

Article

Rethinking Water Markets  

Rupert Quentin Grafton, James Horne, and Sarah A. Wheeler

Global water extractions from streams, rivers, lakes, and aquifers are continuously increasing, yet some four billion people already face severe water scarcity for at least one month per year. Deteriorating water security will, in the absence of change in how water is governed, get worse with climate change, as modeling projections indicate that much of the world’s arid and semiarid locations will receive less rainfall into the future. Concomitant with climate change is a growing world population, expected to be about 10 billion by 2050, that will greatly increase the global food demand, but this demand cannot be met without increased food production that depends on an adequate supply of water for agriculture. This poses a global challenge: How to ensure immediate and priority needs (such as safe drinking water) are satisfied without compromising future water security and the long-term sustainability of freshwater ecosystems? An effective and sustainable response must resolve the “who gets what water and when” water allocation problem and promote water justice. Many decision makers, however, act as if gross inequities in water access can be managed by “business as usual” and upgrades in water infrastructure alone. But much more is needed if the world is to achieve its Sustainable Development Goal of “water and sanitation for all” by 2030. Transformational change is required such that the price paid for water by users includes the economic costs of supply and use and the multiple values of water. Water markets in relation to physical volumes of water offer one approach, among others, that can potentially deliver transformational change by: (a) providing economic incentives to promote water conservation and (b) allowing water to be voluntarily transferred among competing users and uses (including non-uses for the environment and uses that support cultural values) to increase the total economic value from water. Realizing the full potential of water markets, however, is a challenge, and formal water markets require adequate regulatory oversight. Such oversight, at a minimum, must ensure: (a) the metering, monitoring, and compliance of water users and catchment-scale water auditing; (b) active compliance measures to protect both buyers and sellers from market manipulation; and (c) a judiciary system that supports the regulatory rules and punishes noncompliance. In many countries, the institutional and water governance framework is not yet sufficiently developed for water markets. In some countries, such as Australia, China, Spain, and the United States, the conditions do exist for successful water markets, but ongoing improvements are still needed as circumstances change in relation to water users and uses, institutions, and the environment. Importantly, into the future, water markets must be designed and redesigned to promote both water security and water justice. Without a paradigm shift in how water is governed, including rethinking water markets to support efficiency and equitable access, billions of people will face increasing risks to their livelihoods and lives, and many freshwater environments will face the risk of catastrophic decline.

Article

Review of Rain and Atmospheric Water Harvesting History and Technology  

Nathan Ortiz and Sameer Rao

Water is an essential resource and is under increased strain year after year. Fresh water can be a difficult resource to come by, but part of the solution may lie in the invisible water source that surrounds us: the atmosphere contains 12.9 trillion m³ of fresh water in liquid and vapor forms. Rain and fog harvesting were the first solutions, developed in ancient times, taking advantage of water that already exists in a liquid state. These technologies do not require energy input to overcome the enthalpy of condensation and thus are passive in nature. They are, however, limited to climates and regions that experience regular rainfall or 100% relative humidity (RH), for rainwater and fog harvesting respectively. People living in areas outside of the usable range needed to look deeper for a solution. With the advent of refrigeration in the 20th century came techniques that enabled access to the more elusive water vapor (i.e., <100% RH) that exists in the atmosphere. Refrigeration-based dewing (RBD) is the most common technique for collecting water vapor from the atmosphere; it was first developed in the 1930s but found greater adoption in the 1980s. RBD is the process of cooling ambient air to the dew point temperature, at which water vapor in the atmosphere begins to condense, forming liquid droplets. As the humidity ratio, the amount of water vapor in a given quantity of air (g water per kg dry air), decreases, RBD becomes infeasible: below a threshold of about 3.5 g water per kg dry air, the dew point temperature is below the freezing point, and ice forms during condensation in place of liquid water. Since the turn of the century, many researchers have made significant progress in developing a new wave of water harvesters capable of operating in much more arid climates than previously accessible with RBD. At lower humidity ratios, more effort must be expended to produce the same amount of liquid water. Membrane- and sorbent-based systems can be designed as passive or active; both aim to gather a high concentration of water vapor from the ambient air, creating local regions of increased relative humidity. Sorbent-based systems utilize the intrinsic hydrophilicity of solid and liquid desiccants to capture and store water vapor from the atmosphere, either in their pore structure (adsorbents) or in solution (absorbents). Membrane separators utilize a semipermeable membrane that allows water vapor to pass through but blocks the free passage of air, creating a region of much higher relative humidity than the environment. Technologies that concentrate water vapor must utilize an additional condensation step to produce liquid water. The advantage gained by these advancements is their ability to provide access to clean water in even the most arid climates around the globe, where the need for secure water is greatest. Increased demand for water has led scientists and engineers to develop novel materials and climb the energy ladder, overcoming the energy requirements of atmospheric water harvesting. Many research groups around the world are working quickly to develop new technologies and more efficient water harvesters.
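The freezing threshold quoted above can be checked with standard psychrometric relations. The sketch below (a minimal illustration assuming sea-level pressure and a Magnus-type saturation vapor pressure correlation; it is not from the article) converts a humidity ratio into the dew point an RBD system would have to reach:

import math

P_ATM = 101_325.0  # total air pressure, Pa (sea level assumed)

def vapor_pressure(humidity_ratio_g_per_kg: float) -> float:
    """Partial pressure of water vapor (Pa) from humidity ratio (g water / kg dry air)."""
    w = humidity_ratio_g_per_kg / 1000.0  # convert to kg/kg
    return w * P_ATM / (0.622 + w)  # 0.622 = molar mass ratio, water vapor to dry air

def dew_point_celsius(humidity_ratio_g_per_kg: float) -> float:
    """Invert the Magnus correlation e_s(T) = 611.2 * exp(17.67*T/(T+243.5)) Pa."""
    e = vapor_pressure(humidity_ratio_g_per_kg)
    x = math.log(e / 611.2)
    return 243.5 * x / (17.67 - x)

for w in (10.0, 5.0, 3.5, 2.0):
    td = dew_point_celsius(w)
    mode = "liquid dew (RBD feasible)" if td > 0 else "frost forms (RBD infeasible)"
    print(f"{w:4.1f} g/kg -> dew point {td:6.1f} C: {mode}")

Running it shows the dew point falling through 0 °C near 3.5 g/kg, which is why frost formation makes RBD infeasible in drier air.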

Article

Review of the State of the Art in Analysis of the Economics of Water Resources Infrastructure  

Marc Jeuland

Water resources represent an essential input to most human activities, but harnessing them requires significant infrastructure. Such water control allows populations to cope with stochastic water availability, preserving uses during droughts while protecting against the ravages of floods. Economic analysis is particularly valuable for helping to guide infrastructure investment choices, and for comparing the relative value of so-called hard and soft (noninfrastructure) approaches to water management. The historical evolution of the tools for conducting such economic analysis is considered. Given the multimillennial history of human reliance on water infrastructure, it may be surprising that economic assessments of its value are a relatively recent development. Owing to the need to justify the rapid deployment of major public-sector financing outlays for water infrastructure in the early 20th century, government agencies in the United States—the Army Corps of Engineers and the Bureau of Reclamation—were early pioneers in developing these applications. Their work faced numerous technical challenges, first addressed in the drafting of the cost-benefit norms of the “Green Book.” Subsequent methodological innovation then worked to address a suite of challenges related to nonmarket uses of water, stochastic hydrology, water systems interdependencies, the social opportunity cost of capital, and impacts on secondary markets, as well as endogenous sociocultural feedbacks. The improved methods that have emerged have now been applied extensively around the world, with applications increasingly focused on the Global South, where the best infrastructure development opportunities remain today. The dominant tools for carrying out such economic analyses are simulation or optimization hydroeconomic models (HEMs), but there are also other options: economywide water-economy models (WEMs), sociohydrological models (SHMs), spreadsheet-based partial equilibrium cost-benefit analysis (CBA) models, and others. Each of these has different strengths and weaknesses. Notable innovations are also discussed. For HEMs, these include stochastic, fuzzy, and robust optimization, as well as integration with models of other sectors (e.g., energy systems models). Recent cutting-edge work with WEMs and spreadsheet-based CBA models, meanwhile, has focused on linking these tools with spatially resolved HEMs. SHMs have seen only limited application to infrastructure valuation problems but have been useful for illuminating the paradox of flood management infrastructure increasing the incidence and severity of flood damages, and for explaining the co-evolution of water-based development and environmental concerns, which ironically then devalues the original infrastructure. Other notable innovations are apparent in multicriteria decision analysis and in game-theoretic modeling of noncooperative water institutions. These advances notwithstanding, several issues continue to challenge accurate and helpful economic appraisal of water infrastructure and should be the subject of future investigations in this domain. These include better assessment of environmental and distributional impacts, incorporation of empirically based representations of costs and benefits, and greater attention to the opportunity costs of infrastructure. Existing tools are well evolved from those of a few decades ago, supported by enhancements in scientific understanding and computational power. Yet they do appear to systematically produce inflated estimates of the net benefits of water infrastructure. Tackling existing shortcomings will require continued interdisciplinary collaboration between economists and scholars from other disciplines, to allow leveraging of new theoretical insights, empirical data analyses, and modeling innovations.
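To make the cost-benefit machinery concrete, the sketch below (hypothetical figures, not from the article) shows the spreadsheet-style net present value calculation at the core of project appraisal, and how the assumed discount rate, a proxy for the social opportunity cost of capital discussed above, can flip a project’s verdict:

def npv(benefits, costs, rate):
    """Net present value of yearly benefit/cost streams at a given discount rate."""
    return sum((b - c) / (1.0 + rate) ** t for t, (b, c) in enumerate(zip(benefits, costs)))

YEARS = 50
capital_cost = 120.0  # construction outlay in year 0 (monetary units are arbitrary)
om_cost = 1.5         # annual operation & maintenance cost
annual_benefit = 9.0  # irrigation, hydropower, and flood-control benefits per year

benefits = [0.0] + [annual_benefit] * YEARS
costs = [capital_cost] + [om_cost] * YEARS

for r in (0.03, 0.05, 0.08):
    print(f"discount rate {r:.0%}: NPV = {npv(benefits, costs, r):7.1f}")

With these made-up inputs the project is attractive at a 3% rate but uneconomic at 8%, illustrating why the discount rate has been such a contested parameter in water infrastructure appraisal.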

Article

Rewilding  

Jozef Keulartz

Rewilding aims at maintaining or even increasing biodiversity through the restoration of ecological and evolutionary processes using extant keystone species, or ecological replacements of extinct keystone species, that drive these processes. It is hailed by some as the most exciting and promising conservation strategy to slow down or stop what is considered to be the greatest mass extinction of species since the extinction of the dinosaurs 65 million years ago. Others have raised serious concerns about the many scientific and societal uncertainties and risks of rewilding. Moreover, despite its growing popularity, rewilding has made only limited inroads within the conservation mainstream and still has to prove itself in practice. Rewilding differs from traditional restoration in at least two important respects. Whereas restoration has typically focused on the recovery of plant communities, rewilding has drawn attention to animals, particularly large carnivores and large herbivores. Whereas restoration aims to return an ecosystem to some historical condition, rewilding is forward-looking rather than backward-looking: it examines the past not so much to recreate it, but to learn how to activate and maintain the natural processes that are crucial for biodiversity conservation. Rewilding makes use of a variety of techniques to re-establish these natural processes. Besides the familiar method of reintroducing animals into areas where populations have declined dramatically or even gone extinct, rewilders also employ some more controversial methods, including back breeding to restore wild traits in domesticated species, taxon substitution to replace extinct species with closely related species that play similar roles within an ecosystem, and de-extinction to bring extinct species back to life using advanced biotechnologies such as cloning and gene editing. Rewilding has clearly gained the most traction in North America and Europe, which have several key features in common. Both regions have recently experienced a spontaneous return of wildlife. Rewilders on both sides of the Atlantic are aware, however, that this wildlife resurgence is not that impressive, given that we are in the midst of the sixth mass extinction, which is characterized by the loss of large-bodied animals known as megafauna. The common goal is to bring back such megafaunal species because of their importance for maintaining and enhancing biodiversity. Last, both North American and European rewilders perceive the extinction crisis through the lens of island theory, which shows that the number of species in an area depends on its size and degree of isolation—hence their special attention to the spatial aspects of rewilding. But rewilding projects on both sides of the Atlantic not only have much in common, they also differ in certain respects. North American rewilders have adopted the late Pleistocene as a reference period and have emphasized the role of predation by large carnivores, while European rewilders have opted for the mid-Holocene and put more focus on naturalistic grazing by large herbivores.

Article

Risk Perception and Its Impacts on Risk Governance  

Ortwin Renn, Andreas Klinke, Pia-Johanna Schweizer, and Ferdiana Hoti

Risk perception is an important component of risk governance, but it cannot and should not determine environmental policies. The reality is that people suffer and even die as a result of false information or perception biases. It is particularly important to be aware of intuitive heuristics and common biases in making inferences from information in situations where personal or institutional decisions have far-reaching consequences. The gap between risk assessment and risk perception is an important aspect of environmental policy-making. Communicators and risk managers, as well as representatives of the media, stakeholders, and the affected public, should be well informed about the results of risk perception and risk response studies. They should be aware of typical patterns of information processing and reasoning when they engage in designing communication programs and risk management measures. At the same time, the potential recipients of information should be cognizant of the major psychological and social mechanisms of perception as a means to avoid painful errors. To reach this goal of mutual enlightenment, it is crucial to understand the mechanisms and processes of how people perceive risks (with emphasis on environmental risks) and how they behave on the basis of their perceptions. Based on insights from cognitive psychology, social psychology, micro-sociology, and behavioral studies, one can distill some basic lessons for risk governance that reflect universal characteristics of perception and that can be taken for granted in many different cultures and risk contexts. This task of mutual enlightenment on the basis of evidence-based research and investigations is constrained by complexity, uncertainty, and ambiguity in describing, assessing, and analyzing risks, in particular environmental risks. The idea that the “truth” needs only to be framed in a way that the targeted audience understands the message is far too simple. In a stochastic and nonlinear understanding of (environmental) risk there are always several (scientifically) legitimate ways of representing scientific insights and causal inferences. Much knowledge in risk and disaster assessment is based on incomplete models, simplified simulations, and expert judgments with a high degree of uncertainty and ambiguity. The juxtaposition of scientific truth, on one hand, and erroneous risk perception, on the other, does not reflect the real situation and lends itself to a vision of expertocracy that is neither functionally correct nor democratically justified. The main challenge is to initiate a dialogue that incorporates the limits and uncertainties of scientific knowledge and also starts a learning process by which obvious misperceptions are corrected and the legitimate corridor of interpretation is jointly defined. In essence, expert opinion and lay perception need to be perceived as complementing rather than competing with each other. The very essence of responsible action is to make viable and morally justified decisions in the face of uncertainty based on a range of scientifically legitimate expert assessments. These assessments have to be embedded in the context of criteria for acceptable risks, trade-offs between risks to humans and ecosystems, equitable risk and benefit distribution, and precautionary measures. These criteria most precisely reflect the main concerns revealed by empirical studies on risk perception.
Political decision-makers are therefore well advised to collect both ethically justifiable evaluation criteria and standards and the best available systematic knowledge that inform us about the performance of each risk source or disaster-reduction option according to criteria that have been identified and approved in a legitimate due process. Ultimately, decisions on acceptable risks have to be based on a subjective mix of factual evidence, attitudes toward uncertainties, and moral standards.

Article

Risks for Occupational Health Hazards Among Solid Waste Workers  

Mehrad Bastani, Nurcin Celik, and Danielle Coogan

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Environmental Science. The volume of municipal solid waste produced in the United States has increased by 68% since 1980, up from 151 million to over 254 million tons per year. As the output of municipal waste has grown, more attention has been paid to the occupations associated with waste management. In 2014, refuse and recyclable material collection was ranked as the 6th most dangerous job in the United States, with a rate of 27.1 deaths per 100,000 workers. As exposure statistics among solid waste workers in the United States have come to light, the identification and assessment of occupational health risks among these workers is receiving more consideration. From the generation of waste to its disposal, solid waste workers are exposed to substantial levels of physical, chemical, and biological toxins. Current waste management systems in the United States involve significant risk of contact with waste hazards, highlighting that prevention methods such as monitoring exposures, personal protection, engineering controls, job education and training, and other interventions are under-utilized. To recognize and address occupational hazards encountered by solid waste workers, it is necessary to discern potential safety concerns and their causes, as well as their direct and/or indirect impacts on the various types of workers. The major industries processing solid waste are recycling, incineration, landfilling, and composting, and the reported exposures and potential occupational health risks need to be identified for workers in each of these industries. Then, by acquiring data on reported exposures among solid waste workers, county-level and state-level quantitative assessments of major occupational risks can be conducted using statistical methods. To assess health risks among solid waste workers, the following questions must be answered: How can the methods of solid waste management be categorized? Which are the predominant occupational health risks among solid waste workers, and how can they be identified? Which practical and robust assessment methods are useful for evaluating occupational health risks among solid waste workers? What are possible solutions that can be implemented to reduce occupational health hazard rates among solid waste workers?

Article

The Role of Cover Crops in Agriculture and Their Environmental Significance  

Helena Aronsson

Growing a cover crop between main crops imitates natural ecosystems, where the soil is continuously covered with vegetation. This is an important management practice for preserving soil nutrient resources and reducing nitrogen (N) losses to waters. Cover crops also provide other functions that are important for the resilience and long-term stability of cropping systems, such as reduced erosion, increased soil fertility, carbon sequestration, increased soil phosphorus (P) availability, and suppression of weeds and pathogens. Much is known about how to use cover crops to reduce N leaching in climates where there is a water surplus outside the growing season: non-legume cover crops reduce N leaching by 20%–80%, and legumes reduce it by, on average, 23%. There are both synergies and possible conflicts between different environmental and production aspects that should be considered when developing efficient and multifunctional cover crop systems, but conflicts between the different functions provided by cover crops can sometimes be overcome with site-specific adaptation of measures. One example is cover crop effects on P losses: cover crops reduce losses of total P, but they mobilize soil P into available forms and may increase losses of dissolved P. How to use this effect to increase soil P availability on subtropical soils needs further study. Knowledge and examples of how to maximize the positive effects of cover crops on cropping systems are improving, thereby increasing the sustainability of agriculture. One example is using cover crops for weed suppression in order to reduce dependence on herbicides or intensive mechanical treatment.

Article

The Role of Tourism in Sustainable Development  

Robert B. Richardson

Sustainable development is the foundational principle for enhancing human and economic development while maintaining the functional integrity of ecological and social systems that support regional economies. Tourism has played a critical role in sustainable development in many countries and regions around the world. In developing countries, tourism development has been used as an important strategy for increasing economic growth, alleviating poverty, creating jobs, and improving food security. Many developing countries are in regions that are characterized by high levels of biological diversity, natural resources, and cultural heritage sites that attract international tourists whose local purchases generate income and support employment and economic development. Tourism has been associated with the principles of sustainable development because of its potential to support environmental protection and livelihoods. However, the relationship between tourism and the environment is multifaceted, as some types of tourism have been associated with negative environmental impacts, many of which are borne by host communities. The concept of sustainable tourism development emerged in contrast to mass tourism, which involves the participation of large numbers of people, often in structured or packaged tours. Mass tourism has been associated with economic leakage and dependence, along with negative environmental and social impacts. Sustainable tourism development has been promoted in various ways as a framing concept in contrast to these economic, environmental, and social impacts. Some literature has acknowledged a vagueness of the concept of sustainable tourism, which has been used to advocate for fundamentally different strategies for tourism development that may exacerbate existing conflicts between conservation and development paradigms. Tourism has played an important role in sustainable development in some countries through the development of alternative tourism models, including ecotourism, community-based tourism, pro-poor tourism, slow tourism, green tourism, and heritage tourism, among others that aim to enhance livelihoods, increase local economic growth, and provide for environmental protection. Although these models have been given significant attention among researchers, the extent of their implementation in tourism planning initiatives has been limited, superficial, or incomplete in many contexts. The sustainability of tourism as a global system is disputed among scholars. Tourism is dependent on travel, and nearly all forms of transportation require the use of non-renewable resources such as fossil fuels for energy. The burning of fossil fuels for transportation generates emissions of greenhouse gases that contribute to global climate change, which is fundamentally unsustainable. Tourism is also vulnerable to both localized and global shocks. Studies of the vulnerability of tourism to localized shocks include the impacts of natural disasters, disease outbreaks, and civil unrest. Studies of the vulnerability of tourism to global shocks include the impacts of climate change, economic crisis, global public health pandemics, oil price shocks, and acts of terrorism. It is clear that tourism has contributed significantly to economic development globally, but its role in sustainable development is uncertain, debatable, and potentially contradictory.

Article

Sea Level Rise and Coastal Management  

James B. London

Coastal zone management (CZM) has evolved since the enactment of the U.S. Coastal Zone Management Act of 1972, which was the first comprehensive program of its type. The newer iteration of Integrated Coastal Zone Management (ICZM), as applied to the European Union (2000, 2002), establishes priorities and a comprehensive strategy framework. While coastal management was established in large part to address issues of both development and resource protection in the coastal zone, conditions have changed. Accelerated rates of sea level rise (SLR) as well as continued rapid development along the coasts have increased vulnerability. The article examines changing conditions over time and the role of CZM and ICZM in addressing increased climate-related vulnerabilities along the coast. It argues that effective adaptation strategies will require a sound information base and an institutional framework that appropriately addresses the risk of development in the coastal zone. The information base has improved through recent advances in technology and geospatial data quality. Critical for decision-makers will be sound information to identify vulnerabilities, formulate options, and assess the viability of a set of adaptation alternatives. The institutional framework must include the political will to act decisively and send the right signals to encourage responsible development patterns. At the same time, as communities are likely to bear higher costs for adaptation, it is important that they are given appropriate tools to effectively weigh alternatives, including the cost avoidance associated with corrective action. Adaptation strategies must be proactive and anticipatory. Failure to act strategically will be fiscally irresponsible.

Article

Seed Banking as Future Insurance Against Crop Collapses  

Fiona Hay

Food security is dependent on the work of plant scientists and breeders who develop new varieties of crops that are high yielding, nutritious, and tolerate a range of biotic and abiotic stresses. These scientists and breeders need access to novel genetic material to evaluate and to use in their breeding programs; seed- (gene-)banks are the main source of novel genetic material. There are more than 1,750 genebanks around the world that are storing the orthodox (desiccation tolerant) seeds of crops and their wild relatives. These seeds are stored at low moisture content and low temperature to extend their longevity and ensure that seeds with high viability can be distributed to end-users. Thus, seed genebanks serve two purposes: the long-term conservation of plant genetic resources, and the distribution of seed samples. Globally, there are more than 7,400,000 accessions held in genebanks; an accession is a supposedly distinct, uniquely identifiable germplasm sample which represents a particular landrace, variety, breeding line, or population. Genebank staff manage their collections to ensure that suitable material is available and that the viability of the seeds remains high. Accessions are regenerated if viability declines or if stocks run low due to distribution. Many crops come under the auspices of the International Treaty on Plant Genetic Resources for Food and Agriculture and germplasm is shared using the Standard Material Transfer Agreement. The Treaty collates information on the sharing of germplasm with a view to ensuring that farmers ultimately benefit from making their agrobiodiversity available. Ongoing research related to genebanks covers a range of disciplines, including botany, seed and plant physiology, genetics, geographic information science, and law.
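As background to the storage regime described above, seed biology conventionally quantifies how moisture and temperature govern longevity with the Ellis-Roberts viability equations (a standard model from the seed science literature, not a result of this article):

v = K_i - \frac{p}{\sigma}, \qquad \log_{10}\sigma = K_E - C_W \log_{10} m - C_H T - C_Q T^2

Here v is viability (in probits) after p days in storage, K_i is the initial viability of the seed lot, σ is the time for viability to fall by one probit, m is seed moisture content (%), T is storage temperature (°C), and K_E, C_W, C_H, and C_Q are species-specific constants. Because σ increases as m and T fall, drying orthodox seeds and cooling them extends longevity, which is exactly what genebank storage exploits.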

Article

Sentinel Species of Marine Ecosystems  

Maria Cristina Fossi and Cristina Panti

A vigorous effort to identify and study sentinel species of marine ecosystems in the world’s oceans has developed over the past 50 years. The One Health concept recognizes that the health of humans is connected to the health of animals and the environment. Species ranging from invertebrates to large marine vertebrates have acted as “sentinels” of exposure to environmental stressors and of health impacts on the environment that may also affect human health. Sentinel species can signal warnings, at different levels, about the potential impacts on a specific ecosystem. These warnings can help manage the abiotic and anthropogenic stressors (e.g., climate change, chemical and microbial pollutants, marine litter) affecting ecosystems, biota, and human health. The effects of exposure to multiple stressors, including pollutants, in the marine environment may be seen at multiple trophic levels of the ecosystem. Attention has focused on the large marine vertebrates, for several reasons. In the past, the use of large marine vertebrates in monitoring and assessing the marine ecosystem was criticized: the fact that these species are pelagic and highly mobile led to the suggestion that they are not useful indicators or sentinel species. In recent years, however, an alternative view has emerged: when we have a sufficient understanding of differences in species distribution and behavior in space and time, these species can be extremely valuable sentinels of environmental quality. Knowledge of the status of large vertebrate populations is crucial for understanding the health of the ecosystem and instigating mitigation measures for the conservation of large vertebrates. For example, it is well known that the various cetacean species exhibit different home ranges and occupy different habitats. This knowledge can be used in “hot spot” areas, such as the Mediterranean Basin, where different species can serve as sentinels of marine environmental quality. Organisms that have relatively long life spans (such as cetaceans) allow for the study of chronic diseases, including reproductive alterations, abnormalities in growth and development, and cancer. As apex predators, marine mammals feed at or near the top of the food chain; as a result of biomagnification, the levels of anthropogenic contaminants found in the tissues of top predators and long-lived species are typically high. Finally, the application of consistent examination procedures and biochemical, immunological, and microbiological techniques, combined with pathological examination and behavioral analysis, has led to the development of health assessment methods at the individual and population levels in wild marine mammals. With these tools in hand, investigators have begun to explore and understand the relationships between exposures to environmental stressors and a range of disease end points in sentinel species (ranging from invertebrates to marine mammals) as indicators of ecosystem health and harbingers of human health and well-being.