Sumit Sharma, Liliana Nunez, and Veerabhadran Ramanathan
Atmospheric brown clouds (ABCs) are widespread pollution clouds that can at times span an entire continent or an ocean basin. ABCs extend vertically from the ground upward to as high as 3 km, and they consist of both aerosols and gases. They include anthropogenic aerosols such as sulfates, nitrates, organics, and black carbon, as well as natural dust aerosols. The gaseous pollutants that contribute to the formation of ABCs are NOx (nitrogen oxides), SOx (sulfur oxides), VOCs (volatile organic compounds), CO (carbon monoxide), CH4 (methane), and O3 (ozone). The brownish color of the cloud (visible when looking at the horizon) is due to absorption of solar radiation at short wavelengths (green, blue, and UV) by organic and black carbon aerosols as well as by NOx. While the local nature of ABCs around polluted cities has been known since the early 1900s, the widespread transoceanic and transcontinental nature of ABCs, as well as their large-scale effects on climate, the hydrological cycle, and agriculture, was discovered inadvertently by the Indian Ocean Experiment (INDOEX), an international experiment conducted in the 1990s over the Indian Ocean. A major discovery of INDOEX was that ABCs caused drastic dimming at the surface. The magnitude of the dimming was as large as 10–20% (based on a monthly average) over vast land and ocean regions. The dimming was shown to be accompanied by significant atmospheric absorption of solar radiation by black and brown carbon (a form of organic carbon). Black and brown carbon, ozone, and methane contribute as much as 40% to anthropogenic radiative forcing. The dimming by sulfates, nitrates, and carbonaceous (black and organic carbon) species has been shown to disrupt and weaken the monsoon circulation over southern Asia. In addition, the ozone in ABCs leads to a significant decrease in agricultural yields (by as much as 20–40%) in polluted regions. Most significantly, the near-surface aerosols in ABCs lead to about 4 million premature deaths every year. Technological and regulatory measures are available to mitigate most of the pollution that gives rise to ABCs. The importance of ABCs to global environmental problems led the United Nations Environment Programme (UNEP) to form the Atmospheric Brown Cloud (ABC) project.
In 2018 barley accounted for only 5% of cereal production worldwide, though regionally for up to 40% of cereal production. The cereal is among the oldest crop species and is one of the crop plants best adapted to a broad diversity of climates and environments.
Originating from the wild progenitor species Hordeum vulgare ssp. spontaneum, biogeographically located in the Fertile Crescent of the Near East, the domesticated form developed as a founder crop in aceramic Neolithic societies 11,000 years ago, was cultivated in monocultures in Bronze Age Mesopotamia, and entered the New World after 1492.
Its stress tolerance in response to increased aridity and salinity on the one hand, and its adaptability to cool climates on the other, partially explain its broad range of applications for subsistence and economy across different cultures, such as baking, cooking, beer brewing, and animal feed.
Although the use of fermented starch for producing alcoholic beverages and foods is globally documented in archaeological contexts dating from at least the beginning of the Holocene era, it becomes concrete only in societies with a written culture, such as Bronze Age Mesopotamia and Egypt, where beer played a considerable role in everyday diet and its production represented an important sector of productivity.
In 2004 approximately 85% of barley production was destined for animal feed. However, as a component of the human diet, barley has been found in studies of its micronutrients to have a positive effect on blood cholesterol and glucose levels, which in turn benefits cardiovascular health and diabetes control. An increasing number of barley-breeding programs worldwide focus on improving the processing characteristics, nutritional value, and stress tolerance of barley within the context of global climate change.
Kevin J. Boyle and Christopher F. Parmeter
Benefit transfer is the projection of benefits from one place and time to another time at the same place or to a new place. Thus, benefit transfer includes both the adaptation of an original study to a new policy application at the same location and its adaptation to a different location. The appeal of benefit transfer is that it can be cost effective, both monetarily and in time. Using previous studies, analysts can select existing results to construct a transferred value for the amenity affected by the policy change. Benefit transfer practices are not unique to the valuation of ecosystem services and are generally applicable to a variety of changes in ecosystem services. An ideal benefit transfer will scale value estimates to both the ecosystem services being valued and the preferences of those who hold the values. The article outlines the steps in a benefit transfer, the types of transfers, the accuracy of transferred values, and the challenges of conducting ecosystem service transfers, and it ends with recommendations for the implementation of benefit transfers to support decision-making.
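To make the value-scaling step concrete, here is a minimal sketch (in Python, with purely hypothetical numbers) of a common unit value transfer that adjusts a study-site estimate by the ratio of policy-site to study-site income raised to an assumed income elasticity of willingness to pay; the function name, figures, and elasticity are illustrative and not drawn from the article.

```python
def income_adjusted_transfer(value_study, income_study, income_policy, elasticity=1.0):
    """Unit value transfer with an income-elasticity adjustment.

    value_study   : willingness to pay per household at the study site
    income_study  : mean household income at the study site
    income_policy : mean household income at the policy site
    elasticity    : assumed income elasticity of willingness to pay
    """
    return value_study * (income_policy / income_study) ** elasticity

# Hypothetical numbers: a $45/household wetland value transferred to a
# policy site with 20% higher income, assuming an elasticity of 0.7.
transferred = income_adjusted_transfer(45.0, 50_000, 60_000, elasticity=0.7)
print(round(transferred, 2))  # ~51.13
```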
Lora Fleming, Niccolò Tempini, Harriet Gordon-Brown, Gordon L. Nichols, Christophe Sarran, Paolo Vineis, Giovanni Leonardi, Brian Golding, Andy Haines, Anthony Kessel, Virginia Murray, Michael Depledge, and Sabina Leonelli
Big data refers to large, complex, potentially linkable data from diverse sources, ranging from the genome and social media, to individual health information and the contributions of citizen science monitoring, to large-scale long-term oceanographic and climate modeling and its processing in innovative and integrated “data mashups.” Over the past few decades, thanks to the rapid expansion of computer technology, there has been a growing appreciation for the potential of big data in environment and human health research.
The promise of big data mashups in environment and human health includes the ability to truly explore and understand the “wicked environment and health problems” of the 21st century, from tracking the global spread of the Zika and Ebola virus epidemics to modeling future climate change impacts and adaptation at the city or national level. Other opportunities include the possibility of identifying environment and health hot spots (i.e., locations where people and/or places are at particular risk), where innovative interventions can be designed and evaluated to prevent or adapt to climate and other environmental change over the long term with potential (co-) benefits for health; and of locating and filling gaps in existing knowledge of relevant linkages between environmental change and human health. There is the potential for the increasing control of personal data (both access to and generation of these data), benefits to health and the environment (e.g., from smart homes and cities), and opportunities to contribute via citizen science research and share information locally and globally.
At the same time, there are challenges inherent with big data and data mashups, particularly in the environment and human health arena. Environment and health represent very diverse scientific areas with different research cultures, ethos, languages, and expertise. Equally diverse are the types of data involved (including time and spatial scales, and different types of modeled data), often with no standardization of the data to allow easy linkage beyond time and space variables, as data types are mostly shaped by the needs of the communities where they originated and have been used. Furthermore, these “secondary data” (i.e., data re-used in research) are often not even originated for this purpose, a particularly relevant distinction in the context of routine health data re-use. And the ways in which the research communities in health and environmental sciences approach data analysis and synthesis, as well as statistical and mathematical modeling, are widely different.
There is a lack of trained personnel who can span these interdisciplinary divides or who have the necessary expertise in the techniques that make adequate bridging possible, such as software development, big data management and storage, and data analyses. Moreover, health data have unique challenges due to the need to maintain confidentiality and data privacy for the individuals or groups being studied, to evaluate the implications of shared information for the communities affected by research and big data, and to resolve the long-standing issues of intellectual property and data ownership occurring throughout the environment and health fields. As with other areas of big data, the new “digital data divide” is growing, where some researchers and research groups, or corporations and governments, have access to data and computing resources while others do not, even as citizen participation in research initiatives is increasing. Finally, with the exception of some business-related activities, funding, especially with the aim of encouraging the sustainability and accessibility of big data resources (from personnel to hardware), is currently inadequate; there is widespread disagreement over what business models can support long-term maintenance of data infrastructures, and those that exist now are often unable to deal with the complexity and resource-intensive nature of maintaining and updating these tools.
Nevertheless, researchers, policy makers, funders, governments, the media, and members of the general public are increasingly recognizing the innovation and creativity potential of big data in environment and health and many other areas. This can be seen in how the relatively new and powerful movement of Open Data is being crystalized into science policy and funding guidelines. Some of the challenges and opportunities, as well as some salient examples, of the potential of big data and big data mashup applications to environment and human health research are discussed.
Human activities in the Anthropocene are influencing the twin processes of biodiversity generation and loss in complex ways that threaten the maintenance of biodiversity levels that underpin human well-being. Yet many scientists and practitioners still present a simplistic view of biodiversity as a static stock rather than one determined by a dynamic interplay of feedback processes that are affected by anthropogenic drivers. Biodiversity describes the variety of life on Earth, from the genes within an organism to the ecosystem level. However, this article focuses on variation among living organisms, both within and between species. Within species, biodiversity is reflected in genetic, and consequent phenotypic, variations among individuals. Genetic diversity is generated by germ line mutations, genetic recombination during sexual reproduction, and immigration of new genotypes into populations. Across species, biodiversity is reflected in the number of different species present and also, by some metrics, in the evenness of their relative abundance. At this level, biodiversity is generated by processes of speciation and immigration of new species into an area. Anthropogenic drivers affect all these biodiversity generation processes, while the levels of genetic diversity can feed back and affect the level of species diversity, and vice versa. Therefore, biodiversity maintenance is a complex balance of processes and the biodiversity levels at any point in time may not be at equilibrium.
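To illustrate the species-level metrics mentioned here (richness and evenness of relative abundance), the following sketch computes the Shannon diversity index and Pielou's evenness for two hypothetical communities; the data and function names are illustrative only.

```python
import math

def shannon_index(abundances):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(abundances)
    props = [n / total for n in abundances if n > 0]
    return -sum(p * math.log(p) for p in props)

def pielou_evenness(abundances):
    """Pielou's evenness J' = H' / ln(richness), ranging from 0 to 1."""
    richness = sum(1 for n in abundances if n > 0)
    return shannon_index(abundances) / math.log(richness)

# Two hypothetical communities with the same richness (4 species) but
# different evenness: the second is dominated by a single species.
even_community = [25, 25, 25, 25]
skewed_community = [85, 5, 5, 5]
print(round(shannon_index(even_community), 2), round(pielou_evenness(even_community), 2))      # 1.39 1.0
print(round(shannon_index(skewed_community), 2), round(pielou_evenness(skewed_community), 2))  # 0.59 0.42
```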
A major concern for humans is that our activities are driving rapid losses of biodiversity, which outweigh by orders of magnitude the processes of biodiversity generation. A wide range of species and genetic diversity could be necessary for the provision of ecosystem functions and services (e.g., in maintaining the nutrient cycling, plant productivity, pollination, and pest control that underpin crop production). The importance of biodiversity becomes particularly marked over longer time periods, and especially under varying environmental conditions.
In terms of biodiversity losses, there are natural processes that cause roughly continuous, low-level losses, but there is also strong evidence from fossil records for transient events in which exceptionally large loss of biodiversity has occurred. These major extinction episodes are thought to have been caused by various large-scale environmental perturbations, such as volcanic eruptions, sea-level falls, climatic changes, and asteroid impacts. From all these events, biodiversity has shown recovery over subsequent calmer periods, although the composition of higher-level evolutionary taxa can be significantly altered.
In the modern era, biodiversity appears to be undergoing another mass extinction event, driven by large-scale human impacts. The primary mechanisms of biodiversity loss caused by humans vary over time and by geographic region, but they include overexploitation, habitat loss, climate change, pollution (e.g., nitrogen deposition), and the introduction of non-native species. It is worth noting that human activities may also lead to increases in biodiversity in some areas through species introductions and climatic changes, although these overall increases in species richness may come at the cost of loss of native species, and with uncertain effects on ecosystem service delivery. Genetic diversity is also affected by human activities, with many examples of erosion of diversity through crop and livestock breeding or through the decline in abundance of wild species populations. Significant future challenges are to develop better ways to monitor the drivers of biodiversity loss and biodiversity levels themselves, making use of new technologies, and improving coverage across geographic regions and taxonomic scope. Rather than treating biodiversity as a simple stock at equilibrium, developing a deeper understanding of the complex interactions—both between environmental drivers and between genetic and species diversity—is essential to manage and maintain the benefits that biodiversity delivers to humans, as well as to safeguard the intrinsic value of the Earth’s biodiversity for future generations.
Peter Kareiva and Isaac Kareiva
The concept of biodiversity hotspots arose as a science-based framework with which to identify high-priority areas for habitat protection and conservation—often in the form of nature reserves. The basic idea is that with limited funds and competition from humans for land, we should use range maps and distributional data to protect areas that harbor the greatest biodiversity and that have experienced the greatest habitat loss. In its early application, much analysis and scientific debate went into asking the following questions: Should all species be treated equally? Do endemic species matter more? Should the magnitude of threat matter? Does evolutionary uniqueness matter? And if one has good data on one broad group of organisms (e.g., plants or birds), does it suffice to focus on hotspots for a few taxonomic groups and then expect to capture all biodiversity broadly? Early applications also recognized that hotspots could be identified at a variety of spatial scales—from global to continental, to national to regional, to even local. Hence, within each scale, it is possible to identify biodiversity hotspots as targets for conservation.
In the last 10 years, the concept of hotspots has been enriched to address some key critiques, including the problem of ignoring areas that have low biodiversity but are nonetheless highly valued because of charismatic wild species or critical ecosystem services. Analyses revealed that although the spatial correlation between high-diversity areas and high-ecosystem-service areas is low, it is possible to use quantitative algorithms that achieve both high protection for biodiversity and high protection for ecosystem services without increasing the required area as much as might be expected.
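The quantitative algorithms referred to are typically complementarity-based site-selection procedures. The sketch below is a deliberately simplified, hypothetical greedy selection that scores candidate sites on both newly covered species and an ecosystem-service value; it is illustrative only and is not the specific algorithm used in the analyses cited.

```python
# Hypothetical candidate sites, each with a species list and a service score.
sites = {
    "A": {"species": {"s1", "s2", "s3"}, "service": 2.0},
    "B": {"species": {"s3", "s4"}, "service": 8.0},
    "C": {"species": {"s5", "s6", "s7"}, "service": 1.0},
    "D": {"species": {"s1", "s7"}, "service": 5.0},
}

def select_sites(sites, budget, service_weight=0.5):
    """Greedily pick sites until the budget (number of sites) is spent,
    scoring each candidate by the species it adds to the covered set plus
    a weighted ecosystem-service value."""
    covered, chosen = set(), []
    for _ in range(budget):
        best = max(
            (s for s in sites if s not in chosen),
            key=lambda s: len(sites[s]["species"] - covered)
                          + service_weight * sites[s]["service"],
        )
        chosen.append(best)
        covered |= sites[best]["species"]
    return chosen, covered

chosen, covered = select_sites(sites, budget=2)
print(chosen, sorted(covered))  # ['B', 'D'] ['s1', 's3', 's4', 's7']
```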
Currently, a great deal of research is aimed at determining the impact of climate change on biodiversity hotspots, as well as the extent to which conservation can maintain high biodiversity in the face of climate change. Two important approaches are detailed models and statistical assessments that relate species distributions to climate, and, alternatively, “conserving the stage” for high biodiversity, whereby the stage comprises regions with topographies or habitat heterogeneity of the sort expected to generate high species richness.
Finally, conservation planning has most recently embraced what is in some sense the inverse of biodiversity hotspots—what we might call conservation wastelands. This approach recognizes that in the Anthropocene epoch, human development and infrastructure are so vast that, in addition to using data to identify biodiversity hotspots, we should use data to identify highly degraded habitats and ecosystems. These degraded lands can then become priority development areas—for wind farms, solar energy facilities, oil palm plantations, and so forth. Conservation plans therefore commonly pair maps of biodiversity hotspots with maps of degraded lands that highlight areas for development. By putting the two maps together, it should be possible to achieve much more effective conservation, because habitat is provided for species alongside room for economic development—something that can attract broader political support than highlighting biodiversity hotspots alone.
Although the concept of biodiversity emerged 30 years ago, patterns and processes influencing ecological diversity have been studied for more than a century. Historically, ecological processes tended to be considered as occurring in local habitats that were spatially homogeneous and temporally at equilibrium. Initially considered as a constraint to be avoided in ecological studies, spatial heterogeneity was progressively recognized as critical for biodiversity. This resulted, in the 1970s, in the emergence of a new discipline, landscape ecology, whose major goal is to understand how spatial and temporal heterogeneity influence biodiversity. To achieve this goal, researchers came to realize that a fundamental issue revolves around how they choose to conceptualize and measure heterogeneity. Indeed, observed landscape patterns and their apparent relationship with biodiversity often depend on the scale of observation and the model used to describe the landscape. Due to the strong influence of island biogeography, landscape ecology has focused primarily on spatial heterogeneity. Several landscape models were conceptualized, allowing for the prediction and testing of distinct but complementary effects of landscape heterogeneity on species diversity. We now have ample empirical evidence that patch structure, patch context, and mosaic heterogeneity all influence biodiversity. More recently, the increasing recognition of the role of temporal scale has led to the development of new conceptual frameworks acknowledging that landscapes are not only heterogeneous but also dynamic. The current challenge remains to truly integrate both spatial and temporal heterogeneity in studies on biodiversity. This integration is even more challenging when considering that biodiversity often responds to environmental changes with considerable time lags, and multiple drivers of global changes are interacting, resulting in non-additive and sometimes antagonistic effects. Recent technological advances in remote sensing, the availability of massive amounts of data, and long-term studies represent, however, very promising avenues to improve our understanding of how spatial and temporal heterogeneity influence biodiversity.
James M. MacDonald
Industrialized livestock production can be characterized by five key attributes: confinement feeding of animals, separation of feed and livestock production, specialization, large size, and close vertical linkages with buyers. Industrialized livestock operations—popularly known as CAFOs, for Concentrated Animal Feeding Operations—have spread rapidly in developed and developing countries; by the early 21st century, they accounted for three quarters of poultry production and over half of global pork production, and held a growing foothold in dairy production.
Industrialized systems have created significant improvements in agricultural productivity, leading to greater output of meat and dairy products for given commitments of land, feed, labor, housing, and equipment. They have also been effective at developing, applying, and disseminating research leading to persistent improvements in animal genetics, breeding, feed formulations, and biosecurity. The reduced prices associated with productivity improvements support increased meat and dairy product consumption in low- and middle-income countries, while reducing the resources used for such consumption in higher-income countries.
The high stocking densities associated with confined feeding also exacerbate several social costs associated with livestock production. Animals in high-density environments may be exposed to diseases, subject to attacks from other animals, and unable to engage in natural behaviors, raising concerns about higher levels of fear, pain, stress, and boredom. Such animal welfare concerns have gained greater salience in recent years.
By consolidating large numbers of animals in a location, industrial systems also concentrate animal wastes, often at levels that exceed the capacity of local cropland to absorb the nutrients in manure. While the productivity improvements associated with industrial systems reduce the resource demands of agriculture, excessive localized concentrations of manure can lead to environmental damage through contamination of ground and surface water and through volatilization of nitrogen nutrients into airborne pollutants.
Finally, animals in industrialized systems are often provided with antibiotics in their feed or water, in order to treat and prevent disease, but also to realize improved feed absorption (“a production purpose”). Bacteria are developing resistance to many important antibiotic drugs; the extensive use of such drugs in human and animal medicine has contributed to the spread of antibiotic resistance, with consequent health risks to humans.
The social costs associated with industrialized production have led to a range of regulatory interventions, primarily in North America and Europe, as well as private sector attempts to alter the incentives that producers face through the development of labels and through associated adjustments within supply chains.
Elisabeth N. Bui
Driving forces for natural soil salinity and alkalinity are climate, rock weathering, ion exchange, and mineral equilibria reactions that ultimately control the chemical composition of soil and water. The major weathering reactions that produce soluble ions are tabulated. Where evapotranspiration is greater than precipitation, downward water movement is insufficient to leach solutes out of the soil profile and salts can precipitate. Microbes involved in organic matter mineralization, and thus in the carbon, nitrogen, and sulfur biogeochemical cycles, are also implicated. Seasonal contrast and evaporative concentration during dry periods accelerate short-term oxidation-reduction reactions and the local and regional accumulation of carbonate and sulfur minerals. The presence of salts and alkaline conditions, together with the occurrence of drought and seasonal waterlogging, creates some of the most extreme soil environments, where only specially adapted organisms are able to survive. Sodic soils are alkaline, rich in sodium carbonates, with an exchange complex dominated by sodium ions. Such sodic soils, when low in other salts, exhibit dispersive behavior and are difficult to manage for cropping. Maintaining the productivity of sodic soils requires control of the flocculation-dispersion behavior of the soil. Poor land management can also lead to anthropogenically induced secondary salinity. New developments in physical chemistry are providing insights into ion exchange and how it controls flocculation-dispersion in soil. New water and solute transport models are enabling better options for the remediation of saline and/or sodic soils.
Vic Adamowicz and Diane Dupont
A number of challenges are faced by practitioners seeking to elicit values associated with water in a world of global change. These values are needed to assist in decision-making around the use of water as a country’s key asset. Five different pathways show the complexity of the relationship between global change and environmental valuation of water: a climate change pathway, ecosystem infrastructure pathway, population/demographics pathway, income pathway, and technological change/innovation pathway. The challenges are most acute for water when it is related to ecosystem services since values need to be elicited through the use of non-market survey-based valuation techniques. In addition, environmental valuation will be important to inform the determination of water quality standards associated with different uses of water (drinking, recreation, etc.) and the allocation of resources to provide these different services. Several case studies illustrate issues and solutions. The article concludes with an appreciation of future challenges and opportunities.
Mental and behavioral disorders account for approximately 7.4% of the global burden of disease, with depression now the world’s leading cause of disability. One in four people in the world will suffer from a mental health problem at some point in their life. City planning and design hold much promise for reducing this burden of disease and for offering solutions that are affordable, accessible, and equitable. Increasingly, urban green space is recognized as an important social determinant of health, with the potential to protect mental health – for example, by buffering against life stressors – as well as to relieve the symptom severity of specific psychiatric disorders. Pathways linking urban green space with mental wellbeing include the ability of natural stimuli – trees, water, light patterns – to promote ‘involuntary attention’, allowing the brain to disengage and recover from cognitive fatigue. This article brings together evidence of the positive effects of urban green space on common mental health problems (i.e. stress, anxiety, depression) together with evidence of its role in the symptom relief of specific psychiatric disorders, including schizophrenia and psychosis, post-traumatic stress disorder (PTSD), dementia, attention deficit/hyperactivity disorder (ADHD), and autism. Urban green space is a potential force for building mental health: city planners, urban designers, policy makers, and public health professionals need to maximize the opportunities of applying green space strategies both in preventing mental ill health and in supporting its treatment.
Paolo Inglese and Giuseppe Sortino
Every year since 1857, in May, in the great park of Sans-Souci in Potsdam just outside Berlin—a park begun in 1745 by King Frederick II of Hohenzollern and expanded a century later by Frederick William IV—the doors of the great Orangerie open and a Renaissance-style garden called the Sizilianischer Garten is set up. Large olive and citrus trees are brought outdoors on horse-drawn carriages and arranged in the garden.
For the young European who, in the second half of the 18th century and in the first decades of the following century, traveled to Italy to see and study Renaissance culture and the remains of Greek civilization, the citrus species, fruits, and groves of southern Italy became the ultimate symbol of beauty and a sort of status symbol of wealth, particularly that of landowners. Nothing is more expressive of the fascination of their fruit than Abu-l-Hasan Ali’s 12th-century writings: “Come on, enjoy your harvested orange: happiness is present when it is present. / Welcome the cheeks of the branches, and welcome the stars of the trees! / It seems that the sky has lavished gold and that the earth has formed some shiny spheres.”
Indeed, Citrus spp. are among the most important and most widely consumed fruit crops worldwide. Their co-evolution with millennia of agricultural use resulted in a complex of species and cultivated varieties derived from natural or induced mutations and from crossing and breeding of the “original” species (Citrus medica, Citrus maxima, Citrus reticulata, Fortunella japonica) and their main progenies (C. aurantium, C. sinensis, Citrus limon, Citrus paradisi, Citrus clementina, etc.). Citrus spread from the original tropical and subtropical regions of southeast Asia toward the Mediterranean countries of Europe and North Africa and, after 1492, to the Americas, not to mention South Africa and Australia, where they still play a very important role. Citrus species, wherever they have been cultivated, quickly became protagonists of letters and the arts, as well as of markets and gastronomy, and can even be found in religious ceremonies, such as the Feast of Tabernacles (Sukkot). Studies on Citrus botany, cultivation, and utilization have been pursued since the early stages of the fruit’s domestication and grew following its introduction to Europe, the Americas, Africa, and Australia. Citrus research involves many different aspects: the study of citrus origin and botanical classification; citrus growing, propagation, and orchard management; citrus fruit quality, utilization, and industry; citrus gardening and ornamentals; and citrus in the arts and manufacturing.
Soil salinity has been causing problems for agriculturists for millennia, primarily in irrigated lands. The importance of salinity issues is increasing, since large areas are affected by irrigation-induced salt accumulation. A wide knowledge base has been collected to better understand the major processes of salt accumulation and to choose the right method of mitigation. Two major types of soil salinity are distinguished because of their different properties and mitigation requirements. The first is caused mostly by a large salt concentration and is called saline soil, typically corresponding to Solonchak soils. The second is caused mainly by the dominance of sodium in the soil solution or on the soil exchange complex. This latter type is called “sodic” soil, corresponding to Solonetz soils. Saline soils have homogeneous soil profiles with relatively good soil structure, and their appropriate mitigation measure is leaching. Naturally sodic soils have markedly different horizons and unfavorable physical properties, such as low permeability, swelling, plasticity when wet, and hardness when dry, and their limitations for agriculture are typically mitigated by applying gypsum. Salinity and sodicity need to be chemically quantified before deciding on the proper management strategy. The most complex management and mitigation of salinized irrigated lands involves modern engineering, including calculations of irrigation water rates and reclamation materials, provisions for drainage, and drainage disposal. Mapping-oriented soil classification was developed for naturally saline and sodic soils and inherited the first soil categories introduced more than a century ago, such as Solonchak and Solonetz, which appear in most of the 24 soil classification systems currently in use. USDA Soil Taxonomy is one exception, using names composed of formative elements.
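As a rough illustration of that chemical quantification, the sketch below applies the widely used USDA-style thresholds (electrical conductivity of the saturation extract above 4 dS/m for saline soils, exchangeable sodium percentage above 15 for sodic soils); the function and sample values are hypothetical, and real diagnosis also considers pH, sodium adsorption ratio, and soil texture.

```python
def classify_salt_affected_soil(ec_e, esp):
    """Classify a soil using common USDA-style thresholds.

    ec_e : electrical conductivity of the saturation extract (dS/m)
    esp  : exchangeable sodium percentage (%)
    """
    saline = ec_e > 4.0   # high total soluble salts
    sodic = esp > 15.0    # exchange complex dominated by sodium
    if saline and sodic:
        return "saline-sodic"   # usually needs gypsum plus leaching
    if saline:
        return "saline"         # leaching is the main remedy
    if sodic:
        return "sodic"          # gypsum (Ca-for-Na exchange), then leaching
    return "non-salt-affected"

# Hypothetical saturation-extract measurements for three soil samples.
for ec_e, esp in [(8.2, 6.0), (1.5, 22.0), (9.0, 30.0)]:
    print(ec_e, esp, classify_salt_affected_soil(ec_e, esp))
```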
Confidence in the projected impacts of climate change on agricultural systems has increased substantially since the first Intergovernmental Panel on Climate Change (IPCC) reports. In Africa, much work has gone into downscaling global climate models to understand regional impacts, but there remains a dearth of local-level understanding of impacts and of communities’ capacity to adapt. It is well understood that Africa is vulnerable to climate change, not only because of its high exposure to climate change, but also because many African communities lack the capacity to respond or adapt to its impacts. Warming trends have already become evident across the continent, and it is likely that the continent’s mean annual temperature change relative to 2000 will exceed +2°C by 2100. Added to this warming trend, changes in precipitation patterns are also of concern: even if rainfall remains constant, increasing temperatures will amplify existing water stress, putting even more pressure on agricultural systems, especially in semiarid areas. In general, high temperatures and changes in rainfall patterns are likely to reduce cereal crop productivity, and new evidence is emerging that high-value perennial crops will also be negatively impacted by rising temperatures. Pressures from pests, weeds, and diseases are also expected to increase, with detrimental effects on crops and livestock.
Much of African agriculture’s vulnerability to climate change lies in the fact that its agricultural systems remain largely rain-fed and underdeveloped, as the majority of Africa’s farmers are small-scale farmers with few financial resources, limited access to infrastructure, and disparate access to information. At the same time, as these systems are highly reliant on their environment, and farmers are dependent on farming for their livelihoods, their diversity, context specificity, and the existence of generations of traditional knowledge offer elements of resilience in the face of climate change. Overall, however, the combination of climatic and nonclimatic drivers and stressors will exacerbate the vulnerability of Africa’s agricultural systems to climate change, but the impacts will not be universally felt. Climate change will impact farmers and their agricultural systems in different ways, and adapting to these impacts will need to be context-specific.
Adaptation efforts are increasing across the continent, but it is expected that in the long term these will be insufficient to enable communities to cope with the changes brought by longer-term climate change. African farmers are increasingly adopting a variety of conservation and agroecological practices such as agroforestry, contouring, terracing, mulching, and no-till. These practices have the twin benefits of lowering carbon emissions while adapting to climate change, as well as broadening the sources of livelihoods for poor farmers, but there are constraints to their widespread adoption. These challenges range from insecure land tenure to difficulties with knowledge-sharing.
While African agriculture faces exposure to climate change as well as broader socioeconomic and political challenges, many of its diverse agricultural systems remain resilient. As the continent with the highest population growth rate, rapid urbanization trends, and rising GDP in many countries, Africa’s agricultural systems will need to become adaptive to more than just climate change as the uncertainties of the 21st century unfold.
In 1945 the Amazon biome was almost intact. The marks of ancient cultural developments in the Andean and lowland Amazon had cicatrized, and the impacts of rubber extraction and more recent resource exploitation were reversible. Very few roads existed, and only on the Amazon’s periphery. However, from the 1950s, and especially in the 1960s, Brazil and some Andean countries launched ambitious road-building and colonization programs. Amazon occupation intensified heavily in the 1970s, when forest losses began to raise worldwide concern. More roads continued to be built at a geometrically growing pace in every following decade, multiplying the correlated deforestation and forest degradation. A point of no return was reached when interoceanic roads crossed the Brazilian-Andean border in the 2000s, exposing the remaining safe havens for indigenous people and nature. It is commonly estimated that today no less than 18% of the forest has been replaced by agriculture and that over 60% of what remains has been significantly degraded.
Theories regarding the importance of biogeochemical cycles have been developed since the 1970s. The confirmation of the role of the Amazon as a carbon sink added some international pressure for its protection. But, in general, the many scientific discoveries regarding the Amazon have not helped to improve its conservation. Instead, a combination of new agricultural technologies, anthropocentric philosophies, and economic changes strongly promoted forest clearing.
Since the 1980s, Amazon conservation efforts have been increasingly diversified, covering five theoretically complementary strategies: (a) more, larger, and better-managed protected areas; (b) more and larger indigenous territories; (c) a series of “sustainable-use” options such as “community-based conservation,” sustainable forestry, and agroforestry; (d) financing of conservation through debt swaps and climate change-related financial mechanisms; and (e) better legislation and monitoring. Only five small protected areas existed in the Amazon as of the early 1960s, but, in response to the road-building boom of the 1970s, several larger areas aimed at conserving viable samples of biological diversity were set aside, principally in Brazil and Peru. Today around 22% of the Amazon is protected, but almost half of these areas correspond to categories that allow human presence and resource exploitation, and there is no effective management. Another 28% or more pertains to indigenous peoples, who may or may not conserve the forest. Both types of areas together cover over 45% of the Amazon. None of the strategies, either alone or in conjunction, has fully achieved its objectives, while development pressures and threats multiply as roads and deforestation continue relentlessly, with increasing funding from multilateral and national banks and under the influence of transnational enterprises.
The future is likely to see unprecedented agriculture expansion and corresponding intensification of deforestation and forest degradation even in protected areas and indigenous land. Additionally, the upper portion of the Amazon basin will be impacted by new, larger hydraulic works. Mining, formal as well as illegal, will increase and spread. Policymakers of Amazon countries still view the region as an area in which to expand conventional development while the South American population continues to be mostly indifferent to Amazon conservation.
Charles A. Francis
Adaptation of cropping systems to weather uncertainty and climate change is essential for resilient food production and long-term food security. Changes in climate result in substantial temporal modifications of cropping conditions, and rainfall and temperature patterns vary greatly with location. These challenges come at a time when global human population and demand for food are both increasing, and it appears to be difficult to find ways to satisfy growing needs with conventional systems of production. Agriculture in the future will need to feature greater biodiversity of crop species and appropriate design and management of cropping and integrated crop/animal systems. More diverse and longer-cycle crop rotations will need to combine sequences of annual row crops such as maize and soybean with close-drilled cereals, shallow-rooted with deep-rooted crops, summer crops with winter crops, and annuals with perennials in the same fields. Resilience to unpredictable weather will also depend on intercropping, with the creative arrangement of multiple interacting crop species to diversify the field and the landscape. Other multiple-cropping systems and strategies to integrate animals and crops will make more efficient use of natural resources and applied inputs; these include systems such as permaculture, agroforestry, and alley cropping. Future systems will be spatially diverse and adapted to specific fields, soil conditions, and unique agroecozones. Production resilience will be achieved by planting diverse combinations of species together in the same field, and economic resilience through producing a range of products that can be marketed through different channels. The creation of local food webs will be more appropriate in the future, as contrasted with the dominance of global food chains today. Materials considered “waste” from the food system, including human urine and feces, will become valuable resources to be cycled back into the natural environment and into food production. Due to the increasing scarcity of fertile land, the negative contributions of chemicals to environmental pollution, the costs of fossil fuels, and the potential for the economic and political disruption of supply chains, future systems will increasingly need to be local in character while still achieving adaptation to the most favorable conditions for each system and location. It is essential that biologically and economically resilient systems become productive and profitable, as well as environmentally sound and socially equitable, in order to contribute to stability of food production, security of the food supply, and food sovereignty, to the extent that this is possible. The food system cannot continue along the lines of “business as usual,” and its path will need to radically diverge from the recognized trends toward specialization and globalization of the early 21st century. The goal needs to shift from exploitation and short-term profits to conservation of resources, greater equity in distribution of benefits, and resilience in food supply, even with global climate change.
Muhammad Farooq, Ahmad Nawaz, and Faisal Nadeem
Planned crop rotation offers a pragmatic option to improve soil fertility, manage insect pests and diseases, and offset the emission of greenhouse gases. The inclusion of legume crops in crop rotations helps to reduce the use of external nitrogen inputs for legumes and other crops, because legumes can fix atmospheric nitrogen. This also helps to reduce the environmental pollution caused by volatilization and leaching of applied nitrogen. The inclusion of allelopathic crops in rotation may be useful for suppressing noxious weeds through the release of allelochemicals in the rhizosphere. The rotation of tap-rooted crops with shallow-rooted crops may result in efficient and productive use of nutrient resources and conservation of soil moisture. Continuous monoculture systems, by contrast, may cause a loss of biodiversity. Land fallowing is an efficient agricultural management technique mostly practiced in arid regions to capture rainwater and store it in the soil profile for later use in crop production. During fallowing, tillage operations are practiced to enhance moisture conservation in the soil. Keeping soil fallow for a season or more restores soil fertility through nutrient deposition; increases organic matter, microbial carbon, and soil microbial diversity; and improves the soil’s physical properties, including aggregate stability and reduced soil compaction due to decreased traffic. In addition, fallowing provides a biological means of pest (weed and insect) control by disrupting the life cycles of pests and decreasing reliance on pesticides. Land fallowing can help offset the emission of greenhouse gases from agricultural fields by reducing traffic and increasing carbon sequestration within the soil. Summer fallowing may help to preserve moisture in diverse soil types in the rainfed regions of the world, although it may reduce the carbon sequestration potential of soils over the long term. Energy resources are decreasing, and the inclusion of energy crops in crop rotation may be highly beneficial. Many of the processes, factors, and mechanisms involved in crop rotation and land fallowing are poorly understood and require further investigation.
Corn ranks first among crops in quantity produced globally, owing to its high yield and to its value as a food for humans and domestic animals. While its water-use efficiency is high compared to that of other crops, the production of high corn yields requires a great deal of water; the availability of water largely determines where the crop is grown. As a high-yielding grass species, corn also requires a substantial supply of nutrients (especially nitrogen) from external sources, including manufactured fertilizers and organic materials such as animal or green manures. This, along with the need to manage soils, weeds, insects, and diseases, makes corn production environmentally consequential.
Corn captures large quantities of sunlight energy through photosynthesis, but its production requires large external inputs of energy, coming mostly (in mechanized production) from fossil fuels. So even though the crop’s high yields moderate the environmental cost per unit of grain produced, minimizing the external environmental consequences of large-scale corn production is an important goal in the quest for greater sustainability of production of this important crop.
Shu Ting Chang and Solomon P. Wasser
The word mushroom may mean different things to different people in different countries. Specialist studies on the value of mushrooms and their products should therefore start from a clear definition of the term mushroom. In a broad sense, a mushroom is “a macrofungus with a distinctive fruiting body, which can be either epigeous or hypogeous and large enough to be seen with the naked eye and to be picked by hand.” Thus, mushrooms need not be members of the Basidiomycetes, as is commonly assumed, nor aerial, nor fleshy, nor edible. This definition is not perfect, but it has been accepted as a workable basis for estimating the number of mushrooms on Earth (approximately 16,000 species according to the rules of the International Code of Nomenclature). Most cultivated mushrooms are saprophytes and are heterotrophic for carbon compounds. Even though their cells have walls, they are devoid of chlorophyll and cannot perform photosynthesis. They are also devoid of vascular xylem and phloem. Furthermore, their cell walls contain chitin, which also occurs in the exoskeleton of insects and other arthropods. They absorb O2 and release CO2. In fact, they may be functionally more closely related to animal cells than to plant cells. They are, however, sufficiently distinct from both plants and animals to belong to a separate group, the Fungi Kingdom. They arise from lignocellulosic wastes, yet they become bountiful and nourishing. Mushrooms can greatly benefit environmental conditions. They biosynthesize their own food from agricultural crop residues, which, like solar energy, are readily available and which, as wastes, would otherwise cause health hazards. The spent compost/substrate can be used to grow other species of mushrooms, as fodder for livestock, as a soil conditioner and fertilizer, and in environmental bioremediation. The cultivation of mushrooms dates back many centuries; Auricularia auricula-judae, for example, has been cultivated since around 600 CE, with Lentinula edodes and Agaricus bisporus following later.
Mushrooms can be used as food, tonics, medicines, cosmeceuticals, and natural biocontrol agents in plant protection, with insecticidal, fungicidal, bactericidal, herbicidal, nematocidal, and antiphytoviral activities. The multidimensional nature of the global mushroom cultivation industry, its role in addressing critical issues faced by humankind, and its positive contributions are presented. Furthermore, mushrooms can serve as agents for promoting equitable economic growth in society. Since lignocellulosic wastes are available in every corner of the world, they can be put to use in the cultivation of mushrooms, which could thereby pilot a so-called white agricultural revolution in less developed countries and in the world at large. Mushrooms have a substantial impact on agriculture and the environment, and they have great potential for generating socio-economic benefits for human welfare at local, national, and global levels.
Assessing the environmental footprints of modern agriculture requires a balanced approach that sets the obviously negative effects (e.g., incidents of excessive input use) against the benefits stemming from increased resource-use efficiencies. In the case of rice production, the regular flooding of fields is a distinctive feature compared with other crops, one that directly or indirectly shapes diverse impacts on the environment. In the regional context of Southeast Asia, rice production is characterized by dynamic changes in crop management practices, so that environmental footprints can only be assessed from time-dependent developments rather than from a static view. The key to the Green Revolution in rice was the introduction of high-yielding varieties in combination with a sufficient water and nutrient supply as well as pest management. More recently, mechanization has emerged as a major trend in modern rice production. Mechanization has diverse environmental impacts and may also be instrumental in tackling the most drastic pollution source from rice production, namely, the open-field burning of straw. As modernization of rice production is imperative for future food supplies, there is scope for developing sustainable and high-yielding rice production systems by capitalizing on the positive aspects of modernization from a local to a global scale.