
Article

Agroforestry and Its Impact in Southeast Asia  

Christopher Hunt

Research during the late 20th and early 21st centuries found that traces of human intervention in vegetation in Southeast Asian and Australasian forests started extremely early, quite probably close to the first colonization of the region by modern people around or before 50,000 years ago. It also identified tentative, though still insubstantial, evidence for the translocation of economically important plants during the latest Pleistocene and Early Holocene. These activities may reflect early experiments with plants which evolved into agroforestry. Early in the Holocene, land management/food procurement systems, in which trees were a very significant component, seem to have developed over very extensive areas, often underpinned by dispersal of starchy plants, some of which seem to show domesticated morphologies, although the evidence for this is still relatively insubstantial. These land management/food procurement systems might be regarded as precursors to agroforestry. Similar systems were reported historically during early Western contact, and some agroforest systems survive to this day, although they are threatened in many places by expansion of other types of land use. The wide range of recorded agroforestry makes categorizing impacts problematic, but widespread disruption of vegetational succession across the region during the Holocene can perhaps be ascribed to agroforestry or similar land-management systems, and in more recent times impacts on biodiversity and geomorphological systems can be distinguished. Impacts of these early interventions in forests seem to have been variable and locally contingent, but what seem to have been agroforestry systems have persisted for millennia, suggesting that some may offer long-term sustainability.

Article

The Forest Transition  

Thomas Rudel

Forest transitions take place when trends over time in forest cover shift from deforestation to reforestation. These transitions are of immense interest to researchers because the shift from deforestation to reforestation brings with it a range of environmental benefits. The most important of these would be an increased volume of sequestered carbon, which if large enough would slow climate change. This anticipated atmospheric effect makes the circumstances surrounding forest transitions of immediate interest to policymakers in the climate change era. This encyclopedia entry outlines these circumstances. It begins by describing the socio-ecological foundations of the first forest transitions in western Europe. Then it discusses the evolution of the idea of a forest transition, from its introduction in 1990 to its latest iteration in 2019. This discussion describes the proliferation of different paths through the forest transition. The focus then shifts to a discussion of the primary driver of the 20th-century forest transitions, economic development, in its urbanizing, industrializing, and globalizing forms. The ecological dimension of the forest transition becomes the next focus of the discussion. It describes the worldwide redistribution of forests toward more upland settings. Climate change since 2000, with its more extreme ecological events in the form of storms and droughts, has obscured some ongoing forest transitions. The final segment of this entry focuses on the role of the state in forest transitions. States have become more proactive in managing forest transitions. This tendency became more marked after 2010 as governments searched for ways to reduce carbon emissions or to offset emissions through more carbon sequestration. Forest transitions, by promoting forest expansion, would contribute additional carbon offsets to a nation's carbon budget. For this reason, the era of climate change could also see an expansion in the number of promoted forest transitions.

Article

Early History of Animal Domestication in Southwest Asia  

Benjamin S. Arbuckle

The domestication of livestock animals has long been recognized as one of the most important and influential events in human prehistory and has been the subject of scholarly inquiry for centuries. Modern understandings of this important transition place it within the context of the origins of food production in the so-called Neolithic Revolution, where it is particularly well documented in southwest Asia. Here, a combination of archaeofaunal, isotopic, and DNA evidence suggests that sheep, goats, cattle, and pigs were first domesticated over a period of several millennia within sedentary communities practicing intensive cultivation beginning at the Pleistocene–Holocene transition. Resulting from more than a century of data collection, our understanding of the chronological and geographic features of the transition from hunting to herding indicates that the 9th millennium bce and the region of the northern Levant played crucial roles in livestock domestication. However, many questions remain concerning the nature of the earliest predomestic animal management strategies, the role of multiple regional traditions of animal management in the emergence of livestock, and the motivations behind the slow spread of integrated livestock husbandry systems, including all four domestic livestock species, that became widespread throughout southwest Asia only at the end of the Neolithic period.

Article

Agricultural Dispersals in Mediterranean and Temperate Europe  

Aurélie Salavert

Along with ceramics production, sedentism, and herding, agriculture is a major component of the Neolithic as it is defined in Europe. Therefore, the agricultural system of the first Neolithic societies and the dispersal of exogenous cultivated plants to Europe are the subject of many scientific studies. To work on these issues, archaeobotanists rely on residual plant remains—crop seeds, weeds, and wild plants—from archaeological structures like detritic pits and, less often, storage contexts. To date, no plant with an economic value has been identified as domesticated in Western Europe, except possibly opium poppy. The earliest seeds identified at archaeological sites date to about 5500–5200 bc in the Mediterranean and Temperate Europe. The cultivated plants identified were cereals (wheat and barley), an oleaginous plant (flax), and pulses (peas, lentils, and chickpeas). This crop package originated in the Fertile Crescent, where it was clearly established around 7500 bc (final Pre-Pottery Neolithic B), after a long, polycentric domestication process. From the middle of the 7th millennium bc, via the Balkan Peninsula, the pioneer Neolithic populations, with their specific economies, rapidly dispersed from east to west, following two main pathways. One was the maritime route over the northwestern basin of the Mediterranean (6200–5300 bc), and the other was the terrestrial and fluvial route in central and northwestern continental Europe (5500–4900 bc). On their trajectory, the agropastoral societies adapted the Neolithic founder crops from the Middle East to the new environmental conditions encountered in Western Europe. The Neolithic pioneers settled in an area that had experienced a long tradition of hunting and gathering. The Neolithization of Europe followed a colonization model. The Mesolithic groups, although exploiting plant resources such as hazelnut more or less intensively, did not significantly change the landscape. The impact of their settlements and their activities is hardly noticeable through palynology, for example. The control of the mode of reproduction of plants has certainly increased the prevalence of Homo sapiens, involving, among other things, a demographic increase and the ability to settle down in areas that had not been well adapted to year-round occupation up to that point. The characterization of past agricultural systems, such as crop plants, technical processes, and the impact of anthropogenic activities on the landscape, is essential for understanding the interrelation of human societies and the plant environment. This interrelation has undoubtedly changed deeply with the Neolithic Revolution.

Article

Agricultural Subsidies and the Environment  

Heather Williams

Worldwide, governments subsidize agriculture at the rate of approximately 1 billion dollars per day. This figure rises to about twice that when export and biofuels production subsidies and state financing for dams and river basin engineering are included. These policies guide land use in numerous ways, including growers' choices of crop and buyers' demand for commodities. The three types of state subsidies that shape land use and the environment are land settlement programs, price and income supports, and energy and emissions initiatives. Together these subsidies have created perennial surpluses in global stores of cereal grains, cotton, and dairy, with production increases outstripping population growth. Subsidies to land settlement, to crop prices, and to processing and refining of cereals and fiber, therefore, can be shown to have independent and largely deleterious effects on soil fertility, fresh water supplies, biodiversity, and atmospheric carbon.

Article

Human-Environmental Interrelationships and the Origins of Agriculture in Egypt and Sudan  

Simon Holdaway and Rebecca Phillipps

Northeast Africa forms an interesting case study for investigating the relationship between changes in environment and agriculture. Major climatic changes in the early Holocene led to dramatic changes in the environment of the eastern Sahara and to the habitation of previously uninhabitable regions. Research programs in the eastern Sahara have uncovered a wealth of archaeological evidence for sustained occupation during the African Humid Period, from about 11,000 years ago. Initial studies of faunal remains seemed to indicate early shifts in economic practice toward cattle pastoralism. Although this interpretation was much debated when it was first proposed, the possibility of early pastoralism stimulated discussion concerning the relationships between people and animals in particular environmental contexts, and ultimately led to questions concerning the role of agriculture imported from elsewhere in contrast to local developments. Did agriculture, or indeed cultivation and domestication more generally (sensu Fuller & Hildebrand, 2013), develop in North Africa, or were the concepts and species imported from Southwest Asia? And if agriculture did spread from elsewhere, were just the plants and animals involved, or was the shift part of a full socioeconomic suite that included new subsistence strategies, settlement patterns, technologies, and an agricultural “culture”? And finally, was this shift, wherever and however it originated, related to changes in the environment during the early to mid-Holocene? These questions refer to the “big ideas” that archaeologists explore, but before answers can be formed it is important to consider the nature of the material evidence on which they are based. Archaeologists must consider not only what they discover but also what might be missing. Materials from the past are preserved only in certain places, and of course some materials can be preserved better than others. 
In addition, people left behind the material remains of their activities, but in doing so they did not intend these remains to be an accurate historical record of their actions. Archaeologists need to consider how the remains found in one place may inform us about a range of activities that occurred elsewhere for which the evidence may be less abundant or missing. This is particularly true for Northeast Africa where environmental shifts and consequent changes in resource abundance often resulted in considerable mobility. This article considers the origins of agriculture in the region covering modern-day Egypt and Sudan, paying particular attention to the nature of the evidence from which inferences about past socioeconomies may be drawn.

Article

Indigenous Polynesian Agriculture in Hawaiʻi  

Noa Kekuewa Lincoln and Peter Vitousek

Agriculture in Hawaiʻi was developed in response to the high spatial heterogeneity of climate and landscape of the archipelago, resulting in a broad range of agricultural strategies. Over time, highly intensive irrigated and rainfed systems emerged, supplemented by extensive use of more marginal lands that supported considerable populations. Due to the late colonization of the islands, the pathways of development are fairly well reconstructed in Hawaiʻi. The earliest agricultural developments took advantage of highly fertile areas with abundant freshwater, utilizing relatively simple techniques such as gardening and shifting cultivation. Over time, investments into land-based infrastructure led to the emergence of irrigated pondfield agriculture found elsewhere in Polynesia. This agricultural form was confined by climatic and geomorphological parameters, and typically occurred in wetter, older landscapes that had developed deep river valleys and alluvial plains. Once initiated, these wetland systems saw regular, continuous development and redevelopment. As populations expanded into areas unable to support irrigated agriculture, highly diverse rainfed agricultural systems emerged that were adapted to local environmental and climatic variables. Development of simple infrastructure over vast areas created intensive rainfed agricultural systems that were unique in Polynesia. Intensification of rainfed agriculture was confined to areas of naturally occurring soil fertility that typically occurred in drier and younger landscapes in the southern end of the archipelago. Both irrigated and rainfed agricultural areas applied supplementary agricultural strategies in surrounding areas such as agroforestry, home gardens, and built soils. Differences in yield, labor, surplus, and resilience of agricultural forms helped shape differentiated political economies, hierarchies, and motivations that played a key role in the development of sociopolitical complexity in the islands.

Article

From Plows, Horses, and Harnesses to Precision Technologies in the North American Great Plains  

David E. Clay, Sharon A. Clay, Thomas DeSutter, and Cheryl Reese

Since the discovery that food security could be improved by pushing seeds into the soil and later harvesting a desirable crop, agriculture and agronomy have gone through cycles of discovery, implementation, and innovation. Discoveries have produced predicted and unpredicted impacts on the production and consumption of locally produced foods. Changes in technology, such as the development of the self-cleaning steel plow in the 19th century, provided a critical tool needed to cultivate and seed annual crops in the Great Plains of North America. However, plowing the Great Plains would not have been possible without the domestication of plants and animals and the invention of the yoke and harness. Associated with plowing the prairies were extensive soil nutrient mining, a rapid loss of soil carbon, and increased wind and water erosion. More recently, the development of genetically modified organisms (GMOs) and no-tillage planters has contributed to increased adoption of conservation tillage, which is less damaging to the soil. In the future, the ultimate impact of climate change on agronomic practices in the North American Great Plains is unknown. However, projected increasing temperatures and decreased rainfall in the southern Great Plains (SGP) will likely reduce agricultural productivity. Different results are likely in the northern Great Plains (NGP), where higher temperatures can lead to increased agricultural intensification, the conversion of grassland to cropland, increased fragmentation of wildlife habitat, and increased soil erosion. Precision farming, conservation, cover crops, and the creation of plants better adapted to their local environments can help mitigate these effects. However, changing practices requires that farmers and their advisers understand the limitations of their soils, plants, environment, and production systems. Failure to implement appropriate management practices can result in a rapid decline in soil productivity, diminished water quality, and reduced wildlife habitat.

Article

The Early Anthropogenic Hypothesis  

William Ruddiman

Throughout the 1900s, the warmth of the current interglaciation was viewed as completely natural in origin (prior to greenhouse-gas emissions during the industrial era). In the view of physical scientists, orbital variations had ended the previous glaciation and caused a warmer climate but had not yet brought the current warm interval to an end. Most historians focused on urban and elite societies, with much less attention to how farmers were altering the land. Historical studies were also constrained by the fact that written records extended back a few hundred to at most 3,500 years. The first years of the new millennium saw a major challenge to the ruling paradigm. Evidence from deep ice drilling in Antarctica showed that the early stages of the three interglaciations prior to the current one were marked by decreases in concentrations of carbon dioxide (CO2) and methane (CH4) that must have been natural in origin. During the earliest part of the current (Holocene) interglaciation, gas concentrations initially showed similar decreases, but then rose during the last 7,000–5,000 years. These anomalous ("wrong-way") trends are interpreted by many scientists as anthropogenic, with support from scattered evidence of deforestation (which increases atmospheric CO2) by the first farmers and early, irrigated rice agriculture (which emits CH4). During a subsequent interval of scientific give-and-take, several papers have criticized this new hypothesis. The most common objection has been that there were too few people living millennia ago to have had large effects on greenhouse gases and climate. Several land-use simulations estimate that CO2 emissions from pre-industrial forest clearance amounted to just a few parts per million (ppm), far less than the 40 ppm estimate in the early anthropogenic hypothesis. Other critics have suggested that, during the best orbital analog to the current interglaciation, about 400,000 years ago, interglacial warmth persisted for 26,000 years, compared to the 10,000-year duration of the current interglaciation (implying more warmth yet to come). A geochemical index of the isotopic composition of CO2 molecules indicates that terrestrial emissions of 12C-rich CO2 were very small prior to the industrial era. Subsequently, new evidence has once again favored the early anthropogenic hypothesis, albeit with some modifications. Examination of cores reaching deeper into Antarctic ice reconfirms that the upward gas trends in this interglaciation differ from the average downward trends in seven previous ones. Historical data from Europe and China show that early farmers used more land per capita and emitted much more carbon than suggested by the first land-use simulations. Examination of pollen trends in hundreds of European lakes and peat bogs has shown that most forests had been cut well before the industrial era. Mapping of the spread of irrigated rice by archaeobotanists indicates that emissions from rice paddies can explain much of the anomalous CH4 rise in pre-industrial time. The early anthropogenic hypothesis is now broadly supported by converging evidence from a range of disciplines.

Article

Soils, Science, Society, and the Environment  

Colin R. Robins

Soils are the complex, dynamic, spatially diverse, living, and environmentally sensitive foundations of terrestrial ecosystems as well as human civilizations. The modern, environmental study of soil is a truly young scientific discipline that emerged only in the late 19th century from foundations in agricultural chemistry, land resource mapping, and geology. Today, little more than a century later, soil science is a rigorously interdisciplinary field with a wide range of exciting applications in agronomy, ecology, environmental policy, geology, public health, and many other environmentally relevant disciplines. Soils form slowly, in response to five interrelated factors: climate, organisms, topography, parent material, and time. Consequently, many soils are chemically, biologically, and/or geologically unique. The profound importance of soil, combined with the threats of erosion, urban development, pollution, climate change, and other factors, is now prompting soil scientists to consider the application of endangered species concepts to rare or threatened soils around the world.