Along with ceramics production, sedentism, and herding, agriculture is a major component of the Neolithic as it is defined in Europe. The agricultural system of the first Neolithic societies and the dispersal of exogenous cultivated plants to Europe are therefore the subject of many scientific studies. To address these issues, archaeobotanists rely on residual plant remains—crop seeds, weeds, and wild plants—from archaeological structures such as detritic pits and, less often, storage contexts. To date, no plant with economic value has been identified as having been domesticated in Western Europe, except possibly the opium poppy. The earliest seeds identified at archaeological sites date to about 5500–5200 bc in the Mediterranean and temperate Europe. The cultivated plants identified were cereals (wheat and barley), an oleaginous plant (flax), and pulses (peas, lentils, and chickpeas). This crop package originated in the Fertile Crescent, where it was clearly established by around 7500 bc (final Pre-Pottery Neolithic B), after a long, polycentric domestication process. From the middle of the 7th millennium bc, via the Balkan Peninsula, pioneer Neolithic populations, with their specific economies, rapidly dispersed from east to west along two main pathways: a maritime route across the northwestern basin of the Mediterranean (6200–5300 bc) and a terrestrial and fluvial route through central and northwestern continental Europe (5500–4900 bc). Along their trajectory, these agropastoral societies adapted the Neolithic founder crops of the Middle East to the new environmental conditions they encountered in Western Europe. The Neolithic pioneers settled in an area with a long tradition of hunting and gathering; the Neolithization of Europe followed a colonization model. The Mesolithic groups, although exploiting plant resources such as hazelnut more or less intensively, did not significantly change the landscape.
The impact of their settlements and activities is hardly noticeable through palynology, for example. The control of plant reproduction has certainly increased the prevalence of Homo sapiens, entailing, among other things, demographic growth and the ability to settle in areas that had previously been ill suited to year-round occupation. The characterization of past agricultural systems, including crop plants, technical processes, and the impact of anthropogenic activities on the landscape, is essential for understanding the interrelation between human societies and the plant environment. This interrelation undoubtedly changed profoundly with the Neolithic Revolution.
Worldwide, governments subsidize agriculture at a rate of approximately 1 billion dollars per day. This figure roughly doubles when export and biofuel-production subsidies and state financing for dams and river-basin engineering are included. These policies guide land use in numerous ways, including growers’ choices of crop and buyers’ demand for commodities. The three types of state subsidies that shape land use and the environment are land settlement programs, price and income supports, and energy and emissions initiatives. Together these subsidies have created perennial surpluses in global stores of cereal grains, cotton, and dairy, with production increases outstripping population growth. Subsidies to land settlement, to crop prices, and to the processing and refining of cereals and fiber can therefore be shown to have independent and largely deleterious effects on soil fertility, freshwater supplies, biodiversity, and atmospheric carbon.
Benjamin S. Arbuckle
The domestication of livestock animals has long been recognized as one of the most important and influential events in human prehistory and has been the subject of scholarly inquiry for centuries. Modern understandings of this important transition place it within the context of the origins of food production in the so-called Neolithic Revolution, where it is particularly well documented in southwest Asia. Here, a combination of archaeofaunal, isotopic, and DNA evidence suggests that sheep, goats, cattle, and pigs were first domesticated over a period of several millennia within sedentary communities practicing intensive cultivation, beginning at the Pleistocene–Holocene transition. Built on more than a century of data collection, our understanding of the chronological and geographic features of the transition from hunting to herding indicates that the 9th millennium bce and the region of the northern Levant played crucial roles in livestock domestication. However, many questions remain concerning the nature of the earliest predomestic animal management strategies, the role of multiple regional traditions of animal management in the emergence of livestock, and the motivations behind the slow spread of integrated livestock husbandry systems, which included all four domestic livestock species and became widespread throughout southwest Asia only at the end of the Neolithic period.
The analysis of hunger and famines continues to develop, and theoretical and empirical studies of hunger and food insecurity highlight cases where hunger intensifies sufficiently to be identified as famine. The varying ability of those affected to cope with the shocks and stresses imposed on them is central to the development of food insecurity and the emergence of famine conditions, and to explaining the complex interrelationships between agriculture, famine, and economics. There are a number of approaches to understanding how famines develop. The Malthusian approach, which sees population growth as the primary source of hunger and famine, can be contrasted with the free-market, or Smithian, approach, which regards freely operating markets as an essential prerequisite for ensuring that famine can be overcome. A major debate has centered on whether famines primarily emerge from a decline in the availability of food or from the failure of households to access sufficient food for consumption; this debate seeks to distinguish between famine as a problem of food production and availability and famine as a problem of declining income and food consumption among certain groups in the population. These declines arise from the interaction between food markets, labor markets, and markets for livestock and other productive farm resources as poor people try to cope with reduced food consumption. Further revisions to famine analysis were introduced from the mid-1990s by authors who interpreted the emergence of famines not as a failure of markets and the economic system but as a failure of political accountability and humanitarian response. These approaches share the characteristic that they seek to narrow the focus of investigation to one or a few key factors.
Yet most of those involved in famine analysis or famine relief would stress the multifaceted and broad-based nature of the perceived causes of famine and the mechanisms through which they emerge. In contrast to these approaches, the famine systems approach takes a broader view, drawing on insights from systems theory to understand how famines develop and, especially, how this development might be halted, reversed, or prevented. Economists have contributed to and informed different perspectives on famine analysis while acknowledging key contributions from moral philosophy as well as from the biological and physical sciences and the political and social sciences. Malthus, Smith, and John Stuart Mill contributed substantially to early thinking on famine causation and appropriate famine interventions. Increased emphasis on famine prevention and a focus on food production and productivity led to the unarguable success of the Green Revolution. An important shift in thinking in the 1980s was motivated by Amartya Sen’s work on food entitlements and on markets for food and agricultural resources. The famine systems approach, in turn, considers famine as a process governed by complex relationships and seeks to integrate contributions from economists and other scientists while promoting a systems approach to famine analysis.
Dominic Moran and Jorie Knook
Climate change is already having a significant impact on agriculture through greater weather variability and the increasing frequency of extreme events. International policy is rightly focused on adapting and transforming agricultural and food production systems to reduce vulnerability. But agriculture also has a role to play in climate change mitigation. The agricultural sector accounts for approximately a third of global anthropogenic greenhouse gas emissions, including related emissions from land-use change and deforestation. Farmers and land managers have a significant role to play because emissions reduction measures can be taken to increase soil carbon sequestration, manage fertilizer application, and improve ruminant nutrition and waste management. There is also potential to improve overall productivity in some systems, thereby reducing emissions per unit of product. The global significance of such actions should not be underestimated. Existing research shows that some of these measures are low cost relative to the costs of reducing emissions in other sectors such as energy or heavy industry. Some measures are apparently cost-negative, or win–win, in that they have the potential to both reduce emissions and save production costs. However, the mitigation potential is also hindered by the biophysical complexity of agricultural systems and by institutional and behavioral barriers that limit the adoption of these measures in developed and developing countries alike. These barriers include the absence of formal agreement on how agricultural mitigation should be treated in national obligations, commitments, or targets, and questions over the nature of the policy incentives that can be deployed in different farming systems and along food chains beyond the farm gate. These challenges also overlap with growing concern about global food security, which highlights additional stressors, including demographic change, natural resource scarcity, and economic convergence in consumption preferences, particularly for livestock products.
The more recent agenda of reducing emissions through modified food consumption and reduced waste is proving more controversial than dealing with emissions related to production.
Edward B. Barbier
Globally, around 1.5 billion people in developing countries, or approximately 35% of the rural population, live on less-favored agricultural land (LFAL), which is susceptible to low productivity and degradation because its agricultural potential is constrained biophysically by terrain, poor soil quality, or limited rainfall. Around 323 million people in such areas also live in locations that are highly remote and thus have limited access to infrastructure and markets. Households in such locations often face a vicious cycle of declining livelihoods, increased ecological degradation and loss of resource commons, and declining ecosystem services on which they depend. In short, these poor households are prone to a poverty-environment trap. Policies to eradicate poverty therefore need to be targeted at improving the economic livelihoods, productivity, and incomes of households located on remote LFAL. The specific elements of such a strategy include involving the poor in payments for ecosystem services schemes and other measures that enhance the environments on which they depend; targeting investments directly at improving the livelihoods of the rural poor, thus reducing their dependence on exploiting environmental resources; and tackling both the lack of access by the rural poor in less-favored areas to well-functioning and affordable markets for credit, insurance, and land, and the high transportation and transaction costs that prevent the poorest households in remote areas from engaging in off-farm employment and limit smallholder participation in national and global markets.