At the global scale, conceptions of hunter-gatherer economies have changed considerably over time, and these changes were strongly shaped by larger trends in Western history, philosophy, science, and culture. Seen as either “savage” or “noble” at the dawn of the Enlightenment, hunter-gatherers have been regarded as everything from holdovers from a basal stage of human development, to affluent, ecologically informed foragers, and ultimately to practitioners of an extremely diverse economic orientation encompassing the fullest scope of human behavioral diversity. The only thing linking studies of hunter-gatherers over time is therefore the definition of the term itself: people whose economic mode of production centers on wild resources. When hunter-gatherers are considered outside the general realm of their shared subsistence economies, it is clear that their behavioral diversity rivals or exceeds that of other economic orientations. Hunter-gatherer behaviors range along a multivariate continuum: from a focus mainly on large fauna to broad, wild plant-based diets similar to those of agriculturalists; from extremely mobile to sedentary; from reliance on simple, generalized technologies to very specialized ones; from egalitarian sharing economies to privatized competitive ones; and from nuclear-family or band-level to centralized and hierarchical decision-making. It is clear, however, that hunting and gathering modes of production must have preceded, and thus given rise to, agricultural ones. What research into the development of human economies shows is that transitions from one type of hunting and gathering to another, or alternatively to agricultural modes of production, can follow many different evolutionary pathways.
The important thing to recognize is that behaviors essential to the development of agriculture—landscape modification, intensive labor practices, the division of labor, and the production, storage, and redistribution of surplus—were present in a range of hunter-gatherer societies beginning at least as early as the Late Pleistocene in Africa, Europe, Asia, and the Americas. Whether these behaviors eventually led to the development of agriculture depended in part on the emergence of a less variable, CO2-rich climatic regime and atmosphere during the Holocene, but also on a change in the social relations of production that allowed the hoarding of privatized resources. Ethnographic and archaeological research in the 20th and 21st centuries shows that modern and ancient peoples adopt, or even revert to, hunting and gathering after having engaged in agricultural or industrial pursuits when conditions allow, and that macroeconomic perspectives often mask considerable intragroup diversity in economic decision making: the pursuits and goals of women versus men and young versus old within groups are often quite different or even at odds with one another, yet often articulate to form cohesive and adaptive economic wholes. The future of hunter-gatherer research will be challenged by the continued decline of traditional hunting and gathering but will also benefit from observation of people who revert to, or supplement their income with, wild resources. It will also draw heavily on archaeology, which holds considerable potential to document and explain the full range of human behavioral diversity, hunter-gatherer or otherwise, over the longest of timeframes and the broadest geographic scope.
Christopher Morgan, Shannon Tushingham, Raven Garvey, Loukas Barton, and Robert Bettinger
Along with ceramics production, sedentism, and herding, agriculture is a major component of the Neolithic as it is defined in Europe. The agricultural system of the first Neolithic societies and the dispersal of exogenous cultivated plants to Europe are therefore the subject of many scientific studies. To address these issues, archaeobotanists rely on residual plant remains—crop seeds, weeds, and wild plants—from archaeological structures such as refuse pits and, less often, storage contexts. To date, no plant of economic value has been identified as having been domesticated in Western Europe, except possibly the opium poppy. The earliest seeds identified at archaeological sites date to about 5500–5200 bc in the Mediterranean and Temperate Europe. The cultivated plants identified were cereals (wheat and barley), an oil plant (flax), and pulses (peas, lentils, and chickpeas). This crop package originated in the Fertile Crescent, where it was clearly established around 7500 bc (final Pre-Pottery Neolithic B) after a long, polycentric domestication process. From the middle of the 7th millennium bc, via the Balkan Peninsula, pioneer Neolithic populations, with their specific economies, rapidly dispersed from east to west along two main pathways: a maritime route across the northwestern basin of the Mediterranean (6200–5300 bc) and a terrestrial and fluvial route through central and northwestern continental Europe (5500–4900 bc). Along the way, these agropastoral societies adapted the Neolithic founder crops of the Middle East to the new environmental conditions they encountered in Western Europe. The Neolithic pioneers settled in an area with a long tradition of hunting and gathering; the Neolithization of Europe followed a colonization model. The Mesolithic groups, although exploiting plant resources such as hazelnut more or less intensively, did not significantly change the landscape.
The impact of their settlements and activities is hardly noticeable through palynology, for example. Control of the mode of reproduction of plants certainly increased the ecological success of Homo sapiens, bringing about, among other things, demographic growth and the ability to settle in areas that until then had been poorly suited to year-round occupation. The characterization of past agricultural systems—crop plants, technical processes, and the impact of anthropogenic activities on the landscape—is essential for understanding the interrelation of human societies and the plant environment, an interrelation that undoubtedly changed profoundly with the Neolithic Revolution.
Kandace D. Hollenbach and Stephen B. Carmody
The possibility that native peoples in eastern North America had cultivated plants prior to the introduction of maize was first raised in 1924. Scant evidence was available to support this speculation, however, until the “flotation revolution” of the 1960s and 1970s. As archaeologists involved in large-scale projects began implementing flotation, paleoethnobotanists soon had hundreds of samples and thousands of seeds demonstrating that indigenous peoples grew a suite of crops—including cucurbit squashes and gourds, sunflower, sumpweed, and chenopod—that displayed signs of domestication. The application of accelerator mass spectrometry (AMS) dating to cucurbit rinds and seeds in the 1980s placed the domestication of these four crops in the Late Archaic period (5000–3800 bp). The presence of wild cucurbits during earlier Archaic periods lent weight to the argument that native peoples in eastern North America domesticated these plants independently of early cultivators in Mesoamerica. Analyses of DNA from chenopods and cucurbits in the 2010s definitively demonstrated that these crops developed from local lineages. With evidence in hand that refuted notions of the diffusion of plant domestication from Mesoamerica, models developed in the 1980s for the transition from foraging to farming in the Eastern Woodlands emphasized the coevolutionary relationship between people and these crop plants. As Archaic-period groups began to occupy river valleys more intensively, in part due to changing climatic patterns during the mid-Holocene that created more stable river systems, their activities created disturbed areas in which these weedy plants thrive. With these useful plants available as more productive stands in closer proximity to base camps, people increasingly used the plants, which in turn responded to people’s selection. Critics noted that these models left little room for intentionality or innovation on the part of early farmers.
Models derived from human behavioral ecology explore the circumstances in which foragers choose to start using these small-seeded plants in greater quantities. In contrast to the resource-rich valley settings of the coevolutionary models, human behavioral ecology models posit that foragers would use these plants, which provide relatively few calories per unit of time spent obtaining them, only when existing resources could no longer support growing populations. In these scenarios, Late Archaic peoples cultivated these crops as insurance against shortages in nut supplies. Despite their apparent differences, current iterations of both models recognize humans as agents who actively change their environments, with intentional and unintentional results. Both are also concerned with understanding the social and ecological contexts within which people began cultivating and eventually domesticating plants. The “when” and “where” questions of domestication in eastern North America are relatively well established, although researchers continue to fill significant gaps in geographic data, primarily in regions where large-scale contract archaeology projects have not been conducted. Researchers are also actively debating the “how” and “why” of domestication, but the cultural ramifications of the transition from foraging to farming have yet to be meaningfully incorporated into the archaeological understanding of the region. The significance of these native crops to the economies of Late Archaic and subsequent Early and Middle Woodland peoples is poorly understood and often woefully underestimated by researchers. The socioeconomic roles of these native crops for past peoples, as well as the possibilities for farmers and cooks to incorporate them into their practices in the early 21st century, are exciting areas for new research.
As of 2018, barley accounted for only 5% of cereal production worldwide, though regionally for up to 40%. It is one of the oldest crop species and among the crop plants best adapted to a broad diversity of climates and environments. Originating from its wild progenitor, Hordeum vulgare ssp. spontaneum, biogeographically located in the Fertile Crescent of the Near East, domesticated barley developed as a founder crop in aceramic Neolithic societies 11,000 years ago, was cultivated in monocultures in Bronze Age Mesopotamia, entered the New World after 1492 ce, reached global distribution in the 1950s, and by the year 2000 comprised approximately 200 accepted botanical varieties. Its stress tolerance under increased aridity and salinity on the one hand, and its adaptability to cool climates on the other, partly explain its broad range of applications for subsistence and economy across different cultures, such as baking, cooking, beer brewing, and animal feed. Although the use of fermented starch for producing alcoholic beverages and foods is documented globally in archaeological contexts dating from at least the beginning of the Holocene, it becomes directly attested only in societies with written records, such as Bronze Age Mesopotamia and Egypt, where beer played a considerable role in the everyday diet and its production represented an important economic sector. In 2004 approximately 85% of barley production was destined for animal feed. As a component of the human diet, however, studies of the micronutrients in barley have found positive effects on blood cholesterol and glucose levels, with consequent benefits for cardiovascular health and diabetes control. A growing number of barley-breeding programs worldwide focus on improving the processing characteristics, nutritional value, and stress tolerance of barley in the context of global climate change.