

Adaptations to High-Altitude Hypoxia  

Cynthia M. Beall and Kingman P. Strohl

Biological anthropologists aim to explain the hows and whys of human biological variation using the concepts of evolution and adaptation. High-altitude environments provide informative natural laboratories with the unique stress of hypobaric hypoxia: less oxygen than usual in the ambient air, owing to lower barometric pressure. Indigenous populations have adapted biologically to their extreme environment through acclimatization, developmental adaptation, and genetic adaptation. People have used the East African and Tibetan Plateaus above 3,000 m for at least 30,000 years and the Andean Plateau for at least 12,000 years. Ancient DNA shows that the ancestors of modern highlanders have used all three high-altitude areas for at least 3,000 years. It is necessary to examine the differences in the biological processes involved in oxygen exchange, transport, and use among these populations. Such an approach compares oxygen delivery traits reported for East African Amhara, Tibetan, and Andean highlanders with one another and with short-term visitors and long-term upward migrants in the early or later stages of acclimatization to hypoxia. Tibetan and Andean highlanders provide most of the data and differ quantitatively in biological characteristics. The best-supported difference is the unelevated hemoglobin concentration of Tibetans and Amhara compared with Andean highlanders as well as short- and long-term upward migrants. Moreover, among Tibetans, several features of oxygen transfer and oxygen delivery resemble those of short-term acclimatization, while several features of Andean highlanders resemble the long-term responses. Genes and molecules of the oxygen homeostasis pathways contribute to some of the differences.


The Archaeology and History of Human Diseases in the Zimbabwean Past  

Pauline Chiripanhura, Ancila Katsamudanga, and Justen Manasa

Throughout history, communicable diseases have impacted humanity. If present experiences are any indication, diseases must have had significant impact on transforming the economic and social organization of past communities. Some aspects of what is regarded as normal modern human behavior must have emanated from responses to diseases, especially epidemics and pandemics. Unfortunately, few studies have been conducted in this area of archaeological investigation to shed light on the influence of diseases on past communities. This is especially so in African countries such as Zimbabwe, where the recorded history of pandemics stretches back only to the beginning of colonialism, less than 200 years ago. Although the earliest world epidemic was recorded during the 5th century, it was not until 1918 that Zimbabwe recorded the first incidence of a worldwide epidemic. There is little knowledge of how precolonial communities were affected by global pandemics such as the Black Death, the bubonic plague, and similar occurrences. It has to be noted that global pandemics became more threatening as society made the shift to agrarian life around 10,000 years ago. This has led many scholars to regard the adoption of agriculture as the worst mistake in the history of the human race, as they argue that the creation of more closely connected communities gave rise to infectious diseases and presented these diseases with the chance to grow into epidemics. Diseases such as influenza, smallpox, leprosy, malaria, and tuberculosis are among those that have thrived since this shift. With its long human history, Africa is well positioned to shed light on the occurrence of global pandemics as well as their distinct impact on communities living in diverse social, economic, and natural environments. As such, it is important to explore the study of diseases, especially epidemics and global pandemics, to augment the worldwide knowledge generated from other continents.
This knowledge should also be juxtaposed with what is already known about changing social, economic, and political developments to see the potential impacts that these pandemics had on the human past. The history of migration should be viewed as a potential history of the spread of new diseases. For all the known pandemics, the South African coast has served as the major corridor of transmission into Zimbabwe. However, archaeologically, it is known that migrations were mostly over land from the northern and eastern regions. It is interesting to delve into how the spread of diseases could have differed when movements of people over land, rather than through coastal ports, were the nodes of transmission. Since there are few documentary sources to help in the comprehension of past outbreaks in the precolonial period, archaeological evidence becomes key. Without doubt, human skeletons represent the most ubiquitous source of information on ancient diseases. Zimbabwe has remains that stretch from the Stone Age to historical times. Paleopathology is an underdeveloped discipline in southern Africa, but with increased awareness of the possibilities of the presence of various diseases in prehistory, it is expected to grow.


Bioarchaeology in the Nile Valley  

Jenail H. Marshall and Michele Buzon

Bioarchaeology is the study of human remains within their archaeological and mortuary contexts. Bioarchaeologists use skeletal biology, mortuary practices, and the archaeological record to answer questions about past populations’ lives and lifestyles. The term Nile Valley defines a geographic region of Egypt and Nubia, the latter encompassing the region between the First Cataract at Aswan, Egypt, and the Sixth Cataract just north of Khartoum, Sudan. Spanning the Nile River area, it is sometimes referred to in its two parts, according to the river’s flow, from south to north: Lower Nubia in the north and Upper Nubia in the south. In Egypt, that is Upper Egypt in the south and Lower Egypt in the north. For over a century, the region has had many campaigns and salvage projects that have led to the excavation of thousands of skeletal remains from ancient Nile Valley sites. Analyses of these collections have provided important information about the people and their health, patterns of disease and trauma, diet, and biological relationships. Early morphological research on the skeletal remains of the people who once lived in ancient Nubia was dominated by biased interpretations stemming from racist paradigms in the early 19th century that included racial typologies. Moving beyond these perspectives, contemporary research on the ancient Nile Valley has built on methodological and theoretical advancements in bioarchaeology more broadly. The integration of bioarchaeology in the larger context of archaeological projects provides a wealth of information that includes but is not limited to health, disease, identity, nutrition, life experiences, and demographic patterns. Likewise, how archaeology is conducted in the region is shifting, with a move toward decolonial and ethical practices within the discipline, including involvement with local communities.


Chronology of the Hominin Sites of Southern Africa  

Andy I.R. Herries

The identification of the Taung Child Australopithecus africanus type specimen as an early human fossil (hominin) by Raymond Dart in 1924, followed by key discoveries at sites like Sterkfontein, Swartkrans, and Makapansgat in the 1930s and ’40s, was central to understanding that humans first arose in Africa, not Europe or Asia. Later discoveries in eastern Africa have shown that the earliest potential hominins (e.g., Orrorin tugenensis) date back to at least 6 million years ago. In contrast, the oldest fossil hominins in South Africa are those of Australopithecus from the sites of Taung and Makapansgat and are dated to between about 3.0 and about 2.6 million years ago (Ma); only one specimen, from Sterkfontein, potentially dates to earlier than this, sometime between 3.7 and 2.2 Ma. However, the majority of early hominin fossils in southern Africa come from 2.8- to 1.8-million-year-old palaeocave remnants in the Malmani dolomite of the Gauteng province. These sites have a rich record of hominin species, including Australopithecus africanus, Australopithecus sediba, Paranthropus robustus, and Homo erectus. Most of these species, with the exception of Homo erectus, are endemic to South Africa. However, the DNH 134 specimen from Drimolen Main Quarry represents the oldest fossil of Homo erectus anywhere in the world. This specimen occurs at a time around 2 Ma when there is a turnover in hominin species, with the extinction of Australopithecus and the first occurrence of Homo, Paranthropus, and an archaeological record of Oldowan and bone tools. Acheulian technology occurs from at least 1.4 Ma and is associated with specimens simply attributed to early Homo. The oldest hominin fossil outside the northern Malmani dolomite karst is dated to between 1.1 and 1.0 Ma, at Cornelia-Uitzoek in the Free State, and also represents the last specimen defined as early Homo. 
Paranthropus is also last seen around 1 million years ago, when the first specimen attributed to Homo rhodesiensis may also have occurred, at Elandsfontein in the Western Cape. There is a dearth of hominin fossils from the terminal Early Pleistocene until the late Middle Pleistocene, when a high diversity of hominin species occurs between about 340,000 and about 240,000 years ago (c. 340 and c. 240 ka). This includes a late-occurring specimen of Homo rhodesiensis from Broken Hill in Zambia, Homo helmei or early modern humans from Florisbad, and Homo naledi from Rising Star. This is also a period (post 435 ka) containing both late-occurring Acheulian and early Middle Stone Age (MSA) technology, but none of these fossils is directly associated with archaeology. Definitive early modern human fossils are not found until after 180 ka, in direct association with MSA technology, and the majority, if not all, of the record occurs during the last 120 ka.


Epigenetics and Applied Anthropology  

Charles H. Klein

Since Francis Crick and James D. Watson’s discovery of the structure of DNA in 1953, researchers, policymakers, and the general public have sought to understand the ways in which genetics shapes human lives. A milestone in these efforts was the completion of the Human Genome Project’s (HGP) sequencing of Homo sapiens’ nearly three billion base pairs in 2003. Yet, despite the excitement surrounding the HGP and the discovery of the structural genetic underpinnings of several debilitating diseases, the vast majority of human health outcomes have not been linked to a single gene. Moreover, even when genes have been associated with particular diseases (e.g., breast and colon cancer), it is not well understood why certain genetically predisposed individuals become ill and others do not. Nor has the HGP’s map provided sufficient information to understand the actual functioning of the human genetic code, including the role of noncoding DNA (“junk DNA”) in regulating molecular genetic processes. In response, a growing number of scientists have shifted their attention from structural genetics to epigenetics, the study of how genes express themselves in particular situations and environments. Anthropologists play roles in these applications of epigenetics to real-world settings. These new theoretical frameworks unsettle the nature-versus-nurture binary and support biocultural anthropological research demonstrating how race becomes biology and how social inequalities and health disparities become embodied across generations. Ethnographically grounded case studies further highlight the diverse epigenetic logics held by healthcare providers, researchers, and patient communities and how these translations of scientific knowledge shape medical practice and basic research. The growing field of environmental epigenetics also offers a wide range of options for students and practitioners interested in applying the anthropological toolkit in epigenetics-related work.


Hominin Taxic Diversity  

Bernard Wood, Dandy Doherty, and Eve Boyle

The clade (a.k.a. twig of the Tree of Life) that includes modern humans comprises all of the extinct species that are judged, on the basis of their morphology or their genotype, to be more closely related to modern humans than to chimpanzees and bonobos. Taxic diversity with respect to the hominin clade refers to evidence that it included more than one species at any one time in its evolutionary history. The minimum requirement is that a single ancestor-descendant sequence connects modern humans with the hypothetical common ancestor they share with chimpanzees and bonobos. Does the hominin clade include just modern human ancestors, or does it also include non-ancestral species that are closely related to modern humans? It has been suggested that there is evidence of taxic diversity within the hominin clade back to 4.5 million years ago, but how sound is that evidence? The main factor that would work to overestimate taxic diversity is the tendency for paleoanthropologists to recognize too many taxa among the site collections of hominin fossils. Factors that would work to systematically underestimate taxic diversity include the relative rarity of hominins within fossil faunas, the realities that many parts of the world where hominins could have been living are un- or under-sampled, and that during many periods of human evolutionary history, erosion rather than deposition predominated, thus reducing or eliminating the chance that animals alive during those times would be recorded in the fossil record. Finally, some of the most distinctive parts of an animal (e.g., pelage, vocal tract, scent glands) are not likely to be preserved in the hominin fossil record, which is dominated by fragments of teeth and jaws.


Implications of Ancient DNA for Understanding Human Evolution  

Sloan R. Williams

Ancient DNA (aDNA) studies have significantly changed anthropological perceptions of human evolution. The caves where many of the Eurasian archaic hominin remains have been found are particularly well suited for study, as DNA preserves best at low temperature and humidity. This research has provided important information about our close relatives, the Neanderthals, and the Denisovans, a group previously unknown in the fossil record. Scientists have found traces of these, and perhaps other archaic hominin groups, in modern human genomes. A small but significant admixture or genetic exchange occurred among these groups, with Neanderthal alleles being present in modern European and Asian people, while Denisovan alleles are confined to Asia and Oceania. No one knows exactly what happened to these archaic groups, but hybrid incompatibility, pathogen resistance, and population dynamics may have contributed to their disappearance. The ancient genetic sequences found in living people are not distributed uniformly throughout their genomes. Negative selection against archaic alleles may explain “archaic deserts” where these alleles are rare. In areas where archaic alleles are more common, both positive selection and genetic drift could explain the higher frequencies, though the two processes are often difficult to distinguish from each other. Current work indicates that archaic alleles that confer resistance to disease and strengthen immune function are likely candidates for positive selection. Archaic alleles involved in phenotypic skin color variation are likely relatively neutral and more likely provide examples of genetic drift. Medically focused research has just begun to reveal the positive and negative effects that archaic alleles may have on human health. Human adaptation is ongoing, so genetic information obtained from archaeological contexts should provide otherwise unobtainable information about health and disease. 
The aDNA field will continue to grow as technologies improve, permitting access to genetic sequences from ever-older samples and expanding the range of preservation conditions that can yield DNA.


Modern Human Behavior  

Pamela R. Willoughby

In evolutionary terms, a modern human is a member of our own species, Homo sapiens. Fossil skeletal remains assigned to Homo sapiens appear possibly as far back as 300,000 or 200,000 years ago in Africa. The first modern human skeletal remains outside of that continent are found at two sites in modern Israel, the Mugharet es Skhūl and Jebel Qafzeh; these date between 90,000 and 120,000 years ago. But this just represents a short, precocious excursion out of Africa in an unusually pleasant environmental phase. All humans who are not of direct sub-Saharan African ancestry are descended from one or more populations who left Africa around 50,000 years ago and went on to colonize the globe. Surprisingly, they successfully interbred with other kinds of humans outside of Africa, leaving traces of their archaic genomes still present in living people. Modern human behavior, however, implies people with innovative technologies, usually defined by those seen with the earliest Upper Paleolithic people in Eurasia. Some of these innovations also appear at various times in earlier African sites, but the entire Upper Paleolithic package, once known as the Human Revolution, does not. Researchers have had to split the origin of modern biology and anatomy from the beginnings of modern cultural behavior. The former clearly evolves much earlier than the latter. Or does it?


The OsteoDontoKeratic Culture  

Patrick Randolph-Quinney and Anthony Sinclair

The Osteodontokeratic (ODK for short) is a technological and cultural hypothesis first proposed by Raymond A. Dart in 1957, based on fossils recovered from the South African cave site of Makapansgat. Dart proposed that the extinct hominin species Australopithecus prometheus was a predatory, cannibalistic meat eater and specialized hunter. He suggested that they manufactured and used a toolkit based on the bones (osteo), teeth (donto), and horns (keratic) of prey animals, and that these first tools were evidence for the “predatory transition from ape to man” as a distinct stage in human evolutionary development. Dart based his hypothesis on the analysis of bones of fossil ungulates and other prey species found at Makapansgat. The parts of the skeleton recovered from the cave were biased toward the skull and limb bones, whilst the thorax, pelvis, and tail were largely absent, indicating a selection agent at work. The bones also exhibited evidence of damage, which Dart suggested could only have been caused by intentional violence. Many of the bones were blackened, which he suggested was due to burning or charring in a controlled fire. In his mind, the hominins of Makapansgat were prodigious hunters who used organic tools to kill their prey, whereupon they cooked and ate the meat, discarding waste bone but utilizing some of the skeletal material to make new tools. Dart developed a detailed typology of complete or modified bones that he indicated could be used as clubs, projectiles, daggers, picks, saws, scoops, and cups—in doing so, he confused form with function. Dart and the ODK were championed by the American playwright Robert Ardrey across four hugely successful popular science books starting with African Genesis in 1961. Following Dart, these books portrayed our early ancestors as aggressive hunters killing prey and each other, driven by a need to protect their territory. 
This concept infiltrated popular culture through the opening sequence of Stanley Kubrick’s 2001: A Space Odyssey, released in 1968, making the ODK perhaps the most famous scientific claim for an original form of human technology. Dart’s hypothesis was not widely accepted by contemporary scientists such as Kenneth Oakley, Sherwood Washburn, John Robinson, and C. K. “Bob” Brain, and led Brain to conduct his own field research on the agents of fossil accumulation and site formation processes in South Africa. Brain later demonstrated that the pattern of bone damage and skeletal part representation recorded by Dart at Makapansgat was the result of nonhuman modification, particularly accumulation and dietary processing of ungulate carcasses by large carnivores such as leopard or hyena. Furthermore, the blackening of bone was caused by manganese mineral staining. In testing and falsifying the ODK hypothesis, Brain and fellow researchers laid the experimental groundwork for the discipline of vertebrate taphonomy (the study of burial and postmortem processes), which is now a cornerstone of Paleolithic archaeology and the study of early human origins. It is debatable whether this scientific specialism would exist in its present form without Dart’s claims for the ODK.


Plant Use  

Anna Maria Mercuri

Plant use is a familiar word pair that emphasizes how the great wealth of properties and characters of different botanical species has allowed humans to develop different aspects of their culture. On one hand, plants communicate chemically with each other; on the other hand, their wealth of chemical communication tools has attracted humans, who are interested in colors and smells, taste and food, fuel, wellness, and health. The traces of plants buried in archeological sites—the subject of archeobotany—allow us to reconstruct the steps of the relationship between humans and plants. Surprisingly, the study of botanical remains from the past shows complex uses since the very early stages of human cultures, dating back even before the beginning of the Holocene. The relationship with the environment was structured in forms of increasing control, at least from the invention of agriculture (something that had never existed before) onward, taking on complex forms of exploitation in accordance with the different cultures in the different regions of the world. In more recent times, people and plants have also progressively developed a history of greater management and interdependence, including the development of agricultural landscapes, selection of domesticated species, and creation of gardens. The relationship with plants changes as society changes, leading to the loss of much knowledge in present times because of diminished connection and contact with nature. Knowledge and conservation of traditions dealing with plants are studied through ethnobotany, which explores plant use in the present day. Ecology for ecosystem services is the newest perspective on plant use, where perhaps trees return to play a key role in human existence without being cut down, and the green color of chlorophyll returns as a reassuring signal to the human species.


Primate Conservation  

Stacy Lindshield and Giselle M. Narváez Rivera

While anthropological primatology is known for its basic research on understanding the human condition from comparative and evolutionary perspectives, its applied and practicing domains are equally important to society. Applied researchers and practitioners often work in the fields of environmental sustainability and conservation, biomedicine, captive care and management, and education. For sustainability and conservation specializations, primatologists seek careers in higher education, government, and nongovernmental organizations and may work in large and diverse teams on conservation and management problems for nonhuman primates (hereafter, termed primates). Primate conservation has largely focused on population monitoring in protected and unprotected areas; measuring effects of agriculture, extractive industries, and tourism on primates; and evaluating intervention strategies. Primate population management in urban and peri-urban areas is a growth area; these landscapes pose risks for primates that are absent or rare in protected areas, such as dog attacks, animal–vehicle collisions, and electrocutions. Anthropologists can leverage their deep knowledge of primate behavior, cognition, and ecology as part of interdisciplinary teams tasked with environmental mitigation in these human-centered landscapes. One example of this work is the use of arboreal crossing structures for primates to move safely through forests fragmented by roads. Primate conservationists recognize that environmental sustainability extends beyond conservation. For instance, primates may create public health problems or nuisances for local communities in cases where they are potential disease vectors. While these circumstances lead some people to view primates as pests, in a subset of these cases, cultural norms and values prohibit culling (i.e., killing or otherwise removing from a population) as a management strategy. 
Primate conservationists working on these issues may integrate human perspectives and attitudes toward primates in localized intervention or mitigation programs aimed at environmental sustainability and/or natural resource management. More than half of the world’s nonhuman primate species are threatened with extinction, and this problem is mostly a modern and global phenomenon related to unsustainable land use. Primates enhance many societies by providing ecosystem services, enriching cultural heritage, and advancing scientific research. It is for these reasons that primatologists often contribute to conservation programs in protected areas. Protected areas are designed to allow wildlife to flourish by restricting land use activities, but the history of protected areas is fraught with social injustices. Such areas are often but not always associated with higher biodiversity than adjacent and unprotected spaces. People and primates have shared spaces since time immemorial, often in sustainable ways. In addition, allocating a majority of primate range areas to fortress-style protection is at odds with the economic growth models of some primate range countries (i.e., nations with indigenous wild primates). Furthermore, many primatologists recognize that conservation benefits from integrating social justice components into programs with the ultimate goal of decolonizing conservation. Primate conservation continues to build on the foundation of basic and applied research in protected areas and, further, contributes to the development of community conservation programs for environmental sustainability. Examples of these developments include participating in offset and mitigation programs, introducing ethnographic methods to applied research to evaluate complex social processes underlying land use, and contributing to the decolonization of primate conservation.


Water Security and Scarcity  

Amber Wutich, Melissa Beresford, Teresa Montoya, Lucero Radonic, and Cassandra Workman

Anthropological thinking on water security and scarcity can be traced through four scholarly approaches: political ecology of water scarcity, water insecurity, water economics, and human-water relationality. Political ecologists argue that water scarcity is a sociopolitical process and not necessarily related to physical water availability. The political ecological approach is concerned with power, global-local dynamics, and how water scarcity is unevenly distributed within and across communities. Water insecurity research is concerned with how injustice and inequity shape household and individual variability in water insecurity. Inspired by biocultural research, water insecurity scholars have used systematic methods to advance theories of how water insecurity impacts mental health, food insecurity, dehydration, and other human biological outcomes. Economic anthropologists explore how economic dynamics—including formal and capitalist economies, noncapitalist and hybridized economies, reciprocity, social reproduction, and theft—shape water scarcity and insecurity. Research priorities in economic anthropology include water valuation, meanings of water, and water as an economic good. Building from Indigenous scholars’ insights, relational approaches argue that humans have reciprocal obligations to respect and care for water as a living being. Water justice, these scholars argue, requires restoring human-water relations and upholding Indigenous sovereignty and self-determination. All four of these research areas—scarcity, insecurity, economics, and relationality—are producing cutting-edge research, with significant implications for research agendas in the anthropology of water security and scarcity.