The Middle Stone Age (MSA) is a period of African prehistory characterized by the production of flake-based assemblages, often with a focus on stone points and blades made using prepared core reduction techniques. The MSA follows the Earlier Stone Age and precedes the Later Stone Age, although the boundaries between these periods are not as sharp as originally defined. The MSA is generally regarded as having started by at least 300 thousand years ago (ka) and lasted until roughly 40 to 20 ka. Identifying the chronological limits of the MSA is challenging because some aspects of MSA technology are found in assemblages outside this time range that also contain Earlier or Later Stone Age-type tools.
The earlier part of the MSA is associated with fossils belonging to the Homo sapiens clade (alternatively referred to as Homo heidelbergensis, Homo rhodesiensis, or archaic Homo sapiens). The later part of the MSA, after 200 ka, is associated with Homo sapiens. Determining the processes underlying the anatomical evolution of Homo sapiens during the MSA is a major aim of ongoing research; however, fossil remains are rare.
Across the African continent and through time, the MSA exhibits a high degree of variability in the types of stone tools that were manufactured and used. Archaeologists have used this variability to define several technocomplexes and industries within the MSA that include, but are not limited to, the “Aterian,” “Howiesons Poort,” “Still Bay,” and “Lupemban.” Variation in the styles of points, which were presumably hafted to wooden handles or, in some cases, used as projectiles, is considered a hallmark of the regional diversification that originates in the MSA. This variability, which is temporally and spatially restricted, differs in both degree and kind from that of the preceding Earlier Stone Age.
The MSA is significant from an evolutionary perspective because, in addition to being associated with the anatomical origins of Homo sapiens, this period documents several significant changes in human behavior. Populations in the MSA practiced a foraging economy, were proficient hunters, and began efficiently and systematically utilizing aquatic resources such as shellfish and freshwater fish for the first time. Other significant changes include the elaboration of, and increased reliance on, symbolic resources and complex technologies. For example, the first known externally stored symbols, in the form of crosshatched incised pigments, date to ~100 ka. In contexts of similar age, shell beads used to make jewelry have been recovered from Morocco and South Africa. The earliest evidence for complex projectiles dates to at least 74 ka. The meaning, utility, and persistence of symbols and complex technologies depend on social conventions and confer advantages in contexts that involve long-distance, complex social networks. While many of these earliest finds linked to behavioral modernity have so far been geographically restricted, the combined suite of genetic, fossil, and archaeological evidence may better support a polycentric African origin for Homo sapiens over the course of the MSA.
Article
Kendra A. Sirak, Elizabeth A. Sawchuk, and Mary E. Prendergast
Ancient DNA has emerged as a powerful tool for investigating the human past and reconstructing the movements, mixtures, and adaptations that have structured genetic variation throughout human history. While the study of genome-wide ancient human DNA was initially restricted to regions with temperate climates, methodological breakthroughs have now extended the reach of ancient DNA analysis to parts of the world with hot and humid climates that are less conducive to biomolecular preservation. This includes Africa, where people harbor more genetic diversity than can be found anywhere else on the planet, reflecting deep and complex population histories. Since the first ancient African genome was published in 2015, the number of individuals with genome-wide data has increased to nearly 200, with greater coverage of diverse geographical, temporal, and cultural contexts. Ancient DNA sequences have revealed genetic variation in ancient African foragers that no longer exists in unadmixed form; illuminated how local-, regional-, and continental-scale demographic processes associated with the spread of food production and new technologies changed genetic landscapes; and discerned notable variation in interactions among people with distinct genetic ancestries, cultural practices, and, likely, languages. Despite an increasing number of studies focused on African ancient DNA, multiple regions and time periods have yet to be explored. Research to date has primarily focused on the past several thousand years in eastern and southern Africa, setting up northern, western, and central Africa, as well as deeper time periods, as key areas for future investigation.
As ancient DNA research becomes increasingly integrated with anthropology and archaeology, it is advantageous to understand the basic methodological and analytical techniques, the types of questions that can be investigated, and the ways in which the discipline may continue to grow and evolve. Critically, the growth and evolution of ancient DNA research must include attention to the ethics of this work, both in African contexts and globally. In particular, it is essential that research is conducted in a way that minimizes the potential for harm to both the living and the dead. Scientists conducting ancient DNA research in Africa especially must also contend with structural challenges, including a lack of ancient DNA facilities on the continent, the extensive fragmentation of African heritage (including ancient human remains) among curating institutions worldwide, and the complexities of identifying descendant groups and other stakeholders in the wake of colonial and postcolonial disruptions and displacements. Ancient DNA research projects should be designed in a way that contributes to capacity building and the reduction of inequities between the Global North and South to ensure that the research benefits the people and communities with connections to the ancient individuals studied. While ensuring that future studies are rooted in ethical and equitable practices will require considerable collective action, ancient DNA research has already become an integral part of our understanding of African population history and will continue to shape our understanding of the African past.
Article
Southern Africa’s past five thousand years include significant shifts in the peopling of the subcontinent. Archaeological approaches tend to characterize this period in terms of these changes: the appearance of herding and food production on a landscape that had previously hosted only hunting and gathering, the arrival of new and competing worldviews and settlement systems, the local development of complex and state-level societies that involved multiple groups, the arrival and eventual colonization of the region by European settlers, and the segregation, imbrication, articulation, and creolization of various identities. In studying this phase, it is often viewed as a series of “wholes” that share space and time. These “wholes” are usually identity groups: foragers, herders, farmers, or colonists.
While these groups are regularly kept separate, archaeological remains and historical records more often indicate interdigitating and fluid social entities that interacted in complex ways. However, the past is frequently constructed around rigid concepts of people that usually reflect contemporary groups to some extent. Understanding past identities is historically contingent and rooted in contemporary approaches, methods, and frameworks. This is no different in the mid- to late Holocene of southern Africa, which also involves the construction of pasts and peoples associated with non-colonial communities. Identity has played a significant role in how the past is formed: in building sequences, interpreting material culture, and assigning change to migrations and movements within the subcontinent. Archaeologists regularly grapple with issues involving identity, including the influence of colonial writings, the impact of social contacts, and the relationship between past and present people. Taxonomizing the archaeological past according to ethnic groups and subsistence practices has led to intense and frequent discussion and debate. The nature of identity, however, is hard to define and to disentangle from the influence of Western ontologies of being and community. Archaeologists are therefore forced to orient themselves betwixt and between the past and the present to more accurately reflect past peoples.
Article
Erich Fisher
Computational and digital technologies have fundamentally transformed archaeological practice.
Archaeologists routinely use computers and the internet for digitally recording, archiving, displaying, and communicating archaeological knowledge and ideas. Many governmental and funding agencies even stipulate that primary data acquired through grant funding now must be made publicly accessible through digital data archives.
Archaeoinformatics is the use of computational and digital technologies to analyze, archive, and disseminate archaeological records, along with the locations, contexts, and characteristics of the materials that embody those records. The strength of archaeoinformatics, though, lies not in the ubiquitous use of computers or other digital technologies but in the integrative framework that these technologies provide for intrinsically interdisciplinary studies of complex archaeological problems. This integrative framework is sustained by adapting knowledge and methods from other disciplines. As a result, archaeoinformatics specialists are often skilled at traversing disciplinary boundaries, and archaeoinformatics can therefore be considered a unifying science: it bridges disciplines via a digital platform, allowing researchers to tackle complex research questions using multipronged research strategies.
Article
Geeske Langejans, Alessandro Aleo, Sebastian Fajardo, and Paul Kozowyk
An adhesive is any substance that bonds different materials together. This broad definition includes materials used in everything from hafted stone tools to monumental architecture. In addition, the combination of bonding, plasticity, and insolubility meant that some adhesives were exploited for waterproofing and sealing materials, as self-adhering inlays and putties, and as paints, varnishes, and inks. Adhesives have a history of at least 200,000 years. Throughout (pre)history and around the world, people used materials including bitumen/asphalt, carbohydrate polymers such as starches and gums, natural rubbers, mortars, proteins (from casein, soy, blood, and animal connective tissue), insect and plant resins, and tars made from various barks and woods. Adhesives are thus very diverse and have widely varying properties: they can be tacky, pliable, elastic, brittle, water-resistant, fluid, viscous, clear, dark, and much more. They are a plastic avant la lettre. These properties could be, and were, tweaked by mixing ingredients or by further processing. In the study of archaeological adhesives, characterization is essential, and this is best done with chemical and spectroscopic methods. When larger coherent samples, as opposed to single finds, are analyzed, adhesive studies can provide data on past technologies, socioeconomic organization, and environments and raw material availability. Through the sourcing and mapping of ingredients and adhesive end products, the travel and transfer of materials and knowledge can be illuminated. Additionally, experimental reproductions provide data on technological aspects that are otherwise lost in the archaeological record. An archaeology of adhesives can reveal the transport networks, subsistence and mobility strategies, division of labor, and technological know-how that held societies together.
Article
Archaeologies of the recent and contemporary world represent a relatively young movement within Africa. Rather than being defined by a particular chronology, this movement is often characterized as concerned with investigating the practice of archaeology itself, especially its politics and its understanding of time. The small but growing body of literature in this subfield is reviewed both to highlight a moment of disciplinary innovation and to reflect on what modifications of methodology, ethics, and theory are necessary to adapt an intellectual movement developed in other parts of the world for the African continent. These include an emphasis on foregrounding African knowledge systems, especially diverse experiences of time and materiality; the potential for co-creation of data through relationships between these and Western ways of knowing; and mixed research methods. Themes such as time, materiality, and reflexivity are considered in contexts across the continent, as well as where archaeologies of the contemporary world overlap or exist in tension with related moves in cognate African Studies fields.
Article
Pauline Chiripanhura, Ancila Katsamudanga, and Justen Manasa
Throughout history, communicable diseases have impacted humanity. If present experiences are any indication, diseases must have had a significant impact on transforming the economic and social organization of past communities. Some aspects of what is regarded as normal modern human behavior must have emanated from responses to diseases, especially epidemics and pandemics. Unfortunately, few studies have been conducted in this area of archaeological investigation to shed more light on the influence of diseases on past communities. This is especially so in African countries such as Zimbabwe, where the history of pandemics stretches only as far back as the beginning of colonialism, less than 200 years ago. Although the earliest world epidemic was recorded during the 5th century, it was not until 1918 that Zimbabwe recorded its first incidence of a worldwide epidemic. There is little knowledge of how precolonial communities were affected by global pandemics such as the Black Death, the bubonic plague, and similar occurrences. It has to be noted that global pandemics became more threatening as society made the shift to agrarian life around 10,000 years ago. This has led many scholars to regard the adoption of agriculture as the worst mistake in the history of the human race, arguing that the creation of more closely connected communities gave rise to infectious diseases and presented these diseases with the chance to grow into epidemics. Diseases such as influenza, smallpox, leprosy, malaria, and tuberculosis are among those that have thrived since this shift. With its long human history, Africa is well positioned to shed light on the occurrence of global pandemics as well as their distinct impacts on communities living in diverse social, economic, and natural environments. As such, it is important to explore the study of diseases, especially epidemics and global pandemics, to augment the worldwide knowledge generated from other continents. This knowledge should also be juxtaposed with what is already known about changing social, economic, and political developments to assess the potential impacts that these pandemics had on the human past. The history of migration should be viewed as a potential history of the spread of new diseases. For all known pandemics, the South African coast has served as the major corridor of transmission into Zimbabwe. Archaeologically, however, it is known that migrations were mostly overland from the northern and eastern regions. It is worth exploring how the spread of diseases could have differed when overland movements of people, rather than coastal ports, were the nodes of transmission. Since there are few documentary sources to help in the comprehension of past outbreaks in the precolonial period, archaeological evidence becomes key. Without doubt, human skeletons represent the most ubiquitous source of information on ancient diseases. Zimbabwe has remains that stretch from the Stone Age to historical times. Paleopathology is an underdeveloped discipline in southern Africa, but with increased awareness of the possible presence of various diseases in prehistory, it is expected to grow.
Article
Ahmed Adam and Shadia Taha
Sudan is a vast country marked by heterogeneity and diversity in its climates, topography, natural features, cultures, and people. Sudan’s multiplicity of cultures and communities is steeped in a history and heritage as remarkable as anywhere else in the ancient world and the rest of Africa. Despite this, Africa’s heritage has been overlooked for centuries as a result of prejudice and stereotyping. The 19th and 20th centuries were characterized by supposition and a fixation on an external origin of African civilizations, a focus based on European ethnocentrism and a sense of racial superiority. In common with the rest of Africa, archaeology in Sudan was founded during the colonial period and, to a large extent, has remained unchanged, retaining past management and interpretative approaches and influencing current practices and planning policies. Sudan’s rich and outstanding heritage, home of the first great civilizations in Sub-Saharan Africa, was frequently overlooked. When discussing the civilizations of the Nile Valley, many historians and archaeologists focus entirely on the role of Egypt. Ancient civilizations in Sudan were consistently interpreted as the work of colonizers and were believed to be less advanced than Egyptian civilizations. The building of the Aswan High Dam threatened the lives of Nubians and their heritage. It necessitated the forced displacement of Nubian and Bushareen nomadic tribes from their homelands and submerged considerable heritage. Nonetheless, this was the first time an organized survey was undertaken in Sudanese Nubia. The rescue campaign provided archaeological evidence and replaced ethnic prehistory with new theories.
Archaeology in Sudan underwent a dreadful experience throughout the thirty years it was under the governance of the ousted dictatorial regime. The government in power from 1989 to 2019, an autocratic regime with a different political ideology, took control of Sudan’s heritage. Together with an oil boom, rapid modernization, urbanization, and unrest in the country, these factors had a tremendous impact on archaeology and heritage and on the operation of the National Corporation for Antiquities and Museums (NCAM). Moreover, the military forces, which used archaeological sites as military bases, took control of and demolished significant heritage and disconnected local communities from their heritage.
From the 1980s, the number of Sudanese archaeologists and departments of archaeology increased. This period witnessed an expansion in the research projects, themes, topics, periods, methods, and regions explored by Sudanese and foreign teams. There has been a move away from focusing on single sites toward understanding and exploring past environments and landscapes using new scientific methods of investigation. Multiple challenges lie ahead, including climate change (flooding, destratification, shifting sands), globalization, mega-developments, a lack of sufficient funding and resources, and, most recently, Covid-19. These are complex issues to deal with, especially for poor countries. Development and unrest in Sudan continue to force communities to move from their homelands and threaten the loss of traditional knowledge, diversity of culture, and connectedness with the land.
Article
Liza Gijanto
Considered within the broader corpus of studies of food and foodways, feasting in African archaeological contexts has not been reported to the same degree as in other world areas. The reasons for this could include a genuine lack of feasting practices in African contexts as well as a focus on feasts as empowering events in more hierarchical societies. Where feasting has been identified, it has been with the aid of documentary or oral sources. Most of these studies focus on locations and time periods of interoceanic trade in west and east Africa. Feasting has been identified in these contexts by utilizing multiple lines of material evidence, including ceramics, fauna, and items related to leisure activities, such as pipes, that would be part of a large celebration. In some cases, the evidence is limited to location and fauna.
Article
Emuobosa Akpo Orijemie
Food production constitutes a way of life that involves the management of plant and animal resources and related products to ensure food security for the sustenance of a society. Primary data in support of this way of life are drawn mainly from archaeobotanical (micro and macro) and linguistic evidence from the region. Food production began with the hunting of game and the gathering and management of diverse wild fruit trees and vegetables. The strategy adopted during the early period was garden-based agroforestry, or something synonymous with it. Garden-based agroforestry, a process that involves the management of wild fruit trees and vegetables as well as animals, is a deliberate and conscious anthropogenic modification of the immediate environment of a people to achieve food security. This strategy was indigenous to the region, began before the practice of cereal-based subsistence, and did not require environmental changes or forest crises to stimulate its existence. The diversity of the archaeological data and the regularity of their occurrence indicate that these peoples experienced significant levels of food security in the past.
Article
Luca Maria Olivieri
The cultural context in which the term “Gandhāra” is used initially refers to Vedic geography and then to the administrative limits of the homonymous Achaemenid satrapy.
The most reliable information concerns the Middle Holocene, when the Gandhāran region must have experienced a climatically optimal phase during which domesticated rice was introduced to Kashmir and Swat through the trans-Himalayan corridors (early 2nd millennium bce or earlier). Toward the end of the 2nd millennium, northern Gandhāra features a rather coherent settlement phenomenon marked by large graveyards, mainly with inhumations, which were labeled by previous scholarship as the “Gandhāra Grave Culture” (1200–900 bce). Among the major cultural markers of this phase, the introduction of iron technology is noteworthy.
The historic phases in Gandhāra are marked by an initial urban phase (500–150 bce), sometimes referred to as a “second urbanization,” on evidence mainly from Peshawar, Charsadda I, Barikot, and Bhir Mound (Taxila I). Mature urban phases (150 bce–350 ce) are defined on the basis of the restructuring of old cities and new urban foundations during the phases of contact historically defined by the Indo-Greek and Śaka dynasties, followed by the Kushans (Peshawar, Charsadda II, Barikot, Sirkap, or Taxila III). The artistic phenomenon known as the Buddhist “art of Gandhāra” started toward the end of the 1st century bce and lasted until the 4th century ce. The beginning of this art in that period is best attested in Swat, where schist of exceptional quality is widely available. At the beginning of the 1st century ce, the iconic and figurative symbols of Indian Buddhism acquired a narrative form, which is the major feature of the Buddhist art of Gandhāra. The subsequent art and architecture of Buddhist Gandhāra feature large, richly decorated sanctuaries and monasteries, documented in several “provinces” of Gandhāra throughout the Kushan period, from the late 1st century ce to the mid/late 3rd century ce. In this period Buddhist sanctuaries and urban centers developed together, as demonstrated in the Peshawar valley, in Swat, and at Taxila.
After the urban crisis (post-300 ce), which went hand in hand with the crisis of centralized Kushan rule, stratigraphic excavations have so far registered a significant thinning of the archaeological deposits, with a few exceptions. Apart from coins deposited in coeval phases of Buddhist sanctuaries, and from literary and epigraphic sources, archaeological evidence for the so-called Hunnic or “Huna” phases (c. 5th–7th century ce) is very scarce.
Around the mid-6th century, Buddhist monasteries entered a period of crisis, the effects of which were dramatically visible in the first half of the 7th century, especially in the northern regions of Gandhāra. It is after this phase (early 7th century) that literary sources and archaeology report the existence of several Brahmanical temples in and around Gandhāra. These temples were first supported by the Turki-Śāhi (whose capital was in Kabulistan; end-7th/early 8th century) and then by the Hindu-Śāhi (9th–10th century).
Article
Namita Sanjay Sugandhi
The term “Hindu” derives from Persian expressions coined in the 4th century bce to define the traditions found east of the Indus River. Thus, a common starting point for the archaeological examination of Hinduism is the prehistoric cults found in various regions of the Indian subcontinent. Some elements associated with traditions from the urban Indus civilization of the 3rd millennium bce have been connected to later Hindu iconography and ideals, but these links remain tenuous. By the mid-2nd millennium bce, the introduction of new Vedic ideologies, so called because the earliest references are found in the texts of the Vedas, ushered in significant transformations in ritual and spiritual life but left little material trace. However, migrating groups associated with these traditions have been traced genetically and linguistically to the Western Steppes of Central Asia. Over the next two thousand years, Vedic traditions became more elaborate and heterogeneous, merging with popular customs and generating heterodox schools of thought that challenged both the spiritual and social order of Brahmanical Hinduism, which also took form during this time. The early centuries of the Common Era witnessed additional transformations and adaptations, and it is after this period that various forms of temple architecture, sculpture, and the epigraphic record become a wider body of evidence for study in both South and Southeast Asia. During the 1st millennium ce, Hinduism took on more familiar contours, partly driven by the rise in extant religious, philosophical, and secular literature. Alongside this textual record, a wealth of architectural and art historical sources became available; studies of these sources increasingly look to continuities from earlier eras that are documented archaeologically. Nevertheless, much of this body of knowledge derives from institutional and elite contexts; household-level details remain slim, and much contemporary interpretation of past daily worship continues to be inferred from the ethnographic record. During the modern period, Hinduism came to acquire its formal definition as a world religion, and with this came the attempt to delineate Hindu identity for first colonial, and then national, ends, often in tandem with the Orientalist archaeologies of the early and mid-20th century. Though the definition of modern Hinduism may be more clearly circumscribed, it is certainly no less varied. Modernity continues to shape the understanding of Hinduism in many ways. Technologies such as DNA analysis have been applied to the study of early societies, with the goal of understanding ancient migrations and the composition of different regional populations. While our understanding of past human movement has increased considerably because of these studies, genetics does not serve as a proxy for culture. DNA evidence can provide some details about the movement and interaction of different populations in the past, but categories like race, language, and culture are as incommensurable as they are artificial, and they should be understood as such. Rather than a match for the textual or genetic record, the archaeology of Hinduism should be considered the material study of a broad amalgam of dynamic beliefs and practices that date back to earliest prehistory and continue to transform and evolve around the world.
Article
Chioma Ngonadi
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Anthropology.
Archaeological research began relatively late in southeastern Nigeria compared with other African countries. The site of Igboukwu, despite the remarkable discoveries made there accidentally in 1938, was not investigated thoroughly until 1959. The first systematic archaeological excavations in the region took place between December 1959 and January 1960. The Igboukwu excavations yielded hundreds of glass beads, intricately produced bronze objects, elaborately decorated potsherds, and various iron tools that revealed the artistic ingenuity of the Igbo people. These findings laid a solid foundation for archaeological research in southeastern Nigeria. Subsequently, from 1964 to 1978, human-made tools including hand axes, flakes, cores, polished stone axes, ground stone axes, and microliths were discovered at various locations in the region. At the Lejja, Opi, and Aku iron-smelting sites, evidence of slag blocks, tuyere fragments, furnace remains, iron ores, and potsherds is visible on the surface, suggesting large-scale, intensive iron-working production in the past. These archaeological remains, from stratified deposits, showcase a people with a distinctive past.
Article
Johanna A. Pacyga
The archaeology of missionization in colonial Senegambia is a nascent area of study within the broader historical archaeology of colonialism that explores the historical processes of evangelization and conversion as they were experienced by Senegambian converts. Senegambia was a prominent target of Catholic and Protestant missionaries throughout the 19th and 20th centuries. Archaeology is a uniquely situated discipline for expanding our understanding of missionization beyond historical and anthropological perspectives because, through its focus on material remains, it uncovers the experience of proselytization and conversion from the ground up by illuminating the daily lives of mission residents who are often underrepresented in archival sources: African converts themselves, including women and children. The archaeology of missionization draws on lines of evidence left behind as a robust footprint of religious and institutional architecture, landscape elements, and material culture accessible through archaeological survey and excavation. Furthermore, missionization was deeply rooted in the materiality of everyday life; mission sites should be excavated not simply because they exist, but because missionaries widely considered material practices to be integral to the broader conversion process. The archaeology of missionization interrogates the relationship between the theory and practice of evangelization during the period of colonization and reveals the lived experience of religious conversion among Senegambian mission residents, both neophytes and those who did not embrace Christianity.
Article
Natalie Swanepoel
The late 18th and early 19th centuries in Europe and the United States saw a wave of evangelical revivalism and hence the establishment of a large number of missionary societies that dispersed missionaries across the globe. Southern Africa was viewed as a potentially fruitful mission field, and, as a result, a large number of mission stations were established in the region during the 19th century under the auspices of a wide array of missionary societies, although there are some examples of missionization prior to this. Missionary activity in southern Africa has long been a topic of academic investigation by historians and others but was addressed only sporadically by archaeologists until the second decade of the 21st century, when a critical mass of ongoing mission archaeology projects enabled collaboration and discussion among the scholars concerned. As a result, in the early 21st century, it became an acknowledged focus of southern African historical archaeology. In their study of missions, missionaries, and missionization, archaeologists draw on a diverse toolkit of methodologies, including mapping, landscape survey, geophysical survey, excavation, artifact analysis, rock art analysis, museum collections analysis, and the comparative study of documents, pictorial records, and the archaeological record. Archaeologists have contributed by placing mission sites into their wider landscapes; exploring changing material practices in architecture, clothing, household goods, and burial practices; and studying missionary activity and mission sites in diachronic perspective.
Article
Elinaza Mjema
Archaeological research on natural disasters has increased significantly since the 1970s, with archaeologists paying more attention to the potential cultural effects of natural disasters. In the 21st century, archaeological investigations of natural disasters have become more sophisticated, and researchers have produced substantial literature on the topic. In Eurasia and the Americas, archaeological studies increasingly invoke natural disasters as the cause of socioeconomic transformations in past societies. In East Africa, however, few archaeological studies have yet considered the impact of natural disasters on local communities. As media coverage and research on natural disasters increase globally, East African archaeology is beginning to contribute to the discussion. Preliminary works in East Africa have applied basic concepts from disaster studies to investigate ancient natural disasters that befell early coastal communities in the area. Researchers studying Pangani Bay on the northeast Tanzanian coast, for example, have deduced from archaeological and geological evidence that ocean-originating floods destroyed an early Swahili village there a thousand years ago. Researchers in this new field are focusing on the relationships between natural disasters (floods), their cultural impacts, and human responses to them. Disaster archaeology focused on East Africa is expected to increase significantly, because such research may provide historical records (including strategies people employed to cope with extreme natural events in the past) to inform researchers and policymakers dealing with the impacts of extreme natural events in the 21st century.
Article
Elena A.A. Garcea
The Aterian is a North African late Middle Stone Age techno-complex. It extends from the Atlantic coast in Morocco to the Middle Nile Valley in Sudan and from the Mediterranean hinterland to the southern Sahara. Chronologically, it covers the period between c. 145,000 and 29,000 years bp, spanning discontinuous, alternating dry (end of MIS 6 and MIS 4) and humid (MIS 5 and MIS 3) climatic phases. Few but significant human remains indicate that the makers of the Aterian complex belonged to early Homo sapiens. Their osteological features show affinities with the early anatomically modern human record in the Levant (Skhul and Qafzeh), suggesting that Aterian groups may have taken part in the initial dispersals of Homo sapiens out of Africa. Toolkits consist of a variety of implements made not only of stone but also of bone (points, spatulas, knives, and retouchers). They include tools that were lacking in earlier or other contemporary North African contexts, namely bifacial foliates, blades, perforators, burins, endscrapers, and particularly tanged pieces. Overemphasis on tanged tools has often obscured the complexity of the Aterian, which instead displays a wide range of cultural and behavioral innovations. New mobility patterns and intra-site organization, as well as early symbolism involving the use of Nassariidae shells and ochre, corroborate the early emergence of fully complex behavior among these populations. Given the broad geographic and chronological extension of the Aterian, differences are evident at both local and regional scales. They suggest the development of a flexible and variable techno-complex mirroring considerable adaptive cognitive and behavioral plasticity derived from nonlinear processes. Such diversified behavioral experiments result from multiple and noncumulative trajectories due to different internal and external stimuli but are still part of a single cultural entity.
Article
Jenail H. Marshall and Michele Buzon
Bioarchaeology is the study of human remains within their archaeological and mortuary contexts. Bioarchaeologists use skeletal biology, mortuary practices, and the archaeological record to answer questions about the lives and lifestyles of past populations. The term “Nile Valley” defines a geographic region comprising Egypt and Nubia, the latter encompassing the area between the First Cataract at Aswan, Egypt, and the Sixth Cataract just north of Khartoum, Sudan. Following the river’s flow from south to north, Nubia is sometimes referred to in two parts: Lower Nubia in the north and Upper Nubia in the south. Likewise, Egypt comprises Upper Egypt in the south and Lower Egypt in the north. For over a century, the region has seen many campaigns and salvage projects that have led to the excavation of thousands of skeletal remains from ancient Nile Valley sites.
Analyses of these collections have provided important information about the people, including their health, patterns of disease and trauma, diet, and biological relationships. Early morphological research on the skeletal remains of the people who once lived in ancient Nubia was dominated by biased interpretations stemming from racist paradigms of the early 19th century, including racial typologies. Moving beyond these perspectives, contemporary research on the ancient Nile Valley has drawn on and expanded methodological and theoretical advancements in bioarchaeology more broadly.
The integration of bioarchaeology in the larger context of archaeological projects provides a wealth of information that includes, but is not limited to, health, disease, identity, nutrition, life experiences, and demographic patterns. Likewise, how archaeology is conducted in the region is shifting, reflecting a move toward decolonial and ethical practices within the discipline, including engagement with local communities.
Article
Justin Bradfield
Bone, like other organic materials, featured prominently in the technological repertoires of most historically documented hunter-gatherer communities practising a Stone Age economy. Unlike stone, however, bone does not survive as well archaeologically, resulting in less attention generally being paid to this aspect of material culture. Yet, despite their poorer preservation, bone tools are found in several hominin sites dating to the last two million years in South and East Africa, where two regionally distinct varieties of bone tool occur. Traceological analyses (which comprise use-wear, fracture, and residue analyses) have gone a long way in elucidating the functions of these tools and those from younger periods.
Deliberately modified bone tools are found sporadically at archaeological sites dating throughout the last two million years, but never in large numbers. Bone tools offer us many insights into past cultures and now-vanished technologies. For example, insect extraction, musicality, basket weaving, and garden agriculture were all expressed through the medium of bone. These bone artefacts often constitute the sole evidence for such technologies and their associated behaviors. To this list might be added bow-and-arrow technology, although here there is plenty of confirmatory evidence from lithic and residue studies.
Despite their consistently smaller numbers, bone tools are no less important than their lithic counterparts for understanding aspects of the past, and they have been the focus of several anthropological debates. The degree of similarity in manufacturing techniques, finished-product morphology, and decorative motifs has led some researchers to extrapolate similarities in overarching cultural traditions. But the same similarities are seen in other parts of the world, and even a recurrence of decorative motifs may mean different things to different people at different times. The presence of well-made bone tools in Iron Age sites continues to be seen as evidence for trade between hunter-gatherers and farmers. But without concrete evidence that the bone tools moved from one place to another, such facile interpretations only serve to underplay farmer agency. Apart from trying to work out function, bone tool studies globally are focused on identifying the specific animal species selected to make tools and what such selection strategies might reveal about the symbolic importance of animals in human societies.
Article
Nicolas Nikis
Copper, considered a “red gold,” has had a major place in the political economy of Central Africa over the past two millennia. Copper was a rare resource: its ore was accessible only in a few scattered locations in Central Africa, especially the Copperbelt in southeast Central Africa and the Niari basin in the south of the Republic of Congo. Until the massive imports of European alloys beginning in the 16th century, only unalloyed and leaded copper objects were produced and used in Central Africa. The first instance of copper smelting in the region is dated to around the 5th century ad, much later than for iron, and the metal has mainly been used over time as a means of exchange, for jewelry, and as a material for artworks and the decoration of objects. Different techniques have been used over time and space to produce the metal and manufacture objects, some of them closely related to iron metallurgy. Smelting took place close to the deposits, and diverse processes relating to sociohistorical factors have been identified. Ingots, produced on the smelting sites, were one of the preferred forms for exchange, acquiring in some cases symbolic and/or monetary value. The manufacture of objects could take place far from the smelting sites. Because copper and brass can easily be recycled, metal regularly changed shape to fit local needs and tastes.
From the late 1st millennium ad, copper was exchanged over increasingly long distances in regional networks and, eventually, traded to the Indian and Atlantic Ocean coasts. Rising polities, such as the Kongo Kingdom in the 15th century, would have benefited from access to this resource. More broadly, copper was regularly associated with the expression of power and wealth but was also accessible to a large number of people. In addition to the economic value of copper, metalworking and the figure of the smith were closely associated with power. Copper’s physical properties, such as color and brightness, were also important in its choice as a material for artworks, supporting and enhancing the role of the object.