

Anorexia, Bulimia, and the Embodiment of Capitalist Consumer Culture  

Alice Weinreb

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article. Industrial capitalism has been linked with the 20th-century rise of eating disorders in two major ways, one school of thought focusing on the late 19th-century emergence of anorexia nervosa, the other emphasizing the dramatic explosion in diagnoses that took place in the 1970s and 1980s. The first interpretation comes from scholars who study the late 19th century. Literary theorists of modernism have linked modernist aesthetics and textual strategies with anorexia nervosa. They highlight persistent themes of asceticism, sleekness, pared-down-ness, disgust with the body, and self-starvation, as well as a distinct aesthetic of language use, suggesting that modernist writers reject words just as anorexics reject food. Historians of this era, in turn, have focused on the rise of a Victorian model of bourgeois femininity, which idealized frailty and self-denial in terms of both gender and class. The explosion in eating disorders during the second half of the 20th century inspired a quite different way of conceptualizing the relationship between industrial capitalism and eating disorders. In this analysis, developed under the auspices of second-wave feminism and most famously articulated in philosopher Susan Bordo’s influential essay “Anorexia Nervosa: Psychopathology as the Crystallization of Culture,” eating disorders expressed the relationship between consumer culture and women’s bodies. In this interpretation, postwar capitalism specifically was cast as the cause of anorexia and bulimia. The analysis hinged upon the paradoxical meaning of consumption in postwar capitalism, which was seen as the cause of, and also as symbolized by, the deadly self-denial of the anorexic and the irrational gorging and purging of the bulimic.
Eating disorders thus expressed the specifically gendered and destructive impacts of late modern capitalism on the female body, combining the demand for unbridled consumption and individual empowerment with expectations of female self-denial and physical smallness.


Calorie Counting  

Nina Mackert

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article. In the 21st century, calorie counts are ubiquitous in weight-loss advice, but historically they are a fairly recent phenomenon. In the late 19th century, a transatlantic coalition of researchers and reformers began to understand food as energy for human motors that could help to optimize human and social productivity, even on a global scale. It was not until the early 20th century that calorie counting was introduced into weight-loss diets. This contributed to the emergence of fat shaming by suggesting that an individual’s excess body weight was the direct and causal result of eating more than their caloric needs, placing the responsibility for their body and health in the hands of that individual. Recent diagnoses of “the death of the calorie” confirm that the history of the calorie is a political one, in that they perpetuate the calorie’s legacy of framing weight loss as a capability of self-governing citizens.


Social and Biocultural Dimensions of Children’s Food  

Tina Moffat

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] Children’s food cannot be defined without historical and cultural context. Apart from breastmilk, infant formula, and prepared weaning foods, “children’s food” did not exist until the early 20th century with the rise of industrialized food. Although children, with their rapid growth and development, need on average more nutrients per volume than adults, they do not require special foods per se. At the beginning of the 20th century, with nutrition sciences’ discovery of vitamins and rampant malnutrition among the poor, food manufacturers convinced parents they should feed their children supplements in the form of foods such as cod liver oil and yeast cakes. This later evolved into the marketing of foods like breakfast cereals—highly palatable, sugar-sweetened food products that simultaneously had a halo of health due to their nutrient fortification. The rise in child obesity in the 2000s prompted attention to unhealthy food products and advertising that marketed them specifically to children. Although the topic is well established in the field of public health, the social sciences’ interest in children’s food did not begin until the 1990s, with the broader recognition that studies of children were largely absent from those disciplines. It was about this time that food studies began to address the social, cultural, and economic dimensions of children’s food in a globalizing world. Despite this shift, there are, to date, surprisingly few studies of the cross-cultural and social dimensions of children’s food. In the works that do exist, researchers explore the effects of specific economic contexts such as the one-child policy in China, the influence of French and American cultures on caregiver attitudes to children’s food, and how nourishment and care are influenced by neoliberal policies.
School food and the state’s involvement have received more attention, with several historical accounts of the US National School Food Program and case studies of global school food programs. More common is research on child food insecurity, driven by the continuing urgency of this problem into the 21st century. This field is still dominated, however, by global health studies; there remains a dearth of social studies of the lived experiences of food insecurity from children’s perspectives. This may be due to the challenges of doing research with children, who can be difficult to engage, in part because of their classification as vulnerable persons. While there are barriers to doing research with children about children’s food, it is an emerging field with both academic and applied benefits.


Class Mobility and Occupational Change  

Alex Korsunsky

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] Racial hierarchies have defined US agriculture from the beginning, structuring access to land and imposing stark social boundaries between farmers and farmworkers. Farmworkers have generally been those whose racialized identities excluded them both from other economic opportunities and from the full protection of the law, which largely—but not entirely—curtailed opportunities for upward mobility within agriculture. In California and in other West Coast states, some Asian immigrant farmworkers in the late 19th and early 20th centuries succeeded in establishing themselves as orchardists or truck farmers, growing labor-intensive fresh produce for nearby urban markets. As agribusiness increasingly shifted toward Mexican farm labor in the early 20th century (a trend that accelerated and expanded geographically from the Second World War onward), Mexican workers’ immigration status as Bracero guestworkers or undocumented migrants often kept them mobile and isolated from surrounding communities. By the late 1970s and 1980s, Mexican farmworkers were increasingly settling permanently, most notably in California, where agricultural intensification created new demand for year-round labor. Many of these newly settled workers moved into farming as sharecroppers and tenant farmers, drawing on family labor to work the same labor-intensive crops that had been favored by earlier Asian immigrant farmers (particularly strawberries). Tenant farming and sharecropping arrangements offered limited autonomy and displaced risk from landowners onto immigrant farmers.
Alongside these precarious farmworker-to-farmer transitions, settled farmworkers also experienced opportunities for economic and social mobility by transitioning from seasonal to year-round jobs, occupying more specialized and responsible roles (pesticide applicator, irrigation specialist, foreman), founding contracting businesses, and establishing nonfarm businesses to supply growing immigrant communities. Reliable, systematic, quantitative data on the number of farmworkers becoming farm owners and/or operators is not available, making it difficult to track the growth of this phenomenon, which appears in the scholarly literature primarily in regional studies, most often focused on California. While numbers have ebbed and flowed along with commodity and agricultural real-estate markets, the number of Mexican ex-farmworkers farming in the United States seems to have grown significantly since the 1980s in places where farmworkers have had opportunities to settle permanently, especially among documented immigrants, including beneficiaries of the 1986 Immigration Reform and Control Act’s amnesty provisions for agricultural workers. From the late 2010s onward, scholars noted the rise in farmworker-to-farmer transitions among Mexican immigrants beyond the West Coast, and identified farmworker-to-farmer transitioners as a promising source of talent amid the general aging of the US farm population. These scholars emphasized immigrant farmers’ specialization in mixed-vegetable production and direct-to-consumer sales, lower use of synthetic inputs, and noncommercial motivations as indications that their cultural backgrounds and negative experiences in agroindustry led them to more sustainable, “alternative” forms of agriculture. However, many of these features are also explicable as results of undercapitalization, exclusion from mainstream markets, and lack of secure access to adequate farmland.
Alternative and nonprofit-affiliated growers likely receive scholarly and media attention disproportionate to their numbers within the farmworker-to-farmer population, further complicating efforts to characterize these farmers’ approach to agriculture.


Colonial Era Food and Spice  

Amanda E. Herbert

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] The growth of the British colonial system brought big changes to British diets and to the spice and flavor of British food. Britain’s actions in invading, colonizing, and settling the Atlantic, Pacific, and Indian Ocean worlds gave British people, both at home and abroad, access to many new ingredients. In 1972, historian Alfred W. Crosby identified the early modern global exchange of animals, plants, and—crucially and tragically—diseases and pathogens as the “Columbian Exchange,” a cataclysmic biological moment instigated by Christopher Columbus, in which separate biomes came into sustained contact for the first time. Scholars have since expanded the scope of this study, with Judith Carney, Edda Fields-Black, and Jessica B. Harris, among many others, drawing critical attention to the fact that the Columbian Exchange was a global phenomenon, and that while this was a moment of wonder and curiosity for some, it meant utter devastation for many others. In this globalizing early modern world, the rapid and widespread movement of people, plants, and animals changed the ways that British Atlantic people flavored their food. From the Americas, cooks gained access to vanilla, chili, and new kinds of palm oil. From the African continent, they learned of peppermint, cottonseed oil, coffee, and sesame. And from Asia, cinnamon, ginger, pepper, nutmeg, and mace became staples in British Atlantic kitchens. Some of these spices and flavoring agents would have been familiar—cinnamon had been traded across the continents of Europe, Africa, and Asia for hundreds of years—but others would have been novel. And all of them were now much more widely available, and were combined, altered, and adapted in fresh ways.
For women and men in the British Atlantic world, this meant that foods like “pickled mango,” an entirely mango-free dish spiced with ginger, cinnamon, nutmeg, and chilis, now had a spot at the table. It also meant that many different kinds of people, including Black women and men, Indigenous communities, and white settler-colonizers, seasoned their foods in new ways. Made possible by access to global markets, facilitated by invasion and colonization, and undergirded by enslavement, British Atlantic foodways were both piquant and experimental.


Culinary Tourism  

Alicia Kennedy

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] Culinary tourism has been defined as “the intentional, exploratory participation in the foodways of an other.” The culinary tourist is understood as an individual who actively constructs meaning in their experience through an aesthetic appreciation of food. When one travels, one eats, but this does not always mean partaking in culinary tourism. By definition, culinary tourism entails an intentional and exploratory aspect. It has become a popular way of interacting with the world, whether through actually travelling, using social media to learn about foodways, cooking dishes from cultures other than one’s own, or watching food television. Going to a grocery store in a new area is also considered culinary tourism, according to the definition set forth by Lucy M. Long in Culinary Tourism. The multifaceted nature of culinary tourism requires a multidisciplinary approach that draws from anthropology, media studies, history, and food studies. In the 21st century, food has been used as a tool of national soft-power interests—chiefly in places such as Denmark and Peru—and the enticement of the culinary tourist has been part of this political work. A more contemporary understanding of culinary tourism would explore the use of food as a tool of soft power, where desire and exploration are somewhat democratized by social media but political-economic power and Western interests still dominate the ways in which a cuisine is understood as worthy of intentional, exploratory participation.


The Ethics of Veganism and Plant-based Diets  

Carlo Alvaro

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article. Humans have been consuming animal flesh and other animal products for at least 2.6 million years. Still today, most human beings eat meat and animal products. Ethical veganism is often touted as a growing lifestyle, an assertion supported by the exponential growth of the plant-based food market. After almost fifty years of animal rights activism, and with the current popularity of plant-based products, one would expect to see an increase in the percentage of people who follow vegan diets. Regrettably, the devil is in the details. The actual number of people who consume vegan diets is not necessarily increasing. Ethical veganism is not necessarily growing, but rather devolving. For example, while there is no exact figure for the percentage of people who follow vegan diets in the US as of 2023, data from various sources show that 4% of Americans identify as vegetarian, and only a (disappointing) 1% as vegan. Moreover, many countries are increasing meat and animal product consumption. In the US, per capita meat consumption continues to be among the highest in the world.


Fat Studies  

Azita Chellappoo

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article. Research into the social and cultural dimensions of body size has been rapidly growing, including within the interdisciplinary field of fat studies. Fat studies scholars contest and critique mainstream ideas about body size or weight, including the view that “obesity” is fundamentally pathological or inherently unhealthy, or that individuals can control their body size. Researchers have also challenged negative stereotypes and highlighted the harms of moralizing and stigmatizing discourses about larger bodies. Research within this area often draws upon ideas produced within fat liberation activist movements, including rejecting the term “obesity” as overly medicalizing and pathologizing, and instead reclaiming “fat” and “fatness” as neutral descriptors. Research has identified and characterized the pervasiveness of anti-fatness (weight stigma, or anti-fat discrimination) within and across societies. Anti-fatness has interpersonal and structural dimensions, both of which can have far-reaching effects on fat people’s lives, including effects on health outcomes. Scholars have traced the historical emergence of anti-fatness, the connection between anti-fatness and racial hierarchy, and the ways in which anti-fatness contributes to the moralization of food and the prevalence of dieting practices. Additionally, policies and clinical guidelines developed to address the “obesity epidemic,” such as calorie labeling and the use of body mass index (BMI), have been a target of critical examination. Researchers have suggested that these interventions can function as potentially harmful practices of control and surveillance that reinforce anti-fat attitudes and structures. 
Similarly, frameworks and understandings of food environments, food justice, and health have been evaluated in terms of the ways in which these discourses can continue to problematize fat bodies and contribute to discrimination or marginalization. There has been increasing recognition of the harms of anti-fatness in fields such as public health, which has led to attempts to move away from stigmatizing language and interventions, and to reject placing responsibility on individuals. However, these attempts have been criticized for continuing to medicalize and pathologize fatness, and therefore continuing to perpetuate harm. As the language and policies around “obesity” or fatness shift, in part driven by the introduction of novel weight-loss interventions such as Ozempic, research into social attitudes toward fatness and the experiences of people with larger bodies continues to evolve.


Food and Religious Rituals  

Jonathan Brumberg-Kraus

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] Food rituals, whether articulated intentionally or performed unconsciously in our biologically necessary acts of eating, do nothing less than construct and maintain people’s fundamental relationships in the world and define who or what they are in it. In that sense, it might be said that all food rituals are religious, although that depends on very specific definitions of ritual and religion. One should distinguish between rituals in the weak sense (habitual patterned behaviors performed unconsciously) and rituals in the strong sense (performed with explicit, conscious intention, as in the work of J. Kripal). However, all rituals are performances of myths, that is, the basic stories people live by, whether or not practicing them makes people’s intentions explicit. Food rituals are “religious” in that they govern and express the fundamental relationships people have in the cosmos: who or what they eat, with whom they eat, and for whom they are “food.” Food rituals create and sustain worldviews, and so are all fundamentally “religious” or “religion-like.” To distinguish between the way critical comparative scholars of religion use the terms religion and religious and their use in common parlance, it makes sense to underline that “religious food rituals” normally refers to food rituals in the strong sense. Thus, religious food rituals often involve specific words or scripts (eating and talking, eating and reading), as well as other nonverbal cues and modes of paying attention: music, costumes, special props, accentuated or exaggerated gestures, and designated authoritative officiants.
For example, the Jewish Passover seder, Christian communion and Lenten fasting, Aztec human sacrifice, Muslim observance of halal rules and Ramadan fasting, Jain or Buddhist vegetarianism, and many forms of Hindu puja are rituals in the strong sense. Examples of food rituals in the weak sense are secular veganism, shopping for food in grocery stores, WeightWatchers dieting, or eating meals in a breakfast–lunch–dinner sequence (Mary Douglas). These rituals imply certain assumptions about people’s relationships to animals and plants, capitalist consumer culture, ideals of beauty and well-being, and identification with special social groups (e.g., family, national cultures, geographic regions). In other words, they too are enactments of the stories people live by.


Genocide and Food in Postcolonial Narratives  

Jonathan Bishop Highfield

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] Two of the five acts defined as genocide by the United Nations Genocide Convention of 1948 are causing serious bodily or mental harm to members of the group and deliberately inflicting on the group conditions of life calculated to bring about its physical destruction in whole or in part. The material erasure of foodways and food systems under colonialism and the representational erasure of those same foodways and food systems from the historical record serve as genocidal elements designed to destroy the culture of colonized populations. Descriptions of land management, food preparation, and shared meals in colonial memoirs and postcolonial narratives can uncover fissures in established histories and reveal past and ongoing resistances against those erasures. Examining narratives from around the Indian Ocean world reveals a common colonial imperative to alter people’s relationship with food, the persistent effects of that colonial intervention, and endeavors by artists, writers, and storytellers to undo the colonial constructions.


The History and Roots of Tea  

Matthew Mauger

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article. Tea’s modern ubiquity as a drink imbibed and (increasingly) cultivated around the world belies its origin as a plant – typically one of two varieties of Camellia sinensis – grown, harvested, and prepared for consumption in various south-east Asian countries for millennia. In the late-twentieth and early twenty-first centuries, historians with interests across the fields of trade, botany, and cultural studies have become increasingly interested in this remarkable transition, and the perspectives it affords on global histories of labour, imperialism, mechanisation, consumption, production, and transculturation (to name but a few). Tea’s foodways are both ancient, associated with cultural practices and origin stories found across the countries in which it flourished as a component of local flora, and profoundly modern. It is the ultimate convenience product, mass-produced and packaged in the form of cheap tea bags and bottles of the soft drink known as ‘ice tea’. Indeed, as a product of international trade since the mid-sixteenth century, tea has shown an astonishing ability to transform and redefine itself. European travellers to China and Japan first encountered tea in the mid-sixteenth century, and it was probably first imported into Europe in small quantities at around this time by Portuguese traders active in the area around Macau. In Britain, where drinking tea became recognised as a domesticated component of national behaviour by the early-nineteenth century, tea was first advertised for sale in the late 1650s. Across the eighteenth century, it increasingly became the focal point of the lucrative European ‘East India’ trade, and its taxation as an article of consumption encouraged the formation of violent smuggling networks.
During the same period, a regular overland ‘tea road’ was established between China and Russia, a caravan trade that was to persist until the mid-nineteenth century. In the colonies of North America, tea became in the 1770s a focal point of the movement for independence, culminating in a series of protests remembered in national mythology as ‘the Boston Tea Party’. The increasingly widespread practice of drinking tea with sugar also connected British consumption with the sugar plantations of America and the Caribbean, where sugar was grown and harvested by enslaved people from Africa and elsewhere. British nineteenth-century dominance of the trade, together with Britain’s imperial ambitions in India and beyond, led to the establishment of a tea monoculture over vast tracts of land in India, Sri Lanka, and various African countries such as Malawi, Kenya, and Zimbabwe, which was further extended in the colonial infrastructure of other European nations. The emergence of these tea plantations spurred both the development of intensive practices of farming and mechanisation (which were to shape global tea production in the twentieth century), and the consolidation of ownership and production by multi-national corporations which continue to dominate the tea trade into the twenty-first century.


The History of Cookbooks  

Henry Notaker

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] The history of cookbooks describes the development of an old literary genre that has seen explosive growth since the last part of the twentieth century. Cookbooks are primarily collections of culinary recipes, written instructions often based on earlier oral communication. Cookbooks may be handwritten, printed, or digitized in various forms on the internet. Most interest has been given to printed cookbooks, first published in Italy, France, and Germany in the fifteenth century and later spread globally. These books may build on local traditions, but many of them are translations from foreign languages, adapting advanced technology to local cuisine. The cookbook belongs to the handbook genre within nonfiction literature and has certain characteristics in composition, structure, literary style, format, typeface, design, and illustrations, features of interest to students of book history, bibliography, and literature. The authors of the earliest printed books were men, many of them the printers or booksellers who published the books, but women gradually took over in northern Europe and the United States from the eighteenth century, and in southern Europe only in the twentieth century. Most cookbooks include recipes for all sorts of culinary products, but there are also special books on one particular foodstuff, one particular type of dish, or special diets such as vegetarian, vegan, paleolithic, kosher, and halal. Cookbooks are important sources for the development of culinary traditions but also for any historical study. Apart from the practical instructions, cookbooks contain statements and references to social status, health, local produce, manners and customs, religion, taste, and aesthetics.


Indian Ocean Trade  

Jeremy A. Simmons

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] The Indian Ocean, with its annual monsoons, has served as an arena for human movement and the conveyance of distinct foodstuffs for millennia. Climatological factors give rise to numerous distinct hydroclimates, ranging from the Nile River Valley to the Yemeni highlands and the vast watershed of the Himalayas. These environments supported the domestication, cultivation, and redistribution of multiple staples, including wheat and rice. Spices, alongside aromatics and organic dyes, were among the most popular items of trade, for example, black peppercorns, cardamom, ginger, cinnamon, and cassia. Littoral populations throughout the Indian Ocean world (often derided as ichthyophages, or “fish-eaters”) relied on local fisheries for their own sustenance and commodities of trade—not only the daily catch of fishermen but also corals and pearls collected by divers and foragers. The early modern period heralded a distinct change. For one, foodways radically changed with the introduction of plant species from the Americas and the formation of a wider “Indo-Atlantic” world. Although the acquisition of spices motivated Europeans to establish colonial footholds throughout the Afro-Asian world, increased interest in and (forced) cultivation of stimulants emerged to sustain the changing tastes of consumers, for example, sugar, coffee, and tea.


Living Fermented Foods and Drinks  

James Read

Fermentation is the process by which microbes transform ingredients into a palatable product or ferment. Its customary uses are food preservation (as in sauerkraut) and alcohol production (as in wine or beer), though it is also highly regarded for flavor enhancement and health benefits. Research into fermentation is multidisciplinary, covering fields ranging from history and anthropology to microbiology and nutrition. Fermentation has been intentionally employed as a preparation technique through which microbes and humans have domesticated each other for at least 13,000 years, across cultures spanning from Japan (miso) to Mexico (tepache). It is central to many foods and drinks, but when referring to “ferments” in a culinary context, most people do not mean bread, beer, or olives but, rather, the likes of kimchi, kombucha, and kefir. These could broadly be termed “living ferments,” as they have at least the potential to contain live and active microbes when they arrive on our plate. This quality of vitality is useful not only for classification but also as an indication of how they are made (there is no pasteurization or dehydration step in producing the final product). To categorize ferments further, they can be grouped as vegetables (such as kimchi, sauerkraut, pao cai, and gundruk), no/low-alcohol drinks (such as tepache, tejuino, atole agrio, kombucha, and juniper beer), dairy (such as yogurt, kefir, dahi, amasi, and tätmjölk), and soybeans (miso, soy sauce, douchi, koji, doubanjiang, gochujang, tempeh, natto, and meju). Ferments can be made either by inoculation with a starter culture or by spontaneous (or wild) fermentation. In either case, competition and collaboration within the microbial community result in different species occupying their own niches, some of which are crafted for them through fermentation techniques and some of which the microbes develop for themselves.
This adaptation is mirrored on a coevolutionary scale, as humans and microbes have made homes for and of each other.


Mezze and the Lebanese Table  

Aïda Kanafani-Zahar

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article. The study of mezze in Lebanon is based on anthropological fieldwork conducted in rural and urban contexts, in the mountains and along the littoral. It explored the foundations of this tradition and prompted the analysis of Lebanese cuisine. One of the most appreciated culinary traditions for the Lebanese people is to partake in a restaurant-prepared meal that includes an assortment of appetisers, grilled meats or the day’s special, and dessert. Mezze is generally understood as the diversity of hot and cold vegetable and meat dishes served along with the main dish. In a particular type of meal, the gourmet meal, however, mezze is the prelude to the main dish, generally consisting of grilled meats followed by fruits and sweets, the three courses unfolding in a continuous and fluid movement. Enjoyed in renowned establishments to celebrate festive events or simply to share an agreeable moment with family and friends, the gourmet meal is intended to evoke a particular emotion, kaif. Mezze is composed of a large number of starters brought to the table in groups of dishes appearing one after the other in a precise order. Each group displays distinct ingredients, flavours, modes of cooking, and temperatures. Guests indulge in the dishes as they come along. The wide range of starters shapes a specific time sequence and devises a pattern of eating, that of “tasting”, of “savouring”. The study of this sensorial complexity, regarding its sequencing, scenography, and gustatory structure, reveals the representations of appetence in Lebanese culinary culture. It provides insight into the visual requirements that a gourmet table must display and the flavour hierarchy aimed at kindling appetite and at preserving it throughout the meal.
Further, it draws attention to the categories of foods that require precise techniques to neutralise odours that cause inappetence. These elements offer the opportunity to examine the culinary sensibility expressed in Lebanese culture. Notwithstanding differences in organisation, time span, table manners, and purpose, the gourmet meal, in which mezze is embedded, follows the same logic as the ordinary daily table: to stimulate and maintain the desire to eat. Both types of meals epitomise a sense-sustained culture of appetence designed to avert ill eating and sickness, a realm of research that merits investigation.


Natural Food  

Michael Kideckel

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] The subject of scholarly research, marketing lawsuits, and governmental hearings, the question of what makes food natural or unnatural has spawned controversy for generations. Intertwined with what people choose to buy and eat, “natural food” is something many people desire but all struggle to define. The term’s evolution reflects key connections between food and morality, alongside developments in food safety, regulation, media and advertising, and politics. The definition of natural food at any given moment illuminates much about the values, trends, and political and economic dynamics of that period.


Proteins and Meat  

Laura-Elena Keck

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article. Proteins held a special position in 19th-century nutritional debates: “Discovered” in the mid-1800s, they were thought to be the source of muscle energy and seemed to offer solutions to many of the problems and challenges of the industrial age. Meat, in particular, was praised by doctors and nutrition experts as a protein-rich food that could help to combat malnutrition, shape more efficient human bodies, and enhance industrial and military performance—serving not only the needs of the individual but also the interests of the nation state. Meat consumption was also associated with normative concepts of masculinity and “civilization” and was frequently cited as a reason—and justification—for colonialism. In this new equation, meat equaled proteins equaled strength. This attitude slowly started to change in the early 1900s: A growing number of nutrition experts joined early critics—vegetarians and nutrition reformers—in advocating the use of more “efficient” protein and energy sources, while excessive meat consumption increasingly came to be identified as a risk factor for disease. Nevertheless, today we can see many of the 19th-century preoccupations with proteins, meat, “civilization,” and masculinity lingering or re-emerging in dietary trends like the “paleo diet,” protein-enriched “functional foods,” or books written by vegan bodybuilders. Transcending nutritional debates, these phenomena are symptomatic of broader attitudes toward eating, health, society, and the human body.


Spices in the Ancient World  

Matthew Adam Cobb

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] The movement and consumption of spices and aromatics have been features of human history for many millennia. They have been found in contexts ranging from early Iron Age Phoenician flasks containing traces of cinnamon to black peppercorns inserted into the nose of the mummified Ramses II. Traditionally, these plant (and sometimes animal and mineral) products have been viewed as the preserve of the elite, at least in the Mediterranean world and parts of Europe, where many of them do not naturally grow. However, by the early centuries CE, thanks to a growing web of connections spanning Afro-Eurasia, especially via the Indian Ocean, a much wider range of peoples had the opportunity to experience spices. This affected everything from how their food tasted and smelled to the performance of religious rituals. Advances in archaeobotany and the archaeological sciences are allowing us to build a more complex picture of the contexts in which spice consumption took place, the social paraphernalia associated with it, and the diversity of people involved. Moreover, these methods and bodies of data are contributing to the identification of the spices and aromatics that were being consumed, adding detail to the sometimes-hazy picture provided by ancient authors.


The Tasting Menu  

Alison Pearlman

[This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Food Studies. Please check back later for the full article.] A tasting menu is a set multi-course meal that a restaurant offers for a single price (also called prix fixe). Diners surrender choice over what they eat and how many courses they are served. Restaurants throughout the world, at various price points, with casual or formal service, in lengthy or concise lists, offer tasting menus. But this pervasiveness and variety did not always exist; the tasting menu underwent a process of internationalization and diversification. The form originated among an ambitious generation of French fine-dining chefs, the instigators of nouvelle cuisine. While visiting Japan in the 1960s, these chefs encountered the centuries-old tradition of formal kaiseki meals. They subsequently adapted the format, calling the result a menu dégustation, or tasting menu. It borrowed kaiseki’s extraordinary artfulness and its set, multi-course structure yet emphasized radical departure from tradition beyond the improvisation and invention kaiseki allowed. The international fame and influence of the nouvelle chefs, assisted by a growth spurt in culinary media, established the tasting menu as the ultimate proving ground for the chef as a creative individual. Subsequent changes in the styles and locations of tasting menus derive primarily from shifts in how chefs achieve professional distinction, the dynamics of culinary media, the growth of gastronomic consumption, and fashions in aesthetics and ethics.