Kathleen A. Brosnan and Jacob Blackwell
Throughout history, the need for food has bound humans to nature. The transition to agriculture constituted a slow but revolutionary ecological transformation. After 1500
Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.
Michael C. C. Adams
On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. The armed services and war work absorbed the unemployed. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history.
Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the work force, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.
Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have moved from independent companies to divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films.
Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed for the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media. Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.
Helen Zoe Veit
The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease.
American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to food shop less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change probably exaggerated the transformations for the many Americans who experienced them.
The story of mass culture from 1900 to 1945 is the story of its growth and increasing centrality to American life. Sparked by the development of such new media as radios, phonographs, and cinema that required less literacy and formal education, and the commodification of leisure pursuits, mass culture extended its purview to nearly the entire nation by the end of the Second World War. In the process, it became one way in which immigrant and second-generation Americans could learn about the United States and stake a claim to participation in civic and social life. Mass culture characteristically consisted of artifacts that stressed pleasure, sensation, and glamor rather than, as had previously been the case, eternal and ethereal beauty, moral propriety, and personal transcendence. It had the power to determine acceptable values and beliefs and define qualities and characteristics of social groups. The constant and graphic stimulation that mass culture provided led many custodians of culture to worry about a breakdown in social morality that would surely follow. As a result, they formed regulatory agencies and watchdogs to monitor the mass culture available on the market. Other critics charged the regime of mass culture with inducing homogenization of belief and practice and contributing to passive acceptance of the status quo. The spread of mass culture did not terminate regional, class, or racial cultures; indeed, mass culture artifacts often borrowed them. Nor did marginalized groups accept stereotypical portrayals; rather, they worked to expand the possibilities of prevailing ones and to provide alternatives.
Michael A. Krysko
Radio debuted as a wireless alternative to telegraphy in the late 19th century. At its inception, wireless technology could only transmit signals and was incapable of broadcasting actual voices. During the 1920s, however, it transformed into a medium primarily identified with entertainment and informational broadcasting. The commercialization of American broadcasting, which included the establishment of national networks and reliance on advertising to generate revenue, became the so-called American system of broadcasting. This transformation demonstrates how technology is shaped by the dynamic forces of the society in which it is embedded. Broadcasting’s aural attributes also engaged listeners in a way that distinguished it from other forms of mass media. Cognitive processes triggered by the disembodied voices and sounds emanating from radio’s loudspeakers illustrate how listeners, grounded in particular social, cultural, economic, and political contexts, made sense of and understood the content with which they were engaged. Through the 1940s, difficulties in expanding the international radio presence of the United States further highlight the significance of surrounding contexts in shaping the technology and in promoting (or discouraging) listener engagement with programming content.
Mark S. Massa, S.J.
Historian John Higham once referred to anti-Catholicism as “by far the oldest, and the most powerful of anti-foreign traditions” in North American intellectual and cultural history. But Higham’s famous observation actually elided three different types of anti-Catholic nativism that have enjoyed a long and quite vibrant life in North America: a cultural distrust of Catholics, based on an understanding of North American public culture rooted in a profoundly British and Protestant ordering of human society; an intellectual distrust of Catholics, based on a set of epistemological and philosophical ideas first elucidated in the English (Lockean) and Scottish (“Common Sense Realist”) Enlightenments and the British Whig tradition of political thought; and a nativist distrust of Catholics as deviant members of American society, a perception central to the Protestant mainstream’s duty of “boundary maintenance” (to utilize Emile Durkheim’s reading of how “outsiders” help “insiders” maintain social control).
An examination of the long history of anti-Catholicism in the United States can be divided into three parts: first, an overview of the types of anti-Catholic animus utilizing the typology adumbrated above; second, a narrative history of the most important anti-Catholic events in U.S. culture (e.g., Harvard’s Dudleian Lectures, the Suffolk Resolves, the burning of the Charlestown convent, Maria Monk’s Awful Disclosures); and finally, a discussion of American Catholic efforts to address the animus.
Akram Fouad Khater
Between 1880 and 1940, more than 130,000 Arabs immigrated to the United States as part of the Great Migration of the long 19th century. They lived and worked across the breadth of the United States, fought its many wars, and were engaged in the transformative debates about labor, race, gender, and citizenship that raged throughout this time period. As they struggled to carve out a place in “Amirka” they encountered and fought efforts to racialize them as the uncivilized and undesirable “Other.” Their struggles not only contributed to shaping the United States and its immigration policies, but also confronted them with the conundrum of how to belong: to accept and seek admission into the existing system delineated by race, gender, and class, or to challenge the premises of that system. While there was not a singular response from this diverse community, the majority opted to fight for a place in “white” America even if in return this rendered them a liminal ethnicity.
The Eaton sisters, Edith Maude (1865–1914) and Winnifred (1875–1954), were biracial authors who wrote under their respective pseudonyms, Sui Sin Far and Onoto Watanna. Raised in Montreal, Canada, by an English father and a Chinese mother, the sisters produced works that many scholars have recognized as among the first published by Asian American writers. Edith embraced her Chinese ancestry by composing newspaper articles and short stories that addressed the plight of Chinese immigrants in North America. Winnifred, on the other hand, posed as a Japanese woman and eclipsed her older sibling in popularity by writing interracial romances set in Japan.
The significance of the Eaton sisters emerges from a distinct moment in American history. At the turn of the 20th century, the United States began asserting an imperial presence in Asia and the Caribbean, while waves of immigrants entered the nation as valued industrial labor. This dual movement of overseas expansion and incoming foreign populations gave rise to a sense of superiority and anxiety within the white American mainstream. Even as U.S. statesmen and missionaries sought to extend democracy, Christianity, and trade relations abroad, they also doubted that people who came to America could assimilate themselves according to the tenets of a liberal white Protestantism. This concern became evident with the passage of the Chinese Exclusion Act (1882) and the Gentlemen’s Agreement (1907), measures that thwarted Chinese and Japanese immigration efforts. The lives and writings of the Eaton sisters intersected with these broader developments. As mixed-race authors, they catered to a growing U.S. consumer interest in things Asian, acting as cultural interpreters between East and West. In doing so, however, they complicated and challenged American beliefs and attitudes about race relations, gender roles, and empire building.
Shelley Sang-Hee Lee
Although the 1992 Los Angeles riots have been described as a “race riot” sparked by the acquittals of a group of mostly white police officers charged with excessively beating black motorist Rodney King, the widespread targeting and destruction of Asian-owned (mainly Korean) property in and around South Central Los Angeles stands out as one of the most striking aspects of the uprising. For all the commentary generated about the state of black-white relations, African American youths, and the decline of America’s inner cities, the riots also gave many Americans their first awareness of the presence of a Korean immigrant population in Southern California, a large number of Korean shop owners, and the existence of what was commonly framed as the “black-Korean conflict.” For Korean Americans, and Asian Americans more generally, the Los Angeles riots represented a shattered “American dream” and brought focus to their tenuous hold on economic mobility and social inclusion in a society fraught by racial and ethnic tension. The riots furthermore marked a turning point that placed Asian immigrants and Asian Americans at the center of new conversations about social relations in a multiracial America, the place of new immigrants, and the responsibilities of relatively privileged minorities toward the less privileged.
Although Americans have adopted and continue to adopt children from all over the world, Asian minors have immigrated and joined American families in the greatest numbers and most shaped our collective understanding of the process and experiences of adoption. The movement and integration of infants and youths from Japan, the Philippines, India, Vietnam, Korea, and China (the most common sending nations in the region) since the 1940s have not only altered the composition and conception of the American family but also reflected and reinforced the complexities of U.S. relations with and actions in Asia. In tracing the history of Asian international adoption, we can uncover shifting ideas of race and national belonging. The subject enriches the fields of Asian American and immigration history.
Maxine Leeds Craig
Black beauty culture developed in the context of widespread disparagement of black men and women in images produced by whites, and black women’s exclusion from mainstream cultural institutions, such as beauty contests, which defined beauty standards on a national scale. Though mainstream media rarely represented black women as beautiful, black women’s beauty was valued within black communities. Moreover many black women used cosmetics, hair products and styling, and clothing to meet their communities’ standards for feminine appearance. At the beginning of the 20th century, the black press, which included newspapers, general magazines, and women’s magazines, showcased the beauty of black women. As early as the 1890s, black communities organized beauty contests that celebrated black women’s beauty and served as fora for debating definitions of black beauty. Still, generally, but not always, the black press and black women’s beauty pageants favored women with lighter skin tones, and many cosmetics firms that marketed to black women sold skin lighteners. The favoring of light skin was nonetheless debated and contested within black communities, especially during periods of heightened black political activism. In the 1910s and 1920s and later in the 1960s and 1970s, social movements fostered critiques of black aesthetics and beauty practices deemed Eurocentric. One focus of criticism was the widespread black practice of hair straightening—a critique that has produced an enduring association between hairstyles perceived as natural and racial pride. In the last decades of the 20th century and the beginning of the 21st, African migration and the transnational dissemination of information via the internet contributed to a creative proliferation of African American hairstyles. While such styles display hair textures associated with African American hair, and are celebrated as natural hairstyles, they generally require the use of hair products and may incorporate synthetic hair extensions.
Beauty culture provided an important vehicle for African American entrepreneurship at a time when racial discrimination barred black women from other opportunities and most national cosmetics companies ignored black women. Black women’s beauty-culture business activities included beauticians who provided hair care in home settings and the extremely successful nationwide and international brand of hair- and skin-care products developed in the first two decades of the 20th century by Madam C. J. Walker. Hair-care shops provided important places for sharing information and community organizing. By the end of the 20th century, a few black-owned hair-care and cosmetics companies achieved broad markets and substantial profitability, but most declined or disappeared as they faced increased competition from or were purchased by larger white-owned corporations.
Buddhist history in the United States traces to the mid-19th century, when early scholars and spiritual pioneers first introduced the subject to Americans, followed soon by the arrival of Chinese immigrants to the West Coast. Interest in Buddhism was significant during the late Victorian era, but practice was almost completely confined to Asian immigrants, who faced severe white prejudice and legal discrimination. The Japanese were the first to establish robust, long-lasting temple networks, though they, too, faced persecution, culminating in the 1942 incarceration of 120,000 Japanese Americans, a severe blow to American Buddhism. Outside the Japanese American community, Buddhism grew slowly in the earlier decades of the 20th century, but it began to take off in the 1960s, aided soon by the lifting of onerous immigration laws and the return of large-scale Asian immigration. By the end of the 20th century American Buddhism had become extremely diverse and complex, with clear evidence of permanence in Asian American and other communities.
The history of Calvinism in the United States is part of a much larger development, the globalization of western Christianity. American Calvinism owes its existence to the transplanting of European churches and religious institutions to North America, a process that began in the 16th century, first with Spanish and French Roman Catholics, and accelerated a century later when Dutch, English, Scottish, and German colonists and immigrants of diverse Protestant backgrounds settled in the New World. The initial variety of Calvinists in North America resulted from the different circumstances under which Protestantism emerged in Europe as a rival to the Roman Catholic Church, from the diverse civil governments that supported established Protestant churches, and from the various business sponsors that included the Christian ministry as part of imperial or colonial designs.
Once the British dominated the Eastern seaboard (roughly 1675), and after English colonists successfully fought for political independence (1783), Calvinism lost its variety. Beyond their separate denominations, English-speaking Protestants (whether English, Scottish, or Irish) created a plethora of interdenominational religious agencies for the purpose of establishing a Christian presence in an expanding American society. For these Calvinists, being Protestant went hand in hand with loyalty to the United States. Outside this pan-Protestant network of Anglo-American churches and religious institutions were ethnic-based Calvinist denominations caught between Old World ways of being Christian and American patterns of religious life. Over time, most Calvinist groups adapted to national norms, while some retained institutional autonomy for fear of compromising their faith.
Since 1970, when the United States entered an era sometimes called post-Protestant, Calvinist churches and institutions have either declined or become stagnant. But in certain academic, literary, and popular culture settings, Calvinism has for some Americans, whether connected or not to Calvinist churches, continued to be a source for sober reflection on human existence and earnest belief and religious practice.
The Catholic Church has been a presence in the United States since the arrival of French and Spanish missionaries in the 16th and 17th centuries. The Spanish established a number of missions in what is now the western part of the United States; the most important French colony was New Orleans. Although they were a minority in the thirteen British colonies prior to the American Revolution, Catholics found ways to participate in communal forms of worship when no priest was available to celebrate Mass. John Carroll was appointed superior of the Mission of the United States of America in 1785. Four years later, Carroll was elected the first bishop in the United States; his diocese encompassed the entire country. The Catholic population of the United States began to grow during the first half of the 19th century primarily due to Irish and German immigration. Protestant America was often critical of the newcomers, believing one could not be a good Catholic and a good American at the same time. By 1850, Roman Catholicism was the largest denomination in the United States.
The number of Catholics arriving in the United States declined during the Civil War but began to increase after the cessation of hostilities. Catholic immigrants during the late 19th and early 20th centuries were primarily from southern and Eastern Europe, and they were not often welcomed by a church that was dominated by Irish and Irish American leaders. At the same time that the church was expanding its network of parishes, schools, and hospitals to meet the physical and spiritual needs of the new immigrants, other Catholics were determining how their church could speak to issues of social and economic justice. Dorothy Day, Father Charles Coughlin, and Monsignor John A. Ryan are three examples of practicing Catholics who believed that the principles of Catholicism could help to solve problems related to international relations, poverty, nuclear weapons, and the struggle between labor and capital.
In addition to changes resulting from suburbanization, the Second Vatican Council transformed Catholicism in the United States. Catholics experienced other changes as a decrease in the number of men and women entering religious life led to fewer priests and sisters staffing parochial schools and parishes. In the early decades of the 21st century, the church in the United States was trying to recover from the sexual abuse crisis. Visiting America in 2015, Pope Francis reminded Catholics of the important teachings of the church regarding poverty, justice, and climate change. It remains to be seen what impact his papacy will have on the future of Catholicism in the United States.
John D. Fairfield
The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at the World’s Columbian Exposition in Chicago in 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession. Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.
Contagious diseases have long posed a public health challenge for cities, going back to the ancient world. Diseases traveled over trade routes from one city to another. Cities were also crowded and often dirty, ideal conditions for the transmission of infectious disease. The Europeans who settled North America quickly established cities, especially seaports, and contagious diseases soon followed. By the late 17th century, ports like Boston, New York, and Philadelphia experienced occasional epidemics, especially smallpox and yellow fever, usually introduced from incoming ships. Public health officials tried to prevent contagious diseases from entering the ports, most often by establishing a quarantine. These quarantines were occasionally effective, but more often the disease escaped into the cities. By the 18th century, city officials recognized an association between dirty cities and epidemic diseases. The appearance of a contagious disease usually occasioned a concerted effort to clean streets and remove garbage. By the early 19th century these efforts gave rise to sanitary reform to prevent infectious diseases. Sanitary reform went beyond cleaning streets and removing garbage, to ensuring clean water supplies and effective sewage removal. By the end of the century, sanitary reform had done much to clean the cities and reduce the incidence of contagious disease. In the 20th century, public health programs gained two new tools: vaccination and antibiotics. Vaccination, first used against smallpox, was developed against numerous other infectious viral diseases and reduced their incidence substantially. Finally, the development of antibiotics against bacterial infections in the mid-20th century enabled physicians to cure infected individuals. Contagious disease remains a problem—witness AIDS—and public health authorities still rely on quarantine, sanitary reform, vaccination, and antibiotics to keep urban populations healthy.
Chloe E. Taft
The process of urban deindustrialization has been long and uneven. Even the terms “deindustrial” and “postindustrial” are contested; most cities continue to host manufacturing on some scale. After World War II, however, cities that depended on manufacturing for their lifeblood increasingly diversified their economies in the face of larger global, political, and demographic transformations. Manufacturing centers in New England, the Mid-Atlantic, and the Midwest were soon identified as belonging to “the American Rust Belt.” Steel manufacturers, automakers, and other industrial behemoths that were once mainstays of city life closed their doors as factories and workers followed economic and social incentives to leave urban cores for the suburbs, the South, or foreign countries. Remaining industrial production became increasingly automated, resulting in significant declines in the number of factory jobs. Metropolitan officials faced with declining populations and tax bases responded by adapting their assets—in terms of workforce, location, or culture—to new economies, including warehousing and distribution, finance, health care, tourism, leisure industries like casinos, and privatized enterprises such as prisons. Faced with declining federal funding for renewal, they focused on leveraging private investment for redevelopment. Deindustrializing cities marketed themselves as destinations with convention centers, stadiums, and festival marketplaces, seeking to lure visitors and a “creative class” of new residents. While some postindustrial cities became success stories of reinvention, others struggled. They entertained options to “rightsize” by shrinking their municipal footprints, adapted vacant lots for urban agriculture, or attracted voyeurs to gaze at their industrial ruins. Whether industrial cities faced a slow transformation or the shock of multiple factory closures within a few years, the impact of these economic shifts and urban planning interventions both amplified old inequalities and created new ones.
The use of illicit drugs in US cities led to the development of important subcultures with shared practices, codes, discourses, and values. From the 19th century onward, American city dwellers have indulged in opiates, cocaine, amphetamines, cannabis, lysergic acid diethylamide (LSD), crack, and 3,4-methylenedioxymethamphetamine (also known as MDMA or ecstasy). The population density of metropolitan America contributed to the spread of substance use and the rise of communities that centered their lives on drug consumption. In the history of urban drug use, opiates have outlasted all the other drugs and have naturally attracted the bulk of scholarly attention.
The nature and identity of these illicit subcultures usually depended on the pharmacology of the drugs and the setting in which they were used. Addictive substances like heroin and amphetamines certainly led to the rise of crime in certain urban areas, but by the same token many urban Americans managed to integrate their addiction into their everyday lives. The more complex pharmacology of psychedelic drugs like LSD in turn gave birth to rich subcultures that resist easy classification. Most drugs began their careers as medical marvels that were accepted as products of modernity and often used by the middle class or medical practitioners. Prejudices of race, age, and class, together with the association of drugs with visible subcultures perceived to threaten the moral fabric of society, partly explain their subsequent bans.