Kathleen A. Brosnan and Jacob Blackwell
Throughout history, food needs have bonded humans to nature. The transition to agriculture constituted a slow but revolutionary ecological transformation. After 1500
Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.
Michael C. C. Adams
On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. Unemployment was absorbed by the armed services and war work. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history.
Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the work force, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.
The first forty years of cinema in the United States, from the development and commercialization of modern motion picture technology in the mid-1890s to the full blossoming of sound-era Hollywood during the early 1930s, represents one of the most consequential periods in the history of the medium. It was a time of tremendous artistic and economic transformation, including but not limited to the storied transition from silent motion pictures to “the talkies” in the late 1920s.
Though the nomenclature of the silent era implies a relatively unified period in film history, the years before the transition to sound saw a succession of important changes in film artistry and its means of production, and film historians generally regard the epoch as divided into at least three distinct periods. During the period of early cinema, which lasted about a decade from the medium’s emergence in the mid-1890s through the middle years of the new century’s first decade, motion pictures existed primarily as a novelty amusement presented in vaudeville theatres and carnival fairgrounds. Film historians Tom Gunning and André Gaudreault have famously defined the aesthetic of this period as a “cinema of attractions,” in which the technology of recording and reproducing the world, along with the new ways in which it could frame, orient, and manipulate time and space, marked the primary concerns of the medium’s artists and spectators.
A transitional period followed from around 1907 to the later 1910s when changes in the distribution model for motion pictures enabled the development of purpose-built exhibition halls and led to a marked increase in demand for the entertainment. On a formal and artistic level, the period saw a rise in the prominence of the story film and widespread experimentation with new techniques of cinematography and editing, many of which would become foundational to later cinematic style. The era also witnessed the introduction of feature-length films and their growing prominence over narrative shorts. The production side was marked by intensifying competition between the original American motion picture studios based in and around New York City, several of which attempted to cement their influence by forming an oligopolistic trust, and a number of upstart “independent” West Coast studios located around Los Angeles.
Both the artistic and production trends of the transitional period came to a head during the classical era that followed, when the visual experimentation of the previous years consolidated into the “classical style” favored by the major studios, and the competition between East Coast and West Coast studios resolved definitively in favor of the latter. This was the era of Hollywood’s ascendance over domestic filmmaking in the United States and its growing influence over worldwide film markets, due in part to the decimation of the European film industry during World War I. After nearly a decade of dominance, the Hollywood studio system was so refined that the advent of marketable synchronized sound technology around 1927 produced relatively few upheavals among the coterie of top studios. Rather, the American film industry managed to reorient itself around the production of talking motion pictures so swiftly that silent film production in the United States had effectively ceased at any appreciable scale by 1929.
Artistically, the early years of “the talkies” proved challenging, as filmmakers struggled with the imperfections of early recording technology and the limitations they imposed on filmmaking practice. But filmgoing remained popular in the United States even during the depths of the Great Depression, and by the early 1930s a combination of improved technology and artistic adaptation led to such a marked increase in quality that many film historians regard the period as the beginning of Hollywood’s Golden Era. With a new voluntary production code put in place to respond to criticism of immorality in Hollywood fare, the American film industry was poised by the early 1930s to solidify its prominent position in American cultural life.
Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have moved from independent companies to divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films.
Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed for the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media. Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.
Helen Zoe Veit
The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease.
American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing exploded, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to shop for food less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change probably magnified the sense of transformation for the many Americans who experienced it.
The story of mass culture from 1900 to 1945 is the story of its growth and increasing centrality to American life. Sparked by the development of new media such as radio, the phonograph, and cinema, which required less literacy and formal education, and by the commodification of leisure pursuits, mass culture extended its purview to nearly the entire nation by the end of the Second World War. In the process, it became one way in which immigrant and second-generation Americans could learn about the United States and stake a claim to participation in civic and social life. Mass culture characteristically consisted of artifacts that stressed pleasure, sensation, and glamor rather than, as had previously been the case, eternal and ethereal beauty, moral propriety, and personal transcendence. It had the power to determine acceptable values and beliefs and to define the qualities and characteristics of social groups. Its constant and graphic stimulation led many custodians of culture to worry about the kinds of stimulation mass culture provided and about the breakdown in social morality they believed would surely follow. As a result, they formed regulatory agencies and watchdog groups to monitor the mass culture available on the market. Other critics charged the regime of mass culture with inducing homogenization of belief and practice and contributing to passive acceptance of the status quo. The spread of mass culture did not terminate regional, class, or racial cultures; indeed, mass culture artifacts often borrowed from them. Nor did marginalized groups accept stereotypical portrayals; rather, they worked to expand the possibilities of prevailing representations and to provide alternatives.
David L. Hostetter
American activists who challenged South African apartheid during the Cold War era extended their opposition to racial discrimination in the United States into world politics. US antiapartheid organizations worked in solidarity with forces struggling against the racist regime in South Africa and played a significant role in the global antiapartheid movement. More than four decades of organizing preceded the legislative showdown of 1986, when a bipartisan coalition in Congress overrode President Ronald Reagan’s veto to enact economic sanctions against the apartheid regime in South Africa. Adoption of sanctions by the United States, along with transnational solidarity with the resistance to apartheid by South Africans, helped prompt the apartheid regime to relinquish power and allow the democratic elections that brought Nelson Mandela and the African National Congress to power in 1994.
Drawing on the tactics, strategies, and moral authority of the civil rights movement, antiapartheid campaigners mobilized public opinion while increasing African American influence in the formulation of US foreign policy. Long-lasting organizations such as the American Committee on Africa and TransAfrica called for boycotts and divestment while lobbying for economic sanctions. Utilizing tactics such as rallies, demonstrations, and nonviolent civil disobedience, antiapartheid activists made their voices heard on college campuses, in corporate boardrooms, in municipal and state governments, and in the halls of Congress. Cultural expressions of criticism and resistance served to reinforce public sentiment against apartheid. Novels, plays, movies, and music provided a way for Americans to connect to the struggles of those suffering under apartheid.
By extending the moral logic of the movement for African American civil rights, American anti-apartheid activists created a multicultural coalition that brought about institutional and governmental divestment from apartheid, prompted Congress to impose economic sanctions on South Africa, and increased the influence of African Americans regarding issues of race and American foreign policy.
Michael A. Krysko
Radio debuted as a wireless alternative to telegraphy in the late 19th century. At its inception, wireless technology could only transmit signals and was incapable of broadcasting actual voices. During the 1920s, however, it transformed into a medium primarily identified as one used for entertainment and informational broadcasting. The commercialization of American broadcasting, which included the establishment of national networks and reliance on advertising to generate revenue, became the so-called American system of broadcasting. This transformation demonstrates how technology is shaped by the dynamic forces of the society in which it is embedded. Broadcasting’s aural attributes also engaged listeners in a way that distinguished it from other forms of mass media. Cognitive processes triggered by the disembodied voices and sounds emanating from radio’s loudspeakers illustrate how listeners, grounded in particular social, cultural, economic, and political contexts, made sense of and understood the content with which they were engaged. Through the 1940s, difficulties in expanding the international radio presence of the United States further highlight the significance of surrounding contexts in shaping the technology and in promoting (or discouraging) listener engagement with programming content.
Utopia—the term derived from Thomas More’s 1516 volume by that name—always suggested a place that was both non-existent, a product of the imagination usually depicted fictionally as far distant in time or space, and better than the real and familiar world. In modern times, it has served as a mode of anti-capitalist critique and also, despite its supposed “unreality,” as a disposition joined to actual social movements for dramatic reform. Utopian alternatives to American capitalism, both in the sense of literary works projecting visions of ideal social relations and in real efforts to establish viable communitarian settlements, have long been a significant part of the nation’s cultural and political history. In the 1840s, American followers of the French “utopian socialist” Charles Fourier established dozens of communities based at least in part on Fourier’s principles, and those principles filtered down to the world’s most influential modern utopian novel, Edward Bellamy’s Looking Backward of 1888. Utopian community-building and the writing of anti-capitalist utopian texts surged and declined in successive waves from the 19th to the 21st century, and while the recent surges have never equaled the impact of Fourierism or Bellamy, the appeal of the utopian imagination has surfaced again since the Great Recession of 2008 provoked new doubts about the viability or justice of capitalist economic and social relations.
Mark S. Massa, S.J.
Historian John Higham once referred to anti-Catholicism as “by far the oldest, and the most powerful of anti-foreign traditions” in North American intellectual and cultural history. But Higham’s famous observation actually elided three different types of anti-Catholic nativism that have enjoyed a long and quite vibrant life in North America: a cultural distrust of Catholics, based on an understanding of North American public culture rooted in a profoundly British and Protestant ordering of human society; an intellectual distrust of Catholics, based on a set of epistemological and philosophical ideas first elucidated in the English (Lockean) and Scottish (“Common Sense Realist”) Enlightenments and the British Whig tradition of political thought; and a nativist distrust of Catholics as deviant members of American society, a perception central to the Protestant mainstream’s duty of “boundary maintenance” (to utilize Emile Durkheim’s reading of how “outsiders” help “insiders” maintain social control).
An examination of the long history of anti-Catholicism in the United States can be divided into three parts: first, an overview of the types of anti-Catholic animus utilizing the typology adumbrated above; second, a narrative history of the most important anti-Catholic events in U.S. culture (e.g., Harvard’s Dudleian Lectures, the Suffolk Resolves, the burning of the Charlestown convent, Maria Monk’s Awful Disclosures); and finally, a discussion of American Catholic efforts to address the animus.
The Eaton sisters, Edith Maude (1865–1914) and Winnifred (1875–1954), were biracial authors who wrote under their respective pseudonyms, Sui Sin Far and Onoto Watanna. Raised in Montreal, Canada, by an English father and a Chinese mother, the sisters produced works that many scholars have recognized as among the first published by Asian American writers. Edith embraced her Chinese ancestry by composing newspaper articles and short stories that addressed the plight of Chinese immigrants in North America. Winnifred, on the other hand, posed as a Japanese woman and eclipsed her older sibling in popularity by writing interracial romances set in Japan.
The significance of the Eaton sisters emerges from a distinct moment in American history. At the turn of the 20th century, the United States began asserting an imperial presence in Asia and the Caribbean, while waves of immigrants entered the nation as valued industrial labor. This dual movement of overseas expansion and incoming foreign populations gave rise to a sense of superiority and anxiety within the white American mainstream. Even as U.S. statesmen and missionaries sought to extend democracy, Christianity, and trade relations abroad, they also doubted that people who came to America could assimilate themselves according to the tenets of a liberal white Protestantism. This concern became evident with the passage of the Chinese Exclusion Act (1882) and the Gentlemen’s Agreement (1907), policies that thwarted Chinese and Japanese immigration. The lives and writings of the Eaton sisters intersected with these broader developments. As mixed-race authors, they catered to a growing U.S. consumer interest in things Asian, acting as cultural interpreters between East and West. In doing so, however, they complicated and challenged American beliefs and attitudes about race relations, gender roles, and empire building.
Shelley Sang-Hee Lee
Although the 1992 Los Angeles riots have been described as a “race riot” sparked by the acquittals of a group of mostly white police officers charged in the beating of black motorist Rodney King, the widespread targeting and destruction of Asian-owned (mainly Korean) property in and around South Central Los Angeles stands out as one of the most striking aspects of the uprising. For all the commentary generated about the state of black-white relations, African American youths, and the decline of America’s inner cities, the riots also gave many Americans their first awareness of the presence of a Korean immigrant population in Southern California, a large number of Korean shop owners, and the existence of what was commonly framed as the “black-Korean conflict.” For Korean Americans, and Asian Americans more generally, the Los Angeles riots represented a shattered “American dream” and brought focus to their tenuous hold on economic mobility and social inclusion in a society fraught with racial and ethnic tension. The riots furthermore marked a turning point that placed Asian immigrants and Asian Americans at the center of new conversations about social relations in a multiracial America, the place of new immigrants, and the responsibilities of relatively privileged minorities toward the less privileged.
Although Americans have adopted and continue to adopt children from all over the world, Asian minors have immigrated and joined American families in the greatest numbers and most shaped our collective understanding of the process and experiences of adoption. The movement and integration of infants and youths from Japan, the Philippines, India, Vietnam, Korea, and China (the most common sending nations in the region) since the 1940s have not only altered the composition and conception of the American family but also reflected and reinforced the complexities of U.S. relations with and actions in Asia. In tracing the history of Asian international adoption, we can uncover shifting ideas of race and national belonging. The subject enriches the fields of Asian American and immigration history.
Maxine Leeds Craig
Black beauty culture developed in the context of widespread disparagement of black men and women in images produced by whites, and black women’s exclusion from mainstream cultural institutions, such as beauty contests, which defined beauty standards on a national scale. Though mainstream media rarely represented black women as beautiful, black women’s beauty was valued within black communities. Moreover many black women used cosmetics, hair products and styling, and clothing to meet their communities’ standards for feminine appearance. At the beginning of the 20th century, the black press, which included newspapers, general magazines, and women’s magazines, showcased the beauty of black women. As early as the 1890s, black communities organized beauty contests that celebrated black women’s beauty and served as fora for debating definitions of black beauty. Still, generally, but not always, the black press and black women’s beauty pageants favored women with lighter skin tones, and many cosmetics firms that marketed to black women sold skin lighteners. The favoring of light skin was nonetheless debated and contested within black communities, especially during periods of heightened black political activism. In the 1910s and 1920s and later in the 1960s and 1970s, social movements fostered critiques of black aesthetics and beauty practices deemed Eurocentric. One focus of criticism was the widespread black practice of hair straightening—a critique that has produced an enduring association between hairstyles perceived as natural and racial pride. In the last decades of the 20th century and the beginning of the 21st, African migration and the transnational dissemination of information via the internet contributed to a creative proliferation of African American hairstyles. While such styles display hair textures associated with African American hair, and are celebrated as natural hairstyles, they generally require the use of hair products and may incorporate synthetic hair extensions.
Beauty culture provided an important vehicle for African American entrepreneurship at a time when racial discrimination barred black women from other opportunities and most national cosmetics companies ignored black women. Black women’s beauty-culture business activities included beauticians who provided hair care in home settings and the extremely successful nationwide and international brand of hair- and skin-care products developed in the first two decades of the 20th century by Madam C. J. Walker. Hair-care shops provided important places for sharing information and community organizing. By the end of the 20th century, a few black-owned hair-care and cosmetics companies achieved broad markets and substantial profitability, but most declined or disappeared as they faced increased competition from or were purchased by larger white-owned corporations.
Buddhist history in the United States traces to the mid-19th century, when early scholars and spiritual pioneers first introduced the subject to Americans, followed soon by the arrival of Chinese immigrants to the West Coast. Interest in Buddhism was significant during the late Victorian era, but practice was almost completely confined to Asian immigrants, who faced severe white prejudice and legal discrimination. The Japanese were the first to establish robust, long-lasting temple networks, though they, too, faced persecution, culminating in the 1942 incarceration of 120,000 Japanese Americans, a severe blow to American Buddhism. Outside the Japanese American community, Buddhism grew slowly in the earlier decades of the 20th century, but it began to take off in the 1960s, aided soon by the lifting of onerous immigration laws and the return of large-scale Asian immigration. By the end of the 20th century American Buddhism had become extremely diverse and complex, with clear evidence of permanence in Asian American and other communities.
The history of Calvinism in the United States is part of a much larger development, the globalization of western Christianity. American Calvinism owes its existence to the transplanting of European churches and religious institutions to North America, a process that began in the 16th century, first with Spanish and French Roman Catholics, and accelerated a century later when Dutch, English, Scottish, and German colonists and immigrants of diverse Protestant backgrounds settled in the New World. The initial variety of Calvinists in North America was the result of the different circumstances under which Protestantism emerged in Europe as a rival to the Roman Catholic Church, of the diverse civil governments that supported established Protestant churches, and of the various business sponsors that included the Christian ministry as part of imperial or colonial designs.
Once the British dominated the Eastern seaboard (roughly 1675), and after English colonists successfully fought for political independence (1783), Calvinism lost its variety. Beyond their separate denominations, English-speaking Protestants (whether English, Scottish, or Irish) created a plethora of interdenominational religious agencies for the purpose of establishing a Christian presence in an expanding American society. For these Calvinists, being Protestant went hand in hand with loyalty to the United States. Outside this pan-Protestant network of Anglo-American churches and religious institutions were ethnic-based Calvinist denominations caught between Old World ways of being Christian and American patterns of religious life. Over time, most Calvinist groups adapted to national norms, while some retained institutional autonomy for fear of compromising their faith.
Since 1970, when the United States entered an era sometimes called post-Protestant, Calvinist churches and institutions have either declined or become stagnant. But in certain academic, literary, and popular culture settings, Calvinism has for some Americans, whether connected or not to Calvinist churches, continued to be a source for sober reflection on human existence and earnest belief and religious practice.
The Catawba Indian Nation of the 1750s developed from the integration of diverse Piedmont Indian people who belonged to and lived in autonomous communities along the Catawba River of North and South Carolina. Catawban-speaking Piedmont Indians experienced many processes of coalescence, where thinly populated groups joined the militarily strong Iswą Indians (Catawba proper) for protection and survival. Over twenty-five groups of Indians merged with the Iswą, creating an alliance or confederation of tribal communities. They all worked together building a unified community through kinship, traditional customs, and a shared history to form a nation, despite the effects of colonialism, which included European settlement, Indian slavery, warfare, disease, land loss, and federal termination. American settler colonialism functioned to erase and exterminate Native societies through biological warfare (intentional or not), military might, seizure of Native land, and assimilation. In spite of these challenges, the Catawbas’ nation-building efforts were constant; nonetheless, in 1960 the federal government terminated its relationship with the Nation. In the 1970s, the Catawba Indian Nation filed suit to reclaim its land and its federal recognition status. Consequently, the Nation received federal recognition in 1993 and became the only federally recognized tribe in the state of South Carolina. The Nation has land seven miles east of the city of Rock Hill along the Catawba River. Tribal citizenship consists of 3,400 Catawbas, including 2,400 citizens of voting age. The tribe holds elections every four years to fill five executive positions—Chief, Assistant Chief, Secretary/Treasurer, and two at-large positions. Scholarship on Southeastern Indians focuses less on the history of the Catawba Indian Nation and more on the historical narratives of the Five Civilized Tribes, which obscures the role Catawbas filled in the history of the development of the South. Finally, a comprehensive Catawba Nation history explains how the people became Catawba and, through persistence, ensured the survival of the Nation and its people.
The Catholic Church has been a presence in the United States since the arrival of French and Spanish missionaries in the 16th and 17th centuries. The Spanish established a number of missions in what is now the western part of the United States; the most important French colony was New Orleans. Although they were a minority in the thirteen British colonies prior to the American Revolution, Catholics found ways to participate in communal forms of worship when no priest was available to celebrate Mass. John Carroll was appointed superior of the Mission of the United States of America in 1785. Four years later, Carroll was elected the first bishop in the United States; his diocese encompassed the entire country. The Catholic population of the United States began to grow during the first half of the 19th century primarily due to Irish and German immigration. Protestant America was often critical of the newcomers, believing one could not be a good Catholic and a good American at the same time. By 1850, Roman Catholicism was the largest denomination in the United States.
The number of Catholics arriving in the United States declined during the Civil War but began to increase after the cessation of hostilities. Catholic immigrants during the late 19th and early 20th centuries came primarily from Southern and Eastern Europe, and they were not always welcomed by a church dominated by Irish and Irish American leaders. At the same time that the church was expanding its network of parishes, schools, and hospitals to meet the physical and spiritual needs of the new immigrants, other Catholics were determining how their church could speak to issues of social and economic justice. Dorothy Day, Father Charles Coughlin, and Monsignor John A. Ryan are three examples of practicing Catholics who believed that the principles of Catholicism could help to solve problems related to international relations, poverty, nuclear weapons, and the struggle between labor and capital.
In addition to changes resulting from suburbanization, the Second Vatican Council transformed Catholicism in the United States. Catholics experienced further change as fewer men and women entered religious life, leaving fewer priests and sisters to staff parochial schools and parishes. In the early decades of the 21st century, the church in the United States was trying to recover from the sexual abuse crisis. Visiting America in 2015, Pope Francis reminded Catholics of the important teachings of the church regarding poverty, justice, and climate change. It remains to be seen what impact his papacy will have on the future of Catholicism in the United States.
John D. Fairfield
The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but it also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at Chicago’s World’s Columbian Exposition of 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvement (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession.
Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.