Adventism in America
The term “Adventist” embraces a cluster of fifteen or so Protestant religious communities in America whose central conviction is the imminence of the Parousia: the premillennial, personal, literal return of Christ to earth and the end of the evil world. The suffix -ist, compounded with Advent and first used in the late 1830s to describe the sharp sense of imminence that motivated the movement, soon became the common term for those who held the conviction; by 1844 they were called Adventists. Almost all American Adventist churches are native to the United States and arose from the Millerite movement of 1831–1844 that marked the end of the Second Great Awakening (1790–1850). The churches it birthed were shaped by the cultural and religious impulses of the period. Most of the Adventist churches remain small, with an aggregate membership within the United States approaching 3 million. Worldwide Adventist memberships, however, exceed 40 million and represent a significant export of American religious beliefs and values. The two largest churches in the group, the Jehovah’s Witnesses and the Seventh-day Adventist Church, have both established a substantial presence in America, each reporting domestic memberships of approximately 1.25 million in 2018, placing both among the twenty-five largest denominations in the nation. Both religions have also developed significant international branches, organically and hierarchically linked to their American headquarters: Jehovah’s Witnesses report almost 9 million international members, and Seventh-day Adventists 21 million. Conservative growth projections for Seventh-day Adventists envisage that the number of worldwide adherents may approach 100 million by the mid-to-late 21st century. Both churches have affected church–state relations in the United States through the legal system, seeking protection of civil rights and religious liberty. Seventh-day Adventism has also influenced American culture through its emphasis on health and education.
Paul D. Miller
Afghanistan has twice been thrust front and center of US national security concerns in the past half-century: first, during the Soviet-Afghan War, when Afghanistan served as a proxy for American efforts to combat Soviet influence; and second, as the frontline state and host for America’s global response to al-Qaida’s terrorist attacks of 2001. In both instances, American involvement swung from intensive investment and engagement to withdrawal and neglect. In both cases, American involvement reflected US concerns more than Afghan realities. And both episodes resulted in short-term successes for American security with long-term consequences for Afghanistan and its people. The signing of a strategic partnership agreement between the two countries in 2012 and a bilateral security agreement in 2013 created the possibility of a steadier and more forward-looking relationship—albeit one that the American and Afghan people may be less inclined to pursue as America’s longest war continues to grind on.
The African American Denominational Tradition: The Rise of Black Denominations
Dennis C. Dickerson
From the formation of the first independent African American Protestant denominations in the 1810s and 1820s to the opening decades of the 21st century, independent African American denominations have stood at the center of black religious life in the United States. Their longevity and influence have made them central to the preservation of black beliefs, practices, and rituals; have provided venues to promote movements for black freedom; and have incubated African American leadership in both the church and civic spheres. They have intertwined with every aspect of American and African American life, whether cultural, political, or economic, and they have engaged both the international involvements of American society and the diasporic interests of black people. Parallel assemblies composed of black ministers pastoring black congregations that remained within white denominations also emerged, including within the Episcopal, Presbyterian, and Congregational Protestant groups as well as the Catholic Church. Although they eschewed withdrawing from the white denominations, their extramural bodies functioned as a virtual black ecclesia, even though these institutional bodies remained smaller than the growing independent black denominations. Together, black preachers and parishioners in independent black denominations and inside traditional white denominations maintained churches characterized by proud histories and long records of frontline involvement in black freedom pursuits.
African American Radio
Since its debut in the 1920s, African American radio has remained a permanent fixture in American popular culture. In the early years of radio, networks began to broadcast limited programming dedicated to showcasing “black” characters. Although these broadcasts were partially geared toward the black community, almost all of the featured performers were white actors who caricatured black culture and African American speech. In response to the negative black imagery presented in early radio, African American broadcasters sought to counter this problematic representation with programming produced and performed by black entertainers, who evoked cultural pride for the black community. The black community’s commitment to positively transforming the African American presence in radio led to a continuous evolution of this important medium. That evolution included the presentation and celebration of black entertainment through music and talk radio; the rise of “black-appeal” radio stations, which supported causes related to African American civil rights and cultural pride; the exposure of African American music to interracial audiences; and the emergence of African American disc jockeys as cultural heroes and community leaders. Significantly, African American radio’s transformation produced an increase in black female broadcasters and African American radio station owners.
African Americans in the Great Depression and New Deal
Mary-Elizabeth B. Murphy
For African Americans, the Great Depression and the New Deal (1929–1940) marked a transformative era and laid the groundwork for the postwar black freedom struggle in the United States. The outbreak of the Great Depression in 1929 caused widespread suffering and despair in black communities across the country as women and men faced staggering rates of unemployment and poverty. Once Franklin Delano Roosevelt (FDR), a Democrat, was inaugurated as president in 1933, he launched a “New Deal” of ambitious government programs to lift the United States out of the economic crisis. Most African Americans were skeptical about benefiting from the New Deal, and racial discrimination remained rampant. However, a cohort of black advisors and activists critiqued these government programs for excluding African Americans and secured some reforms. At the grassroots level, black workers pressed for expanded employment opportunities and joined new labor unions to fight for economic rights. As the New Deal progressed, a sea change swept over black politics. Many black voters switched their allegiance from the Republican to the Democratic Party, waged more militant campaigns for racial justice, and joined interracial and leftist coalitions. African Americans also challenged entrenched cultural stereotypes through photography, theater, and oral histories that illuminated the realities of black life in the United States. By 1940, African Americans wielded an arsenal of protest tactics and were marching on a path toward full citizenship rights, a process that continues to evolve.
African American Soldiers in World War I
Amanda M. Nagel
In the midst of the long black freedom struggle, African American military participation in the First World War remains central to civil rights activism and challenges to systems of oppression in the United States. As part of a long and storied tradition of military service for a nation that marginalized and attempted to subjugate a significant portion of US citizens, African American soldiers faced challenges, racism, and segregation during the First World War, simultaneously on the home front and on the battlefields of France. The generations born since the end of the Civil War grew increasingly militant in resisting Jim Crow and insisting on full, not partial, citizenship in the United States, as evidenced by the events in Houston in 1917. Support of the war effort within black communities in the United States was not universal, however, and some opposed participation in a war effort to “make the world safe for democracy” when that same democracy was denied to people of color. Activism by organizations like the National Association for the Advancement of Colored People (NAACP) challenged the War Department’s official and unofficial policies, creating avenues for a larger number of black officers in the US Army through the officers’ training camp created in Des Moines, Iowa. For African American soldiers sent to France with the American Expeditionary Forces (AEF), combat experience brought both successes and failures: the 93rd Division’s successes became a source of race pride, while the 92nd Division’s record supplied the War Department with skewed evidence for rejecting any increase in the number of black officers and enlisted men. All-black Regular Army regiments, meanwhile, either remained in the United States or were sent to the Philippines rather than to the battlefields of Europe.
Soldiers’ return home was mixed, however: they were both celebrated and rejected for their service, reflected in welcome-home parades as well as in racial violence, including lynchings, between December 1918 and January 1920. As a result, the interwar years and the start of World War II roughly two decades later renewed the desire to use military service as a way to influence US legal, social, cultural, and economic structures that limited African American citizenship.
African American Women and Feminism in the 19th Century
What in contemporary parlance we would call African American feminisms has been a communal politics and activism, addressing the rights and material conditions of women, men, and children since the first Dutch slaver brought captive Africans to Jamestown, Virginia, in 1619. Although Black women would not have used the terms “feminist” or “feminism,” which did not enter into use until what is now recognized as the first wave of feminism, scholars have used those terms for the past two decades to refer to Black women’s activism in the United States stretching at least as far back as the 1830s, with the oratory and publications of Maria Stewart and the work of African American women in abolition and church reform. Alongside, and in many ways enabled by, crucial forms of resistance to slavery, Black women developed forms of feminist activism and a political culture that advanced claims for freedom and rights in a number of arenas. Yet our historical knowledge of 19th-century Black feminist activism has been limited by historiographical tendencies. Histories of American feminism have tended to marginalize Black feminisms by positioning these activists as contributors to a white-dominant narrative focused on woman’s rights and suffrage, while the literature on African American feminism has tended to hail the Black women’s club movement of the late 19th century as the emergence of that politics. Though many people may recognize only a handful of 19th-century African American feminists by name and reputation, early Black feminism was multiply located and extensive in its work.
African American women continued the voluntary work that benevolent and mutual aid societies had begun in the late 18th century and established literary societies during the early 19th century; they entered Black nationalist debates over emigration and advocated for the self-sufficiency and education of their communities, including women; and they fought to end slavery and the repressive racialized violence that accompanied it in free states and continued through the nadir. Throughout the century, African American feminists negotiated competing and often conflicting demands within interracial reform movements like abolition, woman’s rights, and temperance, and worked to open the pulpit, platform, press, and politics to Black women’s voices.
Agriculture and Food Aid in US Policymaking during the Cold War
Kristin L. Ahlberg
In the 20th century, US policymakers often attempted to solve domestic agricultural oversupply problems by extending food aid to foreign recipients. In some instances, the United States donated food in times of natural disaster. In other instances, it offered commodities to induce foreign governments to support US foreign policy aims or to spur agricultural modernization. These efforts coalesced during the 1950s with the enactment of Public Law 480, commonly known as the Food for Peace program, which provided a formal bureaucratic mechanism for the disbursement of commodities. Throughout the second half of the 20th century, successive presidential administrations continued to deploy commodities in pursuit of their often disparate foreign policy objectives.
Agriculture and Rural Life in the South, 1900–1945
William Thomas Okie
The period from 1900 to 1945 was characterized by both surprising continuity and dramatic change in southern agriculture. Unlike the rest of the nation, which urbanized and industrialized at a rapid pace in the late 19th century, the South remained overwhelmingly rural and poor from the 1880s through the 1930s. But by 1945, the region was beginning to urbanize and industrialize into a recognizably modern South, with a population concentrated in urban centers, industries taking hold, and agriculture following the larger-scale, mechanized trend common in other farming regions of the country. Three overlapping factors explain this long lag followed by rapid transformation. First, the cumulative effects of two centuries of land-extensive, staple crop agriculture and white supremacy had sapped the region of much of its fertility and limited its options for prosperity. Second, in response to this “problem South,” generations of reformers sought to modernize the South, along with other rural areas around the world. These piecemeal efforts became the foundation for the dramatic transformation of the South by the federal policies known as the New Deal. Third, poor rural southerners, both black and white, left the countryside in increasing numbers. Coupled with the labor demands created by two major military conflicts, World War I and World War II, this movement aided and abetted the mechanization of agriculture and the depopulation of the rural South.
Agriculture and the Environment
During the Holocene, the present geological epoch, an increasing portion of humans began to manipulate the reproduction of plants and animals in a series of environmental practices known as agriculture. No other ecological relationship sustains as many humans as farming; no other has transformed the landscape to the same extent. The domestication of plants by American Indians followed the end of the last glacial maximum (the Ice Age). About eight thousand years ago, the first domesticated maize and squash arrived from central Mexico, spreading to every region and as far north as the subarctic boreal forest. The incursion of Europeans into North America set off widespread deforestation, soil depletion, and the spread of settlement, followed by the introduction of industrial machines and chemicals. A series of institutions sponsored publicly funded research into fertilizers and insecticides. By the late 19th century, writers and activists criticized the technological transformation of farming as destructive to the environment and rural society. During the 20th century, wind erosion contributed to the depopulation of much of the Great Plains. Vast projects in environmental engineering transformed deserts into highly productive regions of intensive fruit and vegetable production. Throughout much of the 19th and 20th centuries, access to land remained limited to whites, with American Indians, African Americans, Latinas/os, Chinese, and peoples of other ethnicities attempting to gain farms or hold on to the land they owned. Two broad periods describe the history of agriculture and the environment in that portion of North America that became the United States. In the first, the environment dominated, forcing humans to adapt during the end of thousands of years of extreme climate variability. In the second, institutional and technological change became more significant, though the environment remained a constant factor against which American agriculture took shape.
A related historical pattern within this shift was the capitalist transformation of the United States. For thousands of years, households sustained themselves and exchanged some of what they produced for money. But during the 19th century, for a majority of American farmers, commodity production became the entire purpose of agriculture, transforming environments to reflect commercial opportunity.
Agriculture, Food, and the Environment
Kathleen A. Brosnan and Jacob Blackwell
Throughout history, food needs have bonded humans to nature. The transition to agriculture constituted a slow but revolutionary ecological transformation. After 1500 CE, agricultural goods, as well as the pests that undermined them, dominated the exchange of species between four continents. In the United States, increasingly commercial efforts simplified ecosystems. Improved technologies and market mechanisms facilitated surpluses in the 19th century that fueled industrialization and urbanization. In the 20th century, industrial agriculture deployed expensive machinery and chemical pesticides and fertilizers in pursuit of higher outputs and profits, while consumers’ relations with their food sources and nature became attenuated.
The American Antinuclear Movement
Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.
American Environmental Diplomacy
From its inception as a nation in 1789, the United States has engaged in an environmental diplomacy that has included attempts to gain control of resources, as well as formal diplomatic efforts to regulate the use of resources shared with other nations and peoples. American environmental diplomacy has sought to gain control of natural resources, to conserve those resources for the future, and to protect environmental amenities from destruction. As an acquirer of natural resources, the United States has focused on arable land as well as on ocean fisheries, although around 1900, the focus on ocean fisheries turned into a desire to conserve marine resources from unregulated harvesting. The main 20th-century U.S. goal was to extend beyond its borders its Progressive-era desire to utilize resources efficiently, meaning the greatest good for the greatest number for the longest time. For most of the 20th century, the United States was the leader in promoting global environmental protection through the best science, especially emphasizing wildlife. Near the end of the century, U.S. government science policy was increasingly out of step with global environmental thinking, and the United States often found itself on the outside. Most notably, the attempts to address climate change moved ahead with almost every country in the world except the United States. While a few monographs focus squarely on environmental diplomacy, it is safe to say that historians have not come close to tapping the potential of the intersection of the environmental and diplomatic history of the United States.
American Environmental Policy Since 1964
Richard N. L. Andrews
Between 1964 and 2017, the United States adopted the concept of environmental policy as a new focus for a broad range of previously disparate policy issues affecting human interactions with the natural environment. These policies ranged from environmental health, pollution, and toxic exposure to management of ecosystems, resources, and use of the public lands, environmental aspects of urbanization, agricultural practices, and energy use, and negotiation of international agreements to address global environmental problems. In doing so, it nationalized many responsibilities that had previously been considered primarily state or local matters. It changed the United States’ approach to federalism by authorizing new powers for the federal government to set national minimum environmental standards and regulatory frameworks with the states mandated to participate in their implementation and compliance. Finally, it explicitly formalized administrative procedures for federal environmental decision-making with stricter requirements for scientific and economic justification rather than merely administrative discretion. In addition, it greatly increased public access to information and opportunities for input, as well as for judicial review, thus allowing citizen advocates for environmental protection and appreciative uses equal legitimacy with commodity producers to voice their preferences for use of public environmental resources. These policies initially reflected widespread public demand and broad bipartisan support. Over several decades, however, they became flashpoints, first, between business interests and environmental advocacy groups and, subsequently, between increasingly ideological and partisan agendas concerning the role of the federal government. 
Beginning in the 1980s, the long-standing Progressive ideal of the “public interest” was increasingly supplanted by a narrative of “government overreach,” and the 1990s witnessed campaigns to delegitimize the underlying evidence justifying environmental policies by labeling it “junk science” or a “hoax.” From the 1980s forward, the stated priorities of environmental policy vacillated repeatedly between presidential administrations and Congresses supporting continuation and expansion of environmental protection and preservation policies versus those seeking to weaken or even reverse protections in favor of private-property rights and more damaging uses of resources. Yet despite these apparent shifts, the basic environmental laws and policies enacted during the 1970s remained largely in place: political gridlock, in effect, maintained the status quo, with the addition of a very few innovations such as “cap and trade” policies. One reason was that environmental policies retained considerable latent public support: in electoral campaigns, they were often overshadowed by economic and other issues, but they still aroused widespread support in their defense when threatened. Another reason was that decisions by the courts also continued to reaffirm many existing policies and to reject attempts to dismantle them. With the election of Donald Trump in 2016, along with conservative majorities in both houses of Congress, US environmental policy came under the most hostile and wide-ranging attack since its origins. More than almost any other issue, the incoming president targeted environmental policy for rhetorical attacks and budget cuts, and sought to eradicate the executive policies of his predecessor, weaken or rescind protective regulations, and undermine the regulatory and even the scientific capacity of the federal environmental agencies. 
In the early 21st century, it is as yet unclear how much of his agenda will actually be accomplished, or whether, as in past attempts, much of it will ultimately be blocked by Congress, the courts, public backlash, and business and state government interests seeking stable policy expectations rather than disruptive deregulation.
The American Experience during World War II
Michael C. C. Adams
On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. Unemployment was absorbed by the armed services and war work. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history. Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the work force, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.
American Film from the Silent Era to the “Talkies”
The first forty years of cinema in the United States, from the development and commercialization of modern motion picture technology in the mid-1890s to the full blossoming of sound-era Hollywood during the early 1930s, represents one of the most consequential periods in the history of the medium. It was a time of tremendous artistic and economic transformation, including but not limited to the storied transition from silent motion pictures to “the talkies” in the late 1920s. Though the nomenclature of the silent era implies a relatively unified period in film history, the years before the transition to sound saw a succession of important changes in film artistry and its means of production, and film historians generally regard the epoch as divided into at least three separate and largely distinct temporalities. During the period of early cinema, which lasted about a decade from the medium’s emergence in the mid-1890s through the middle years of the new century’s first decade, motion pictures existed primarily as a novelty amusement presented in vaudeville theatres and carnival fairgrounds. Film historians Tom Gunning and André Gaudreault have famously defined the aesthetic of this period as a “cinema of attractions,” in which the technology of recording and reproducing the world, along with the new ways in which it could frame, orient, and manipulate time and space, marked the primary concerns of the medium’s artists and spectators. A transitional period followed from around 1907 to the later 1910s when changes in the distribution model for motion pictures enabled the development of purpose-built exhibition halls and led to a marked increase in demand for the entertainment. On a formal and artistic level, the period saw a rise in the prominence of the story film and widespread experimentation with new techniques of cinematography and editing, many of which would become foundational to later cinematic style. 
The era also witnessed the introduction and growing prominence of feature-length filmmaking over narrative shorts. The production side was marked by intensifying competition between the original American motion picture studios based in and around New York City, several of which attempted to cement their influence by forming an oligopolistic trust, and a number of upstart “independent” West Coast studios located around Los Angeles. Both the artistic and production trends of the transitional period came to a head during the classical era that followed, when the visual experimentation of the previous years consolidated into the “classical style” favored by the major studios, and the competition between East Coast and West Coast studios resolved definitively in favor of the latter. This was the era of Hollywood’s ascendance over domestic filmmaking in the United States and its growing influence over worldwide film markets, due in part to the decimation of the European film industry during World War I. After nearly a decade of dominance, the Hollywood studio system was so refined that the advent of marketable synchronized sound technology around 1927 produced relatively few upheavals among the coterie of top studios. Rather, the American film industry managed to reorient itself around the production of talking motion pictures so swiftly that silent film production in the United States had effectively ceased at any appreciable scale by 1929. Artistically, the early years of “the talkies” proved challenging, as filmmakers struggled with the imperfections of early recording technology and the limitations they imposed on filmmaking practice. But filmgoing remained popular in the United States even during the depths of the Great Depression, and by the early 1930s a combination of improved technology and artistic adaptation led to such a marked increase in quality that many film historians regard the period as the beginning of Hollywood’s Golden Era.
With a new voluntary production code put in place to respond to criticism of immorality in Hollywood fare, the American film industry was poised by the early 1930s to solidify its prominent position in American cultural life.
American Film since 1945
Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have evolved from independent companies into divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films. Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media.
Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.
American Food, Cooking, and Nutrition, 1900–1945
Helen Zoe Veit
The first half of the 20th century saw extraordinary changes in the ways Americans produced, procured, cooked, and ate food. Exploding food production easily outstripped population growth in this era as intensive plant and animal breeding, the booming use of synthetic fertilizers and pesticides, and technological advances in farm equipment all resulted in dramatically greater yields on American farms. At the same time, a rapidly growing transportation network of refrigerated ships, railroads, and trucks hugely expanded the reach of different food crops and increased the variety of foods consumers across the country could buy, even as food imports from other countries soared. Meanwhile, new technologies, such as mechanical refrigeration, reliable industrial canning, and, by the end of the era, frozen foods, subtly encouraged Americans to eat less locally and seasonally than ever before. Yet as American food became more abundant and more affordable, diminishing want and suffering, it also contributed to new problems, especially rising body weights and mounting rates of cardiac disease. American taste preferences themselves changed throughout the era as more people came to expect stronger flavors, grew accustomed to the taste of industrially processed foods, and sampled so-called “foreign” foods, which played an enormous role in defining 20th-century American cuisine. Food marketing expanded dramatically, and food companies invested ever greater sums in print and radio advertising and eye-catching packaging. At home, a range of appliances made cooking easier, and modern grocery stores and increasing car ownership made it possible for Americans to shop for food less frequently. Home economics provided Americans, especially girls and women, with newly scientific and managerial approaches to cooking and home management, and Americans as a whole increasingly approached food through the lens of science. 
Virtually all areas related to food saw fundamental shifts in the first half of the 20th century, from agriculture to industrial processing, from nutrition science to weight-loss culture, from marketing to transportation, and from kitchen technology to cuisine. Not everything about food changed in this era, but the rapid pace of change likely magnified the sense of transformation for the many Americans who lived through it.
American Indian Activism after 1945
American Indian activism after 1945 was as much a part of the larger, global decolonization movement rooted in centuries of imperialism as it was a direct response to the ethos of civic nationalism and integration that had gained momentum in the United States following World War II. This ethos manifested itself in the disastrous federal policies of termination and relocation, which sought to end federal services to recognized Indian tribes and encourage Native people to leave reservations for cities. In response, tribal leaders from throughout Indian Country formed the National Congress of American Indians (NCAI) in 1944 to litigate and lobby for the collective well-being of Native peoples. The NCAI was the first intertribal organization to embrace the concepts of sovereignty, treaty rights, and cultural preservation—principles that continue to guide Native activists today. As American Indian activism grew increasingly militant in the late 1960s and 1970s, civil disobedience, demonstrations, and takeovers became the preferred tactics of “Red Power” organizations such as the National Indian Youth Council (NIYC), the Indians of All Tribes, and the American Indian Movement (AIM). At the same time, others established more focused efforts that employed less confrontational methods. For example, the Native American Rights Fund (NARF) served as a legal apparatus that represented Native nations, using the courts to protect treaty rights and expand sovereignty; the Council of Energy Resource Tribes (CERT) sought to secure greater returns on the mineral wealth found on tribal lands; and the American Indian Higher Education Consortium (AIHEC) brought Native educators together to work for greater self-determination and culturally rooted curricula in Indian schools. While the more militant of these organizations and efforts have withered, those that have exploited established channels have grown and flourished. 
Such efforts will no doubt continue for the foreseeable future so long as the state of Native nations remains uncertain.
American Labor and the Working Day
From the first local strikes in the late 18th century to the massive eight-hour movement that shook the country a century later, the length of the working day has been one of the most contentious issues in the history of American labor. Organized workers have fought for shorter hours for various reasons. If they were to be good citizens, workers needed time to follow the news and attend political rallies, to attend lectures and visit museums, and to perform civic duties. Shorter-hour activists also defended worktime reduction as a tool for moral betterment. Workers needed time to attend religious services and be involved in religious associations, to become better spouses and parents, and to refine their customs and manners through exposure to literature, music, and the arts. Trade unions also promoted shorter hours as sound economic policy. Especially when joblessness was rampant, unionists argued that shorter working days would help distribute available work more evenly among the workforce. During times of economic growth, they shifted the focus to productivity and consumption, arguing that well-rested workers not only performed better but also had the time to purchase and enjoy the products and services they helped create. As organized labor tended to give preference to full employment and consumption over further working time reductions in the aftermath of the New Deal, the hour issue took a back seat in the second half of the 20th century. It reentered the debate, however, in the late 2000s, when high-tech and knowledge industries started to experiment with compressed workweek models. Given the widespread experience of remote work and temporary working time reductions during the Covid-19 pandemic, the question of how much time Americans should, must, and want to spend at work is likely to remain a focus of public attention.