The story of mass culture from 1900 to 1945 is the story of its growth and increasing centrality to American life. Sparked by the development of new media such as radio, the phonograph, and cinema, which required less literacy and formal education, and by the commodification of leisure pursuits, mass culture extended its purview to nearly the entire nation by the end of the Second World War. In the process, it became one way in which immigrant and second-generation Americans could learn about the United States and stake a claim to participation in civic and social life. Mass culture characteristically consisted of artifacts that stressed pleasure, sensation, and glamor rather than, as had previously been the case, eternal and ethereal beauty, moral propriety, and personal transcendence. It had the power to determine acceptable values and beliefs and to define the qualities and characteristics of social groups. The constant and graphic stimulation that mass culture provided led many custodians of culture to worry about a breakdown in social morality that would surely follow. As a result, they formed regulatory agencies and watchdogs to monitor the mass culture available on the market. Other critics charged the regime of mass culture with inducing homogenization of belief and practice and contributing to passive acceptance of the status quo. The spread of mass culture did not terminate regional, class, or racial cultures; indeed, mass culture artifacts often borrowed from them. Nor did marginalized groups accept stereotypical portrayals; rather, they worked to expand the possibilities of prevailing ones and to provide alternatives.
Clay Silver Katsky
While presidents have historically been the driving force behind foreign policy decision-making, Congress has used its constitutional authority to influence the process. The nation’s founders designed a system of checks and balances aimed at establishing a degree of equilibrium in foreign affairs powers. Though the president is the commander-in-chief of the armed forces and the country’s chief diplomat, Congress holds responsibility for declaring war and can also exert influence over foreign relations through its powers over taxation and appropriation, while the Senate possesses authority to approve or reject international agreements. This separation of powers compels the executive branch to work with Congress to achieve foreign policy goals, but it also sets up conflict over what policies best serve national interests and the appropriate balance between executive and legislative authority. Since the founding of the Republic, presidential power over foreign relations has accreted in fits and starts at the legislature’s expense. When core American interests have come under threat, legislators have undermined or surrendered their power by accepting presidents’ claims that defense of national interests required strong executive action. This trend peaked during the Cold War, when invocations of national security enabled the executive to amass unprecedented control over America’s foreign affairs.
John Giggie and Emma Jackson Pepperman
Professional studies of lynching and its tragic history, especially its distinctly American character, depth, and dynamics, evolved in critically important ways from the pioneering scholarship of W. E. B. Du Bois and Ida B. Wells in the 1890s and 1900s, across the 20th century, and into the 21st, each stage introducing fresh categories of analysis amid moments of dramatic civil rights protest. The first stage was heralded by the research of African American intellectuals such as Du Bois and Wells and by growing black demands for an end to discrimination in the late 19th century. Joining them in the early 20th century was a small group of social scientists whose case studies of lynching illuminated race relations in local communities or, from a very different vantage, treated lynchings as symptoms of the violence so common in American society. The push to end racial and gender segregation and the passage of civil rights laws in the 1960s and 1970s encouraged historians to review lynchings from new perspectives, including gender, sexuality, religion, memory, and black community formation and resistance, stressing their centrality to modern southern history. The late 20th century saw a comparative turn. Historians evaluated lynching across America to identify common patterns of racial subjugation, but also to see how it was used to punish a wide range of Americans, including Asian Americans, Mexican Americans, and Native Americans. By 2000, the field shifted again, this time toward memorialization and community remembrance. Scholars and lawyers recalculated the total number of lynchings in America and found a large number of unrecorded killings, asked why so little was known about them, and created memorials to the victims. They demanded, too, that the causes and long-term consequences of the nation's history of racial violence be discussed openly and taught in public schools.
This effort is of particular resonance in 2020 as America confronts rising protests over a culture of mass incarceration and police brutality that disproportionately affects men and women of color. Indeed, the historical study of lynching has never been so vital as it is in the early 21st century.
Examining American history through the lens of black girlhood underscores just how thoroughly childhood everywhere is not "natural" but depends heavily on its social construction. Furthermore, ideas about childhood innocence are deeply racialized and gendered. At the end of Reconstruction, African Americans lost many of the social and political gains achieved after the Civil War. This signaled the emergence of Jim Crow, which placed many blacks in the same social, political, and economic position that they had occupied before freedom. Black girls who came of age in the 20th century lived through Jim Crow, the civil rights movement, Black Power, and the rise of the New Right. Moreover, black girls in the 20th century inherited many of the same burdens that their female ancestors carried—especially labor exploitation, criminalization, and racist notions of black sexuality—which left them vulnerable to physical, emotional, and sexual violence. In short, black girls were denied the childhood protections that their white counterparts possessed. If fights for cultural representation, economic justice, equal access to education, and a more just legal system are familiar sites of black struggle, then examining black girlhood reveals much about the black freedom movement. Activists, parents, and community advocates centered black girls' struggles within their activism. Black girls were also leaders in their own right, lending their voices, bodies, and intellect to the movement. Their self-advocacy illustrates their resistance to systemic oppression. However, the more obvious forms of resistance—letter writing, marching, and political organizing—are not the only spaces in which to locate black girls' resistance. In a nation that did not consider black children as children, their pursuit of joy and pleasure can also be read as a radical act. The history of 20th-century black girlhood is simultaneously a history of exclusion, trauma, resilience, and joy.
Malinda Maynor Lowery
The Lumbee tribe of North Carolina, with approximately 55,000 enrolled members, is the largest Indian community east of the Mississippi River. Lumbee history serves as a window into the roles that Native people have played in the struggle to implement the founding principles of the United States, not just as "the First Americans," but as members of their own nations, operating in their own communities' interests. When we see US history through the perspectives of Native nations, we see that the United States is not only on a quest to expand rights for individuals. Surviving Native nations like the Lumbees, who have their own unique claims on this land and its ruling government, are forcing Americans to confront the ways in which their stories, their defining moments, and their founding principles are flawed and inadequate. We know of the forced removals, the massacres, and the protests that Native people have lodged against injustice, yet such knowledge is not sufficient to understand American history. Lumbee history provides a way to honor, and complicate, American history by focusing not just on the dispossession and injustice visited upon Native peoples, but on how and why Native survival matters. Native nations are doing the same work as the American nation—reconstituting communities, thriving, and finding a shared identity with which to achieve justice and self-determination. Since the late 19th century, Lumbee Indians have used segregation, war, and civil rights to maintain a distinct identity in the biracial South. The Lumbees' survival as a people, a race, and a tribal nation shows that their struggle has revolved around autonomy, or the ability to govern their own affairs. They have sought local, state, and federal recognition to support that autonomy, but doing so has entangled the processes of survival with outsiders' ideas about what constitutes a legitimate Lumbee identity.
Lumbees continue to adapt to the constraints imposed on them by outsiders, strengthening their community ties through the process of adaptation itself. Lumbee people find their cohesion in the relentless fight for self-determination. Always, that struggle has mattered more than winning or losing a single battle.
The Equal Rights Amendment (ERA), designed to enshrine in the Constitution of the United States a guarantee of equal rights to women and men, has had a long and volatile history. When first introduced in Congress in 1923, three years after ratification of the woman suffrage amendment to the US Constitution, the ERA faced fierce opposition from the majority of former suffragists. These progressive women activists opposed the ERA because it threatened hard-won protective labor legislation for wage-earning women. A half century later, however, the amendment enjoyed such broad support that it was passed by the requisite two-thirds of Congress and, in 1972, sent to the states for ratification. Unexpectedly, virulent opposition emerged during the ratification process, not among progressive women this time but among conservatives, whose savvy organizing prevented ratification by the 1982 deadline. Many scholars contend that despite the failure of ratification, equal rights thinking had so triumphed in the courts and legislatures by the 1990s that a "de facto ERA" was in place. Some feminists, distrustful of reversible court decisions and repealable legislation, continued to agitate for the ERA; others voiced doubt that the ERA would achieve substantive equality for women. Because support for an ERA noticeably revived in the 2010s, this history remains very much in progress.